Generative Artificial Intelligence
Exploring the Power and Potential of Generative AI

Shivam R Solanki
Drupad K Khublani

Generative Artificial Intelligence: Exploring the Power and Potential of Generative AI
Shivam R Solanki, Dallas, TX, USA
Drupad K Khublani, Salt Lake City, UT, USA

ISBN-13 (pbk): 979-8-8688-0402-1 ISBN-13 (electronic): 979-8-8688-0403-8


https://doi.org/10.1007/979-8-8688-0403-8

Copyright © 2024 by Shivam R Solanki, Drupad K Khublani


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now
known or hereafter developed.
Trademarked names, logos, and images may appear in this book. Rather than use a trademark symbol with
every occurrence of a trademarked name, logo, or image we use the names, logos, and images only in an
editorial fashion and to the benefit of the trademark owner, with no intention of infringement of the
trademark.
The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not
identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to
proprietary rights.
While the advice and information in this book are believed to be true and accurate at the date of publication,
neither the authors nor the editors nor the publisher can accept any legal responsibility for any errors or
omissions that may be made. The publisher makes no warranty, express or implied, with respect to the
material contained herein.
Managing Director, Apress Media LLC: Welmoed Spahr
Acquisitions Editor: Celestin Suresh John
Development Editor: Laura Berendson
Coordinating Editor: Kripa Joseph
Copy Editor: Kim Wimpsett
Cover designed by eStudioCalamar
Cover image by kjpargeter on freepik (www.freepik.com)
Distributed to the book trade worldwide by Apress Media, LLC, 1 New York Plaza, New York, NY 10004,
U.S.A. Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail [email protected], or visit www.
springeronline.com. Apress Media, LLC is a California LLC and the sole member (owner) is Springer Science
+ Business Media Finance Inc (SSBM Finance Inc). SSBM Finance Inc is a Delaware corporation.
For information on translations, please e-mail [email protected]; for reprint,
paperback, or audio rights, please e-mail [email protected].
Apress titles may be purchased in bulk for academic, corporate, or promotional use. eBook versions and
licenses are also available for most titles. For more information, reference our Print and eBook Bulk Sales
web page at http://www.apress.com/bulk-sales.
Any source code or other supplementary material referenced by the author in this book is available to
readers on GitHub (https://github.com/Apress). For more detailed information, please visit
https://www.apress.com/gp/services/source-code.
If disposing of this product, please recycle the paper
To my mother, whose strength and love have guided me; to my wife, my
rock and inspiration; and to my family, who have always believed in
me. This book is a tribute to your unwavering support and belief in my
dreams, with all my love and gratitude.
Shivam R Solanki
To my beloved family—Mummy and Papa, whose unwavering faith
and love have been my guiding light; to Didi and Jiju, whose
encouragement never faltered; and to my wife, Suman, my inspiration
and support. This book stands as a testament to your belief and love,
dedicated with all my heart and gratitude.
Drupad K Khublani

Reality is unpredictable only as long as we see it without the lens of
statistics. Statistics’ potential to collapse reality to a handful of
possibilities is what drew us to this field. We want to pay our respects to
Alan Turing for initiating humanity’s endeavor toward training Turing
machines (what we call computers today), which paved the way for
artificial intelligence.
The authors
Table of Contents

About the Authors .... xi
About the Technical Reviewer .... xiii
Introduction .... xv

Chapter 1: Introduction to Generative AI .... 1
    Unveiling the Magic of Generative AI .... 1
    The Genesis of Generative AI .... 2
    Milestones Along the Way .... 4
    Fundamentals of Generative Models .... 5
    Neural Networks: The Backbone of Generative AI .... 6
    Understanding the Difference: Generative vs. Discriminative Models .... 8
    Understanding the Core: Types and Techniques .... 9
    Diffusion Models .... 10
    Generative Adversarial Networks .... 10
    Variational Autoencoders .... 11
    Restricted Boltzmann Machines .... 11
    Pixel Recurrent Neural Networks .... 12
    Generative Models in Society and Technology .... 13
    Real-World Applications and Advantages of Generative AI .... 13
    Ethical and Technical Challenges of Generative AI .... 15
    Impact of Generative Models in Data Science .... 18
    The Diverse Domains of Generative AI .... 20
    Visuals: From Pixel to Palette .... 20
    Audio: Symphonies of AI .... 21
    Text: Weaving Words into Worlds .... 22
    The Future of Generative AI: A Symphony of Possibilities .... 22
    Setting Up the Development Environment .... 23
    Setting Up a Google Colab Environment .... 23
    Hugging Face Access and Token Key Generation .... 30
    OpenAI Access Account and Token Key Generation .... 32
    Troubleshooting Common Issues .... 33
    Summary .... 35

Chapter 2: Text-to-Image Generation .... 37
    Introduction .... 37
    Bridging the Gap Between Text and Image Data .... 39
    Understanding the Fundamentals of Image Data .... 40
    Correlation Between Image and Text Data Using CLIP Model .... 43
    Diffusion Model .... 49
    Text-to-Image Generation .... 67
    Using a Pre-trained Model .... 68
    Fine-Tuning Text-to-Image Models .... 71
    Conclusion .... 79

Chapter 3: From Script to Screen: Unveiling Text-to-Video Generation .... 81
    Introduction .... 81
    Understanding Video Data .... 84
    Challenges in Working with Video Data .... 87
    The Synergy of Video and Textual Data .... 91
    Hands-On: Demonstrating a Pre-Trained Model .... 93
    Step 1: Installing Libraries .... 94
    Step 2: Model Inference .... 95
    Fine-Tuning for Custom Applications .... 96
    Step 1: Installing Libraries .... 99
    Step 2: Data Loading and Preprocessing .... 100
    Step 3: Model Training (Fine-Tuning) .... 103
    Step 4: Model Inference .... 107
    Conclusion .... 111

Chapter 4: Bridging Text and Audio in Generative AI .... 113
    Brief History .... 113
    Fundamentals and Challenges .... 115
    Understanding Audio Data .... 115
    Challenges in Working with Audio Data .... 118
    Mitigating Challenges in Audio Data Processing .... 119
    Bridging Text and Audio: The CLAP Model Implementation .... 120
    Step 1: Installing Libraries and Data Loading .... 122
    Step 2: Model Inference .... 123
    Understanding AI-Driven Text and Audio Conversion Models .... 125
    Understanding CTC Architectures .... 125
    Understanding Seq2Seq Architectures .... 128
    Implementation AI-Driven Text and Audio Conversion Modes .... 130
    Speech to Text .... 130
    Text to Speech .... 149
    Conclusion .... 170

Chapter 5: Large Language Models .... 173
    Introduction .... 173
    Phases of Training and Adoption of Large Language Models .... 175
    Types of Language Transformers Models .... 179
    Encoder Models .... 183
    Fine-Tuning BERT .... 188
    Decoder-Only Models (Generative Pre-trained Transformer) .... 219
    Encoder-Decoder Models .... 222
    A Glimpse into the LLM Horizon: Where Do We Go from Here? .... 226
    Summary .... 228

Chapter 6: Generative Large Language Models .... 229
    Introduction .... 229
    NLP Tasks Using LLMs .... 230
    Sentiment Analysis .... 231
    Entity Extraction .... 236
    Topic Modeling .... 239
    Natural Language Generation Tasks Using LLMs .... 241
    Creative Writing .... 241
    Text Summarization .... 244
    Dialogue Generation .... 247
    Advanced Prompting Techniques .... 250
    Few-Shot Prompting .... 251
    Chain-of-Thought .... 253
    Prompting vs. Fine-Tuning .... 255
    Fine-Tuning LLMs .... 258
    Case Study: Fine-Tuning an LLM for Sentiment Analysis .... 260
    Parameter Efficient Fine-Tuning .... 261
    Fine-Tuning LLM for Question Answering .... 263
    Summary .... 295

Chapter 7: Advanced Techniques for Large Language Models .... 297
    Introduction .... 297
    Fine-Tuning LLMs for Abstractive Summarization .... 298
    Fine-Tuning an Encoder-Decoder Model .... 299
    Abstractive Summarization Using a Decoder-Only Model .... 311
    Guidelines on Fine-Tuning a Large Language Model .... 322
    Types of SFT (Supervised Fine-Tuning) .... 323
    Memory Consumption During SFT .... 324
    Reinforcement Learning from Human Feedback .... 324
    What Is RLHF? .... 325
    How Does RLHF Work? .... 325
    Reward Model Implementation .... 328
    Controlled Review Generation .... 330
    RLHF Summary .... 347
    Summary .... 348

Chapter 8: Building Demo Applications Using LLMs .... 349
    Making Sense of Website Content .... 349
    Data Scraping .... 351
    Question-answering .... 353
    Summarization .... 357
    User Interface/Application .... 360
    Uncovering Insights and Gaining a Quick Understanding of PDF Documents .... 368
    Question-Answering for PDF .... 369
    PDF Summarization .... 375
    Extracting Insights from Video Transcripts .... 383
    Video Caption Summarization and Q&A .... 384
    Video Transcript Analysis Using Langchain and OpenAPI .... 394
    Summary .... 398

Chapter 9: Building Enterprise-Grade Applications Using LLMs .... 401
    Retrieval-Augmented Question-Answering Chatbot .... 402
    Real-World Use Cases of Retrieval Augmentation Generation .... 405
    RAG Architecture .... 406
    Creating a Knowledge Base .... 408
    Setting Up a Retrieval System .... 412
    Neural Reranker .... 418
    Generative LLM .... 422
    User Interface .... 426
    Suggested Improvements in the RAG Pipeline for Generative Q&A .... 436
    Summary .... 438
    Conclusion: Generative AI Journey .... 440

References .... 443

Index .... 449

About the Authors
Shivam R Solanki is an accomplished senior advisory data
scientist leading an AI team in solving challenging problems
using artificial intelligence (AI) in a worldwide partner
ecosystem. Shivam holds a master’s degree from Texas
A&M University with major coursework in applied statistics.
Throughout his career, he has delved into various AI fields,
including machine learning (ML), deep learning (DL), and
natural language processing (NLP). His expertise extends to
Generative AI, where his practical experience and in-depth
knowledge empower him to navigate its intricacies. As a
researcher in AI, Shivam has filed two patents for ML and NLP, co-authored a book on
DL, and published a paper on Generative AI.

Drupad K Khublani is a skilled senior data scientist and part of the revenue management team in a real estate company.
His leadership in partnering with teams across marketing,
call center operations, product management, customer
experience, and operations has cultivated a wealth of
experience, empowering him to extract actionable insights
and co-create innovative solutions. Drupad completed
graduate and postgraduate programs at the Indian Institute
of Technology (Indian School of Mines) and Texas A&M
University. Collaborating with Dr. Jean-Francois Chamberland on the development of
technology to identify obstacles and gauge distances using only a monocular camera
highlights Drupad’s inventive approach and dedication to real-world applications,
alongside his accomplishments in both the commercial and academic arenas.

About the Technical Reviewer
Durgesh Gurnani is a key influencer in Generative AI,
earning a master’s degree in the United States and currently
residing in Delhi, India. He’s shared his deep knowledge on
TV and at international events. Universities around the world
invite him for special lectures and AI bootcamps. In addition
to his collaborations with multinational companies, Durgesh
conducts online classes every Sunday. Discover his insights
at https://gurnaninotes.com. Join the community and
explore the world of Generative AI with Durgesh.

Introduction
This book explains the field of generative artificial intelligence (Generative AI), focusing
on its potential and applications, and aims to provide you with an understanding of the
underlying principles, techniques, and practical use cases of Generative AI models.
The book begins with an introduction to the foundations of Generative AI, including
an overview of the field, its evolution, and its significance in today’s AI landscape. Next
it focuses on generative visual models, exploring the exciting field of transforming text
into images and videos. Then it covers text-to-video generation and provides insights
into synthesizing videos from textual descriptions, opening new possibilities for creative
content generation. The next chapter covers generative audio models and prompt-to-
audio synthesis using text-to-speech (TTS) techniques. Then it switches gears, diving
into the realm of generative text models and exploring the concepts of large language
models (LLMs), natural language generation (NLG), fine-tuning, prompt tuning, and
reinforcement learning. The chapters explore techniques for fixing LLMs and making
them grounded and instructible, along with practical applications in enterprise-
grade applications such as question answering, summarization, and knowledge base
generation.
After reading this book, you will understand generative text, audio, and visual
models and have the knowledge and tools necessary to harness the creative and
transformative capabilities of Generative AI.

CHAPTER 1

Introduction to Generative AI
Unveiling the Magic of Generative AI
Imagine a world where the lines between imagination and reality blur. Generative AI
refers to the subset of artificial intelligence focused on creating new content—from text
to images, music, and beyond—based on learning from vast amounts of data. A few
words whispered into a machine can blossom into a breathtaking landscape painting,
and a simple melody hummed can transform into a hauntingly beautiful symphony.
This isn’t the stuff of science fiction but the exciting reality of Generative AI. You’ve likely
encountered its early forms in autocomplete features in email or text editors, where it
predicts the end of your sentences in surprisingly accurate ways. This transformative
technology isn’t just about analyzing data; it’s about breathing life into entirely new
creations, pushing the boundaries of what we thought machines could achieve.
Gone are the days of static, preprogrammed responses. Generative AI models learn
and adapt, mimicking humans’ ability to observe, understand, and create. These models
decipher the underlying patterns and relationships defining each domain by analyzing
massive datasets of images, text, audio, and more. Armed with this knowledge, they can
then transcend mere imitation, generating entirely new content that feels fresh, original,
and often eerily similar to its real-world counterparts.
This isn’t just about novelty, however. Generative AI holds immense potential to
revolutionize various industries and reshape our daily lives. Imagine the following:

Designers: Creating unique and personalized product concepts based on user preferences.

Musicians: Composing original soundtracks tailored to specific emotions or moods.


Writers: Generating creative content formats such as poems, scripts, or entire novels.

Educators: Personalizing learning experiences with AI-generated practice problems and interactive narratives.

Scientists: Accelerating drug discovery by simulating complex molecules and predicting their properties.

From smart assistants crafting detailed travel itineraries to sophisticated photo editing tools that can alter the time of day in a photograph, Generative AI is weaving its
magic into the fabric of our everyday experiences.
The possibilities are endless, and Generative AI’s magic lies in its versatility. It can be
used for artistic expression, entertainment, education, scientific discovery, and countless
other applications. But what makes this technology truly remarkable is its ability to
collaborate with humans, pushing the boundaries of creativity and innovation in ways
we never thought possible.
So, as you begin your journey into the world of Generative AI, remember this: it’s not
just about the technology itself but about the potential it holds to unlock our creativity
and imagination. With each new model developed and each new application explored,
we inch closer to a future where the line between human and machine-generated
creation becomes increasingly blurred, and the possibilities for what we can achieve
together become genuinely limitless.

The Genesis of Generative AI


The saga of Generative AI unfolds like a tapestry woven from the early threads
of artificial intelligence, evolving through decades of innovation to become the
powerhouse of creativity and problem-solving we see today. From its inception in the
1960s to the flourishing ecosystem of today’s technology, Generative AI has traced a path
of remarkable growth and transformation.

The Initial Spark (1960s): The odyssey commenced with the development of ELIZA, a simple chatbot devised to simulate human
conversation. Despite its rudimentary capabilities, ELIZA ignited the
imaginations of many, sowing the seeds for future advancements
in natural language processing (NLP) and beyond, laying a
foundational stone for the intricate developments that would follow.

The Era of Deep Learning Emergence (1980s–2000s): The concept of neural networks and deep learning was not new, but it
lay dormant, constrained by the era’s computational limitations.
It wasn’t until the turn of the millennium that a confluence of
enhanced computational power and burgeoning data availability
set the stage for significant breakthroughs, signaling a renaissance
in AI research and development.

Breakthrough with Generative Adversarial Networks (2014): The introduction of generative adversarial networks (GANs) by
Ian Goodfellow marked a watershed moment for Generative
AI. This innovative framework, consisting of dueling networks—
one generating content and the other evaluating it—ushered in
a new era of image generation, propelling the field toward the
creation of ever more lifelike and complex outputs.

A Period of Rapid Expansion (2010s–present): The landscape of Generative AI blossomed post-2010, driven by GANs and
advancements in deep learning technologies. This period saw
the diversification of generative models, including convolutional
neural networks (CNNs) and recurrent neural networks (RNNs)
for text and video generation, alongside the emergence of
variational autoencoders and diffusion models for image
synthesis. The development of large language models (LLMs),
starting with GPT-1, demonstrated unprecedented text generation
capabilities, marking a significant leap in the field.

Mainstream Adoption and Ethical Debates (2022): The advent of user-friendly text-to-image models like Midjourney and
DALL-E 2, coupled with the popularity of OpenAI’s ChatGPT,
catapulted Generative AI into the limelight, making it a household
name. However, this surge in accessibility and utility also brought
to the forefront critical discussions on copyright issues, the
potential displacement of creative professions, and the ethical
use of AI technology, emphasizing the importance of mindful
development and application.


Milestones Along the Way


The evolution of Generative AI (see Figure 1-1) has been punctuated by several key
milestones that have significantly shaped its trajectory, pushing the boundaries of what’s
possible and setting new standards for innovation in the field.

Figure 1-1. Generative AI evolution timeline

Reviving Deep Learning (2006): A pivotal moment in the resurgence of neural networks came with Geoffrey Hinton’s
groundbreaking paper, “A Fast Learning Algorithm for Deep Belief
Nets.” This work reinvigorated interest in restricted Boltzmann
machines (RBMs) and deep learning, laying the groundwork for
future advancements in Generative AI.

The Advent of GANs (2014): Ian Goodfellow and his colleagues introduced GANs, a novel concept that employs two neural
networks in a form of competitive training. This innovation not
only revolutionized the generation of realistic images but also
opened new avenues for research in unsupervised learning.
Transformer Architecture (2017): The “Attention Is All You
Need” paper by Vaswani et al. introduced the transformer
architecture, fundamentally changing the landscape of NLP. This
architecture, which relies on self-attention mechanisms, has
since become the backbone of LLMs, enabling more efficient and
coherent text generation.


Large Language Models Emerge (2018–Present): The introduction of GPT by OpenAI marked the beginning of the era of
large language models. These models, with their vast capacity for
understanding and generating human-like text, have drastically
expanded the applications of Generative AI, from writing
assistance to conversational AI.

Mainstream Breakthroughs (2022): The release of models like DALL-E 2 for text-to-image generation and ChatGPT for
conversational AI brought Generative AI into mainstream
awareness. These tools demonstrated the technology’s potential
to the public, showcasing its ability to generate creative, engaging,
and sometimes startlingly lifelike content.

Ethical and Societal Reflections (2022–Present): With greater visibility came increased scrutiny. The widespread adoption of
Generative AI technologies sparked important conversations
around copyright, ethics, and the impact on creative professions.
This period has highlighted the need for thoughtful consideration
of how these powerful tools are developed and used.

These milestones underscore the rapid pace of advancement in Generative AI, illustrating a journey of innovation that has transformed the landscape of artificial
intelligence. Each landmark not only represents a leap forward in capabilities but also
sets the stage for the next wave of discoveries, challenging us to envision a future where
AI’s creative potential is harnessed for the greater good while navigating the ethical
complexities it brings.

Fundamentals of Generative Models


With their ability to “dream up” new data, generative models have become a cornerstone
of AI, reshaping how we interact with technology, create content, and solve problems.
This section delves deeper into their inner workings, applications, and limitations,
equipping you to harness their power responsibly.


Neural Networks: The Backbone of Generative AI


Neural networks form the foundation of Generative AI, enabling machines to
generate new data instances that mimic the distribution of real data. At their core,
neural networks learn from vast amounts of data, identifying patterns, structures, and
correlations that are not immediately apparent. This learning capability allows them
to produce novel content, from realistic images and music to sophisticated text and
beyond. The versatility and power of neural networks in Generative AI have opened new
frontiers in creativity, automation, and problem-solving, fundamentally changing our
approach to content creation and data analysis.

Key Neural Network Architectures Relevant to Generative AI


Generative AI has been propelled forward by several key neural network architectures,
each bringing unique strengths to the table in terms of learning patterns, processing
sequences, and generating content.

Convolutional Neural Networks


Convolutional neural networks are specialized in processing structured grid data
such as images, making them a cornerstone in visual data analysis and generation. By
automatically and adaptively learning spatial hierarchies of features, CNNs can generate
new images or modify existing ones with remarkable detail and realism. This capability
has been pivotal in advancing fields such as computer vision, where CNNs are used to
create realistic artworks, enhance photos, and even generate entirely new visual content
that is indistinguishable from real-world images. DeepDream, developed by Google, is
an iconic example of CNNs in action. It enhances and modifies images in surreal, dream-
like ways, showcasing CNNs’ ability to interpret and transform visual data creatively.
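To make the idea concrete, here is a tiny illustration (not from the book; the layer sizes and dummy image are arbitrary choices) of how a single PyTorch convolutional layer turns raw pixels into a stack of learned feature maps:

import torch
from torch import nn

conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)
image = torch.randn(1, 3, 64, 64)   # a dummy 64x64 RGB image (batch of one)
features = conv(image)              # 16 feature maps capturing local patterns such as edges
print(features.shape)               # torch.Size([1, 16, 64, 64])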

Recurrent Neural Networks


Recurrent neural networks excel in handling sequential data, making them ideal for
tasks that involve time series, speech, or text. RNNs can remember information for
long durations, and their ability to process sequences of inputs makes them perfect
for generating coherent and contextually relevant text or music. This architecture
has revolutionized natural language processing and generation, enabling the
creation of sophisticated AI chatbots, automated writing assistants, and dynamic
music composition software. Google’s Magenta project utilizes RNNs to create new
pieces of music, demonstrating RNNs’ prowess in understanding and generating
complex sequences, such as musical compositions, by learning from vast datasets of
existing music.
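To see the sequential idea in code, consider this minimal, hypothetical next-token generator (the class name, vocabulary size, and hidden size are illustrative assumptions, not the book's code). An LSTM reads the tokens produced so far and outputs scores for what should come next, which is how RNN-based text and music generators build sequences one step at a time.

import torch
from torch import nn

class TinyTextGenerator(nn.Module):
    def __init__(self, vocab_size=100, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer IDs of the sequence generated so far
        hidden_states, _ = self.lstm(self.embed(tokens))
        return self.head(hidden_states)   # next-token logits at every position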

Generative Adversarial Networks


Generative adversarial networks consist of two neural networks—the generator and the
discriminator—competing in a zero-sum game framework. This innovative structure
allows GANs to generate highly realistic and detailed images, videos, and even sound.
The competitive nature of GANs pushes them to continually improve, leading to the
generation of content that can often be indistinguishable from real-world data. Their
application ranges from creating photorealistic images and deepfakes to advancing drug
discovery and material design. StyleGAN, developed by NVIDIA, exemplifies GANs’
capabilities by generating highly realistic human faces and objects. This technology has
been used in fashion and design to visualize new products and styles in stunning detail.

Transformers
Transformers have revolutionized the way machines understand and generate human
language, thanks to their ability to process words in relation to all other words in a
sentence, simultaneously. This architecture underpins some of the most advanced
language models like Generative Pre-trained Transformer (GPT), enabling a wide
range of applications from generating coherent and contextually relevant text to
translating languages and summarizing documents. Their unparalleled efficiency in
handling sequential data has made them the model of choice for tasks requiring a
deep understanding of language and context. OpenAI’s GPT-3 showcases the power of
transformer architectures through its ability to generate human-like text across a variety
of applications, from writing articles and poems to coding assistance, illustrating the
model’s deep understanding of language and context.
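The heart of that architecture can be sketched in a few lines. The function below is an illustration rather than the book's code: it computes scaled dot-product self-attention, in which every token compares its query against every other token's key and then takes a similarity-weighted mix of their values. This is what lets a transformer relate all the words in a sentence at once.

import math
import torch

def self_attention(x, w_q, w_k, w_v):
    # x: (seq_len, d_model); w_q, w_k, w_v: learned projection matrices
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / math.sqrt(k.shape[-1])   # pairwise token-to-token similarity
    weights = torch.softmax(scores, dim=-1)     # each row is an attention distribution
    return weights @ v                          # every token becomes a weighted mix of values

x = torch.randn(5, 32)                          # 5 tokens with 32-dimensional embeddings
w_q, w_k, w_v = (torch.randn(32, 32) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)   # torch.Size([5, 32])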
Transitioning from these architectures, it’s essential to appreciate the distinction
between generative and discriminative models in AI. While the former focuses on
generating new data instances, the latter is concerned with categorizing or predicting
outcomes based on input data. Understanding this difference is crucial for leveraging
the right model for the task at hand, ensuring the effective and responsible use of AI
technologies.


Understanding the Difference: Generative vs. Discriminative Models
The world of AI models can be vast and complex, but two key approaches stand out:
generative and discriminative models. Though they deal with data and learning, their
goals and functionalities differ significantly.
Generative models, the creative minds of AI, focus on understanding the underlying
patterns and distributions within data. Imagine them as artists studying various styles
and techniques. They analyze the data, learn the “rules” of its creation, and then use
that knowledge to generate entirely new content. This could be anything from realistic
portraits to captivating melodies to even novel text formats.
Discriminative models, on the other hand, function more like meticulous detectives.
Their focus lies on identifying and classifying different types of data. They draw clear
boundaries between categories, enabling them to excel at tasks like image recognition or
spam filtering. While they can recognize a cat from a dog, they can’t create a new image
of either animal on their own.
Here’s an analogy to further illustrate the distinction:

• Imagine you’re learning a new language. A generative model would immerse itself in the language, analyzing grammar, vocabulary, and sentence structures. It would then use this knowledge to write original stories or poems.

• A discriminative model would instead focus on understanding the differences between different languages. It could then identify which language a text belongs to but couldn’t compose its own creative text in that language.

Table 1-1 summarizes the differences.


Table 1-1. Generative and Discriminative Comparison

Primary focus
    Generative: Understanding and learning the distribution of data to generate new instances
    Discriminative: Identifying and classifying data into categories

Functionality
    Generative: Generates new data samples similar to the input data
    Discriminative: Classifies input data into predefined categories

Learning approach
    Generative: Analyzes and learns the "rules" or patterns of data creation
    Discriminative: Learns the decision boundary between different classes or categories of data

Key characteristics
    Generative: Creative and productive; can create something new based on learned patterns
    Discriminative: Analytical and selective; focuses on distinguishing between existing categories

Applications
    Generative: Image and text generation (e.g., DALL-E, GPT-3); music composition (e.g., Google's Magenta); drug discovery and design
    Discriminative: Spam email filtering; image recognition (e.g., identifying objects in photos); fraud detection

Examples
    Generative: Creating realistic images from textual descriptions; composing original music; writing poems or stories
    Discriminative: Categorizing emails as spam or not spam; recognizing faces in images; predicting customer churn

Real-world example
    Generative: GPT-3 by OpenAI uses generative modeling to produce human-like text
    Discriminative: Google Photos uses discriminative algorithms to categorize and label photos by faces, places, or things

In essence, generative models are the dreamers, conjuring up new possibilities, while discriminative models are the analysts, expertly classifying and categorizing existing data. Both play crucial roles in various fields, and understanding their differences is essential for choosing the right tool for the right job.
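The contrast also shows up directly in code. Using the Hugging Face transformers library (whose setup is covered later in this chapter), a generative pipeline produces new text, while a discriminative pipeline assigns a label to existing text. The models named here are common defaults used purely for illustration.

from transformers import pipeline

# Generative: continue a prompt with newly created text
generator = pipeline("text-generation", model="gpt2")
print(generator("Generative AI can", max_new_tokens=20)[0]["generated_text"])

# Discriminative: classify an existing input into a category
classifier = pipeline("sentiment-analysis")
print(classifier("I loved this book!"))   # e.g., [{'label': 'POSITIVE', 'score': 0.99}]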

Understanding the Core: Types and Techniques


Generative models are a fascinating and versatile group of algorithms used across a wide
range of applications in artificial intelligence and machine learning. Each model has its
own strengths and is suited to particular types of tasks. Here’s an expanded view of each
generative model mentioned, along with examples of their real-life use cases:


Diffusion Models
Diffusion models gradually transform data from a simple distribution into a complex
one and have revolutionized digital art and content creation. They generate realistic
images and animations from textual descriptions and are also applied in enhancing
image resolution, including medical imaging, where they can generate detailed images
for research and training purposes. While Chapter 2 will delve into diffusion models, let's
build a foundational understanding with a minimal code sketch first.

import torch
from torch import nn

class DiffusionModel(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # Minimal illustrative denoiser: a tiny convolutional network that predicts
        # the noise contained in x at a given step t (real diffusion models are far deeper)
        self.net = nn.Sequential(
            nn.Conv2d(channels + 1, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(64, channels, kernel_size=3, padding=1),
        )

    def forward(self, x, t):
        # Encode the time step as an extra constant channel so the network knows
        # how far along the noising process x is, then predict the noise to remove
        t_channel = torch.full_like(x[:, :1], float(t))
        return self.net(torch.cat([x, t_channel], dim=1))
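As a quick usage sketch of the class above (the shapes and step index are arbitrary), a single forward call predicts the noise to remove at one step; a full diffusion sampler would repeat this over many steps, gradually turning noise into an image.

model = DiffusionModel(channels=3)
noisy = torch.randn(4, 3, 32, 32)        # a batch of pure-noise "images"
predicted_noise = model(noisy, t=10)     # one denoising prediction, shape (4, 3, 32, 32)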

Generative Adversarial Networks


GANs consist of two neural networks—the generator and the discriminator—engaged
in a competitive training process. This innovative approach has found widespread
application in creating photorealistic images, deepfake videos, and virtual environments
for video games, as well as in fashion, where designers visualize new clothing on virtual
models before production. To gain a clearer picture of the model's implementation, let's
examine a simplified code sketch.

import torch
from torch import nn

class Generator(nn.Module):
    def __init__(self, latent_dim=64, data_dim=784):
        super().__init__()
        # Maps a random noise vector to a synthetic data sample (sizes are illustrative)
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, data_dim), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    def __init__(self, data_dim=784):
        super().__init__()
        # Scores a sample: close to 1 for real data, close to 0 for generated data
        self.net = nn.Sequential(
            nn.Linear(data_dim, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

# Train the GAN
# ... (alternating updates: the discriminator learns to separate real from fake,
#      while the generator learns to fool it; a one-step sketch follows below)
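To make the competitive training concrete, here is a minimal, hypothetical single training step for the sketch above (the batch size, learning rate, and placeholder real batch are assumptions, not code from the book):

g, d = Generator(), Discriminator()
opt_g = torch.optim.Adam(g.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(d.parameters(), lr=2e-4)
bce = nn.BCELoss()

real = torch.rand(32, 784)       # placeholder batch standing in for real data
fake = g(torch.randn(32, 64))    # generator output from random noise

# Discriminator step: push scores for real samples toward 1 and for fakes toward 0
d_loss = bce(d(real), torch.ones(32, 1)) + bce(d(fake.detach()), torch.zeros(32, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: adjust the generator so the discriminator scores its fakes as real
g_loss = bce(d(fake), torch.ones(32, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()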

Variational Autoencoders
Variational autoencoders (VAEs) are renowned for their ability to compress and reconstruct
data, making them ideal for image denoising tasks where they clean up noisy images.
Furthermore, in the pharmaceutical industry, VAEs are utilized to generate new molecular
structures for drug discovery, demonstrating their capacity for innovation in both digital and
physical realms. Let's delve into a simplified code sketch to see the implementation specifics.

import torch
from torch import nn

class VAE(nn.Module):
    def __init__(self, input_dim, latent_dim):
        super().__init__()
        # Encoder compresses the input into a small latent code
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256),
            nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        # Decoder reconstructs the input from that latent code
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256),
            nn.ReLU(),
            nn.Linear(256, input_dim),
        )

    def forward(self, x):
        # Simplified sketch: a full VAE encodes a mean and a variance and samples z
        # from that Gaussian; here the latent code is passed straight through for brevity
        z = self.encoder(x)
        reconstruction = self.decoder(z)
        return reconstruction, z

Restricted Boltzmann Machines


Restricted Boltzmann machines learn probability distributions over their inputs, making
them instrumental in recommendation systems. By predicting user preferences for
items like movies or products, RBMs personalize recommendations, enhancing user
experience by leveraging learned user-item interaction patterns. By reviewing the
simplified code sketch below, we can better comprehend the practical implementation of this model.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, visible_size, hidden_size):
        # Small random weights behave better than uniform values in [0, 1)
        self.weights = np.random.randn(visible_size, hidden_size) * 0.01
        self.visible_bias = np.zeros(visible_size)
        self.hidden_bias = np.zeros(hidden_size)

    def sample_hidden(self, v):
        # Probability that each hidden unit turns on, given the visible layer
        probs = sigmoid(v @ self.weights + self.hidden_bias)
        return (np.random.rand(*probs.shape) < probs).astype(float)

    def sample_visible(self, h):
        # Probability of each visible unit, given the hidden layer
        probs = sigmoid(h @ self.weights.T + self.visible_bias)
        return (np.random.rand(*probs.shape) < probs).astype(float)

    def train(self, data, epochs, lr=0.1):
        # One-step contrastive divergence (CD-1), kept minimal for illustration
        for _ in range(epochs):
            for v0 in data:
                h0 = self.sample_hidden(v0)
                v1 = self.sample_visible(h0)
                h1 = self.sample_hidden(v1)
                self.weights += lr * (np.outer(v0, h0) - np.outer(v1, h1))
                self.visible_bias += lr * (v0 - v1)
                self.hidden_bias += lr * (h0 - h1)

Pixel Recurrent Neural Networks


Pixel Recurrent Neural Networks (PixelRNNs) generate coherent and detailed
images pixel by pixel, considering the arrangement of previously generated pixels.
This capability is crucial for generating textures in virtual reality environments or
for photo editing applications where filling in missing parts of images with coherent
detail is required. A walkthrough of the simplified code sketch will help us grasp the model's
implementation structure.

import torch
from torch import nn

class PixelRNN(nn.Module):
    def __init__(self, input_dim):
        super().__init__()
        self.rnn = nn.LSTM(input_dim, input_dim, batch_first=True)

    def forward(self, x, num_pixels=64):
        # Illustrative autoregressive loop: starting from a seed pixel x of shape
        # (batch, 1, input_dim), each step predicts the next pixel from everything
        # generated so far and feeds that prediction back in as the next input
        pixels = [x]
        hidden = None
        for _ in range(num_pixels - 1):
            out, hidden = self.rnn(pixels[-1], hidden)
            pixels.append(out)
        generated_image = torch.cat(pixels, dim=1)   # (batch, num_pixels, input_dim)
        return generated_image


Generative Models in Society and Technology


As we embark on the exploration of generative models, we delve into a domain where
artificial intelligence not only mirrors the complexities of human creativity but also
propels it into new dimensions. These models stand at the confluence of technology
and society, offering groundbreaking solutions, enhancing creative endeavors, and
presenting new challenges. Their integration into various sectors underscores a
transformative era in AI application, where the potential for innovation is boundless yet
accompanied by the imperative of ethical stewardship.

Real-World Applications and Advantages of Generative AI


Generative models are not just about creating new data; their advantages span a wide
array of applications, significantly impacting various facets of human civilization. Their
transformative effects can be seen in the following areas, ordered by their potential to
reshape industries and improve lives:

Healthcare and Medical Research: Generative models are a boon to healthcare, especially in data-limited areas. They can
synthesize medical data for research, facilitating the development
of diagnostic tools and personalized medicine. This ability to
augment datasets is pivotal for training robust AI systems that can
predict diseases and recommend treatments, potentially saving
lives and improving healthcare outcomes worldwide.

Security and Fraud Detection: In the financial sector, generative models enhance security by identifying anomalous patterns
indicative of fraudulent transactions. Their capacity to understand
and model normal transactional behavior enables them to
pinpoint outliers with high accuracy, safeguarding financial assets
and consumer trust in banking systems.

Design and Creativity: The impact of generative models in design and creative industries is profound. They foster innovation by
generating novel concepts in architecture, product design, and
even fashion, challenging traditional boundaries and inspiring

13
Exploring the Variety of Random
Documents with Different Content
three weeks before, and he had blazed away nearly all his
gunpowder. Pouring forth invectives and blaming every one but
himself, Lally decamped on the night of the 17th as secretly and
expeditiously as he could.
In March, 1760, Call was employed in reducing Karikal, and at the
latter end of the year and in the beginning of 1761 he was employed
as chief engineer under Sir Eyre Coote in the reduction of
Pondicherry, which, after it had been battered furiously during two
days, surrendered at discretion. Then the town and fortifications
were levelled with the ground. A few weeks after the strong hill-
fortress of Gingi surrendered, and the military power of the French
in the Carnatic was brought to an end.
In 1762 Call had the good fortune, when serving under General
Cailland, to effect the reduction of the strong fortress of Vellore, one
hundred miles west of Madras, which has since been the point
d'appui of the English power in the Carnatic.
In July, 1763, Mahomed Usuff Cawn, a native of great military talent,
employed in the service of the English, for usurping the government
of Madura and Tinnevelly, the two southernmost provinces of the
peninsula, had to be dealt with summarily. A considerable force
marched against him, under the command of Colonel Monson, of His
Majesty's 69th Regiment. Call acted as chief engineer under him, till
the heavy rains in October obliged the English army to retire from
before Madura. Eventually that place and Palamata were reduced,
and Mahomed Usuff Cawn was taken and hanged.
At the latter end of 1764 Call went into the Travancore country to
settle with the Rajah for the arrears of tribute due to the Nabob of
Arcot. Having satisfactorily accomplished that business and other
concerns with southern princes, he returned to Madras in January,
1765, and took his seat at the Civil Council, to which he was entitled
by rotation, and he obtained the rank of colonel.
During a great part of the war with Hyder Ali in 1767 and 1768 Call
accompanied the army into the Mysore country, and whilst he was
there the Company advanced him to the third seat in the Council,
and he was strongly recommended by Lord Clive to succeed to the
government of Madras on the first vacancy. But news reached him of
the death of his father, and he made up his mind to return to
England. He had managed to scrape together a very considerable
fortune, and he desired to spend the rest of his days in the
enjoyment of it. He embarked on February 8th, 1770, after a service
of nearly twenty years, and he landed at Plymouth on July 26th.

WHITEFORD. THE RESIDENCE OF SIR JOHN CALL


From a drawing in the possession of Mrs. de Lacy Lacy

He bought Whiteford, in the parish of Stoke Climsland, and greatly enlarged the house. In 1771 he was appointed Sheriff of Cornwall,
and in March, 1772, he married Philadelphia, third daughter of Wm.
Battye, m.d., a somewhat distinguished physician living in
Bloomsbury.
From this period till the autumn of 1782 he lived in retirement at
Whiteford.
Whilst in India, Call had not forgotten his parents and sister at
home, and had sent to his mother priceless Indian shawls, which
she, not knowing their value, cut up and turned into under-
petticoats for herself and daughter and maids. A pipe of Madeira
sent to the father was also as little appreciated. It was distributed
among the farm-labourers during harvest time to economize the
cider.
Now that he was in England and wealthy, he resolved on doing
something for his sister. She had married Cadwalader Jones, the
vicar of the parish, and the vicarage was a small, mean building, so
Cadwalader Jones had taken the manor house that was near the
church on a long lease from the Orchards, who were lords of the
manor. This house had been a cell of Hartland Abbey, but at the
Restoration had been given to the Chammonds. That family had died
out, and now it had come to the Orchards, owners of Hartland
Abbey. Call rebuilt the house, or, to be more exact, built on a
modern house to the old, and installed Cadwalader and his sister in
the new mansion; he also made for them a large walled garden.
When he did this, he was under the impression that the property
belonged to Cadwalader, and not till he had completed his building
did he learn that Mr. Jones had only a lease of it. Moreover, Mrs.
Jones did not live to enjoy the new house very long, as she died in
1780, and then Cadwalader married again. In course of time
Cadwalader went to join his ancestors, and thereupon Mr. Hawkey
saw and loved the widow and the mansion, and married her. Thus it
came about that the manor house built for Mrs. Jane Jones passed
into other hands. But thus it happens also that through Miss
Charlotte Hawkey we have some account of Sir John Call.
Lord Shelburne, when Prime Minister, being desirous of investigating
some of the existing abuses and reforming some of the public
departments, fixed on Call and engaged him along with Mr. Arthur
Holdsworth, of Dartmouth, to inquire into the state and
management of Crown lands, woods, and forests, which had long
been neglected; Call had seen this with regard to the Duchy
property at his doors, and had drawn attention to it. In November,
1782, they made their first report; but a change of Ministry taking
place soon after, their proceedings were interrupted till the Duke of
Portland, then First Lord of the Treasury, authorized them to
continue their investigation. Before they had gone far another
change took place in the Ministry, and Pitt became Prime Minister.
These frequent interruptions interfered with the progress of the
investigation, and to obviate that, in 1785-6 Sir Charles Middleton,
Call, and Holdsworth were appointed permanent Parliamentary
Commissioners.
Call became a banker, a manufacturer of plate-glass, and a copper-
smelter. He designed and saw to the execution of the Bodmin gaol in
1779. He was elected M. P. for Callington in 1784, and retained his
seat till 1801. On July 28th, 1791, he was created a baronet, and
granted as his arms, gules, three trumpets fessewise in pale, or; as
crest, a demi-lion ramp. holding between the paws a trumpet erect,
or.
By his wife he had six children. In 1785 he purchased the famous
house of Field-Marshal Wade, in Old Burlington Street. He became
totally blind in 1795, and died of apoplexy at his residence in town
on March 1st, 1801, and was succeeded in the baronetcy by his son,
William Pratt Call, who died in 1851, leaving a son, William Berkeley
Call, the third baronet, who died in 1864, and with the son of this
latter, Sir William George Montague Call, the fourth baronet, the title
became extinct. It will be noticed that the two last affected
aristocratic Christian names, Berkeley and Montague. Whiteford was
sold to the Duchy of Cornwall, and all the noble trees in the park
were cut down and turned into money, and the mansion converted
into an office for the Duchy. Davies Gilbert, in his Parochial History of
Cornwall, tells a couple of anecdotes of Sir John, but they are too
pointless to merit repetition.
Call was one of those admirable, self-made men who have been
empire-makers in the East, and, better than that, have been makers
of the English name as synonymous with all that is powerful and
true and just. He well deserved the title accorded to him. He was a
man of whom Cornwall may be proud, and it needed no trumpets in
his arms and fictions about the origin of his family to make the name
honourable.
As Dr. Johnson said, "There are some families like potatoes, whose
only good parts are underground."
The authorities for the life of Sir John Call are Playfair's British Family
Antiquity, 1809; Clement R. Markham's Memoir on the Indian
Surveys, 1878; H. G. Nicholl's Forest of Dean; and Neota, by
Charlotte Hawkey, 1871.
The grant of the baronetcy to Sir John Call, dated 1795, is now in
the Museum of the Royal Institution of Cornwall, at Truro.
JOHN KNILL
In August, 1853, appeared the following account in the Gentleman's
Magazine:—
"An eccentric old gentleman of the name Knill, a private secretary
some fifty or sixty years ago to the Lord-Lieutenant of Ireland,
becoming afterwards collector of the port of S. Ives, built a three-
sided pyramid of granite on the top of a high hill, near the town of S.
Ives. The pyramid is represented as a pocket edition of an Egyptian
one, and in it this gentleman caused a chamber to be built, with a
stone coffin, giving out his intention to be buried there, and leaving
a charge on an estate to the corporation of S. Ives for the
maintenance and repair, etc., of the pyramid. He, however, died in
London; and by his latest will, so far from perpetuating the
ostentatious idea, desired that his body should be given up to the
surgeons for dissection, a penance, it is supposed, for past follies,
after which the remains were buried in London. The pyramid,
however, still stands as a landmark. On one side, in raised letters in
granite, appear the words 'Hic jacet nil.' It was understood that the
'K' and another 'l' would be added when the projector should be
placed within; and on the other side, 'Ex nihilo nil fit,' to be filled up
in like manner, Knill. The mausoleum obtained then, and still bears
the name of Knill's Folly."
This account, full of inaccuracies, called forth a letter to the editor
from a relative of John Knill, at Penrose, by Helston, dated October,
1853, which appeared in the November issue of the same magazine.
He stated that John Knill was educated for the law, but did not adopt
it as a profession. He preferred to accept the office of collector of
customs at S. Ives. After a while he was sent as Inspector-General of
Customs to the West Indies, whence he returned to his duties at S.
Ives, after having discharged his office of inspectorship. In 1777 the
Earl of Buckinghamshire, who was recorder of S. Ives, invited Mr.
Knill to accompany him to Ireland as his private secretary, when he,
the earl, had been made lord-lieutenant. The offer was accepted.
In 1782, thirty years before his death, he erected the mausoleum,
partly actuated by a philanthropic motive as affording a landmark to
ships approaching the port, and partly by a wish to find employment
for men at a time of considerable distress, having also a desire to be
buried there, if the ground could be consecrated. This intention was
afterwards abandoned.
Mr. Knill resided for some years previous to his death in Gray's Inn,
and was a bencher of that society. He died there in 1811, and was
buried in the vaults of S. Andrew's, Holborn. On one side of the
monument is the word "Resurgam." On the second side, "I know
that my Redeemer liveth," and on the third is no inscription at all,
and the silly puns given by the informant of the Gentleman's
Magazine had no existence save in the imagination of the
correspondent.
The same writer adds: "Though he had a wide circle of
acquaintances and he was highly esteemed by all who knew him, he
resisted every invitation to dine in private society, and for many
years past dined at Dolly's Coffee House, Paternoster Row, walking
through the chief avenues of the town in the course of the day, in
order to meet his friends and to preserve his health by moderate
exercise."
JOHN KNILL
After a picture by Opie in the possession of Captain Rogers of Penrose

We are able to supplement this scanty record from a memoir of him by Mr. John Jope Rogers, of Penrose, published in 1871 by Cunnack, of Helston.
John Knill was born at Callington on January 1st, 1733. His mother
was a Pike of Plympton, and her mother was an Edgcumbe of
Edgcumbe, it is stated in the memoir, but no entry of any such
marriage is in the pedigree of the Edgcumbes in Vivian's Heralds'
Visitations of Devon.
Mr. Knill was very desirous to trace a descent from the family of Knill
of Knill, in Hereford, but entirely failed to do so.
John Knill's mother, one of the seven daughters of Mr. Pike, married
secondly Mr. Jope, and it is thus that the portrait of the subject of
this memoir came into the possession of Mr. John Jope Rogers, of
Penrose, author of the memoir.
John Knill, according to Davies Gilbert, "served his clerkship as an
attorney in Penzance, and from thence removed to the office of a
London attorney, where, having distinguished himself by application
and intelligence, he was recommended to the Earl of
Buckinghamshire, who, at that time, held the political interests of S.
Ives, to be his local agent." In the year 1762 he was appointed
collector of customs at S. Ives, in Cornwall, and held it during twenty
years, at the end of which time he wrote to Mr. William Praed, March
30th, 1782: "I purpose to be in London in May, in order to resign my
office of collector, which I shall finally quit at the end of next
midsummer quarter."
In November, 1767, he was chosen mayor of S. Ives, and lived in a
red-brick house facing the beach, in Fore Street. Although mayor
and collector of customs, it was strongly believed that he was in
league with smugglers and wreckers.
One day, during the latter half of the eighteenth century, a strange
vessel ran on the rocks on the Hayle side of Carrick Gladden, and
the crew escaped to land and disappeared. The ship, now a derelict,
had apparently no owner, and next day a number of people boarded
her, and found her full of chinaware and other smuggled goods. The
ship's papers could not be found; they had been carried off when
the crew deserted her, and it was strongly supposed that they were
destroyed, as implicating Knill and Praed, of Trevetho. The customs
officer, Roger Wearne, went on board and stuffed his clothes full of
china; having a pair of trousers on with a very ample and baggy
seat, he thought he could not do better than stow away some of the
choicest pieces of porcelain there. But as he was getting down the
side of the ship into the boat, very leisurely, so as not to injure his
spoils, a comrade, getting impatient, struck him on the posteriors
with the blade of his oar, shouting to him, "Look out sharp, Wearne!"
and was startled at the cracking noise that ensued, and the howl of
Wearne when the broken splinters of china entered his flesh.
In 1773 the Government sent him to Jamaica to inspect the ports
there; he remained in the West Indies one year, and used his eyes
and ears, for in 1779 he wrote an account of the religion of the
Coromandel negroes for Bryant Edwards' History of the West Indies,
from information he then and there gathered. For his services he
received from the Board of Customs the substantial sum of £1500.
He returned to his duties at S. Ives in 1774. In 1777 he became
private secretary to the Earl of Buckinghamshire, in Dublin, but he
returned to S. Ives after six months in Ireland. In 1779 he
speculated in a bootless search for treasure, which the notorious
pirate, Captain John Avery, was supposed, on his return from
Madagascar, to have secreted near the Lizard. But, as none of the
Lives of that freebooter gave any hint of his having done so, the
attempt was not the least likely to lead to satisfactory results. Davies
Gilbert says that Knill equipped some small vessels to act as
privateers against smugglers, but if local tradition may be relied on,
these vessels were only nominally for this purpose, and were
actually engaged in running contraband goods; but this is highly
improbable.
GLASS INSCRIBED "SUCCESS TO THE EAGLE FRIGATE, JOHN KNILL
COMMANDER"
From the Collection of Percy Bate, Esq. of Glasgow

In 1782 he was employed in the service of the customs as inspector of some of the western ports, making occasional visits to London,
where he settled for the rest of his days. In 1784 he purchased
chambers in Gray's Inn Square, where he died on March 29th, 1811,
at the age of seventy-seven. He was painted by Opie in 1779,
dressed in a plain suit of blue, with frilled shirt and ruffles. He made
his half-brother, the Rev. John Jope, of S. Cleer, his sole executor.
It was in the year 1782 that John Knill erected his mausoleum on
Worral Hill, on land purchased from Henry, Lord Arundell, for five
guineas. The total cost of the monument was £226 1s. 6d. Sixpence
a year is paid to the owner of Tregenna for a right of way to the
obelisk. By a deed dated May 29th, 1797, Knill settled upon the
mayor and capital burgesses of S. Ives, and their successors for
ever, an annuity of £10 as a rent-charge, to be paid out of the manor
of Glivian, in Mawgan, which sum is annually to be put into a chest
which is not to be opened except at the end of every five years.
Then, out of the accumulated sum, a dinner was to be given to the
mayor, collector of customs, and vicar of S. Ives, and two friends to
be invited by each of them, and £15 to be equally divided among ten
girls, natives of S. Ives, under ten years old, who should, between
10 a.m. and noon on S. James the Apostle's Day, dance and sing
round the mausoleum, to the fiddling of a man who was to receive a
pound for so doing and for fiddling as the procession of girls went to
the obelisk and returned. One pound was to be laid out in white
ribbons for the damsels and a cockade for the fiddler. Some of the
money was to go to keep the mausoleum in repair, and there were
certain benefactions also recorded.
The first Knillian celebration took place in July, 1801, when,
according to the will of the founder, a band of little girls, all dressed
in white, with two widows and a company of musicians, marched in
procession to the top of the hill, where they danced about the
monument, then, as Knill desired, sang the Hundredth Psalm to its
old melody, and after that returned in the same order to S. Ives. The
ceremony still takes place every fifth year.
In dancing the children sing the following in chorus:—

Shun the bustle of the bay,
Hasten, virgins, come away;
Hasten to the mountain's brow,
Leave, O leave, S. Ives below.
Haste to breathe a purer air,
Virgins fair, and pure as fair;
Fly S. Ives and all her treasures,
Fly her soft voluptuous pleasures;
Fly her sons and all their wiles,
Lushing in their wanton smiles;
Fly the splendid midnight halls;
Fly the revels of her balls;
Fly, O fly the chosen seat,
Where vanity and fashion meet.
Hither hasten from the ring,
Round the tomb in chorus sing,
And on the lofty mountain's brow, aptly dight,
Just as we should be, all in white,
Leave all our troubles and our cares below.
THOMAS TREGOSS
A certain Roscadden going on a pilgrimage in the days before the
Reformation, and being absent some years, was surprised on his
return to find that his wife had borne one if not more children. Very
much and very naturally put out, he consulted with one John
Tregoss, who advised him to settle his estate upon some friend
whom he could trust, for the use and benefit of his children whom
he would own, and for the wife not to be left absolutely destitute in
the event of his death. Mr. Roscadden approved of this counsel, and
constituted John Tregoss his heir absolutely, but always with the
understanding that the said Tregoss should administer his estate
according to the wishes and instructions of Roscadden. But this
gentleman dying soon after, John Tregoss entered on possession of
the estate, "turned the wife and children out of doors, who for some
time were fain to lye in an hog-stye, and every morning went forth
to the Dung-hill, and there upon their faces imprecated and prayed
that the vengeance of God might fall upon Tregoss and his posterity
for this so perfidious and merciless deed.
"And after this, God's severe but righteous judgments fell upon
Tregoss's family. For his son Walter, one day riding upon a Horse in a
fair way, the horse threw him, and broke his neck: and some of his
issue came to untimely ends, and it is observed that a curse hath
remained ever since: and this Mr. Tregoss of whom we write was so
sensible of it, that it cost him many fervent prayers to God for the
removal of that dreadful curse, as himself assured a bosom friend"—
but it does not seem to have occurred to him to give up the heritage
to the Roscaddens—that is, if he were the possessor.
The family of Tregose, or Tregosse, was one of the oldest in the
neighbourhood of S. Ives. The names of Clement and John Tregose
of S. Ives appear in the Subsidy Roll of 1327. In the list of circa
1520, Thomas Tregoos' lands in Towednack were assessed at the
yearly value of 13s. 4d., and those of John Tregoz, in the parish of S.
Ives, at 11s.; but Thomas also had lands at S. Ives, valued the same
as those of John.
In 1641, William Tregose, gent., had at S. Ives goods to the annual
value of £3.
Thomas Tregoss, the subject of this notice, was the son of William
Tregoss of S. Ives. His parents were strong Puritans and very
austere, and they hedged about their son with restrictions, not
suffering him to partake in games or any childish relaxations from
the strain of study or the contemplation of religious themes. At first
he seemed to be of poor capacity, but at the age of seven years he
began to show that he had a quick apprehension and a retentive
memory. Cut off from all worldly distractions, he was allowed but
one direction in which his faculties and his ambitions could stretch
and expand. He had not the force of character and strength of will to
revolt against the numbing restraints that bound him in. His only
play as a boy was standing on a chair and preaching to his fellow
pupils.
He was sent to Oxford and admitted into Exeter College, and after a
few years spent there, returned to S. Ives; and as the Parliamentary
Commissioners had ejected the vicar, he was thrust in as Puritan
preacher in 1657, and he then married a Margaret Sparrow of the
same way of thinking.
The life of Thomas Tregoss, as given by Samuel Clark in his Lives of
Some Eminent Persons, 1683, is interspersed with Remarkable
Providences and Extraordinary Judgments, but for the most part
they are neither remarkable nor interesting.
The following is, perhaps, an exception:—
Shortly after his arrival at S. Ives, in the summer, the greater portion
of the fishing season had passed without the pilchards appearing,
and this to the great distress of the people. By the advice of Tregoss
a day was set apart for humiliation and prayer, and next day a shoal
of pilchards arrived.
In the ensuing summer the fishermen, having taken a great number
of fishes on the Saturday, wanted to spread and dry their nets on
the Sunday. Tregoss learning this, came forth and rebuked and
denounced God's judgment on them if they should profane the
"Sabbath" in this manner. They did not hearken to him, observing
that their nets must be dried or would rot. From that day no more
pilchards visited the bay during that season.
From S. Ives Tregoss was transferred to Mylor in October, 1659, but
was ejected from the living on August 24th, 1660, as not ordained,
and unwilling to receive ordination, and to subscribe to the articles
and conform to the liturgy. However, he continued to preach to a
privately assembled number of puritanically minded people, and he
was proceeded against and committed to the custody of the marshal
in Launceston gaol, where he remained for three months, and was
then released by order of the Deputy Lieutenant.
In September, 1663, he removed to Kigilliath, near Penryn. On
October 1st, 1664, whilst he and his wife were lying awake in bed,
they experienced an earthquake shock, and this he held to be "a
symbolick image of that trembling Heartquake which he shortly felt
in his conversion."
On January 1st ensuing, he fell into deep despondency and the spirit
of bondage—his liver being probably out of order—till he fancied
himself relieved by receiving the spirit of adoption. He had been
converted half a dozen times before, but never before preceded by
an earthquake, so that there could be no mistake about its reality
this time.
Fired with new zeal, he broke into Mabe church at the head of a
number of his adherents, mounted the pulpit, and harangued his
congregation. For this he was arrested and imprisoned again in
Launceston gaol, but was shortly released, July 29th, 1665; and he
had the pleasing satisfaction of knowing that a bull had gored
Justice Thomas Robinson, who had sent him to prison.
Undeterred by what he had gone through, he again invaded Mabe
church, and was again committed to gaol on September 18th, but
was once more released, on December 14th.
On February 4th, 1666, he once more broke into the parish church
of Mabe at the head of a body of Puritans, and was again arrested
and sent to the marshal at Bodmin, but by the order of the King was
at once set free.
In 1669 he was at Great Torrington, where he preached, and was
sent to Exeter gaol, but was at once bailed out. He died at Penryn in
January, 1672.
On September 4th, 1775, John Wesley preached at S. Ives "in the
little meadow above the town." He wrote in his diary that "the
people in general here (excepting the rich) seem almost persuaded
to be Christians. Perhaps the prayer of their old pastor, Mr. Tregoss,
is answered even to the fourth generation."
ANTHONY PAYNE
Anthony Payne, the "Falstaff of the West," was born in the manor
house, Stratton, the son of a tenant farmer, under the Grenvilles of
Stowe. The registers do not go back sufficiently far to record the
date of his birth. The Tree Inn is the ancient manor house in which
the giant first saw the light. He rapidly shot up to preternatural size
and strength. So vast were his proportions as a boy, that his
schoolmates were accustomed to work out their arithmetic lessons in
chalk on his back, and sometimes even thereon to delineate a map
of the world, so that he might return home, like Atlas, carrying the world on his shoulders, for his father to dust out with a stick.
It was his delight to tuck two urchins under his arms, one on each
side, and climb, so encumbered with "his kittens," as he called them,
to a height overhanging the sea, to their infinite terror, and this he
would call "showing them the world." A proverb still extant in
Cornwall, expressive of some unusual length, is "As long as Tony
Payne's foot."
At the age of twenty-one he was taken into the establishment at
Stowe. He then measured seven feet two inches in height without
his shoes, and he afterwards grew two inches higher. He was not tall
and lanky, but stout and well proportioned in every way. The original
mansion of the Grenvilles at Stowe still in part remains as a
farmhouse. The splendid house of Stowe, built by the first Earl of
Bath, was pulled down shortly after 1711, and it was said that men
lived who had seen the stately palace raised and also levelled with
the dust. This was at a little distance further inland than the old
Stowe that remains. The Grenvilles had also a picturesque house at
Broom Hill, near Bude, with fine Elizabethan plaster-work ceilings,
now converted into labourers' cottages.
At Stowe Anthony Payne delighted in exhibiting his strength. In the
hurling-ground a rough block of stone is still pointed out as "Payne's
cast," lying full ten paces beyond the reach whereat the ordinary
player could "put the stone."
It is said that one Christmas Eve the fire languished in the hall. A
boy with an ass had been sent into the wood for faggots. Payne
went to hurry him back, and caught up the ass and his burden, flung
them over his shoulder, and brought both into the hall and cast them
down by the side of the fire.
On another occasion, being defied to perform the feat, he carried a
bacon-hog from Kilkhampton to Stowe. Then came the Civil War,
when Charles I and his Parliament sought to settle their differences
on the battlefield. Cornwall went for the King, and Anthony Payne
had the drilling and manœuvring of the recruits from Kilkhampton
and Stratton. At one time Sir Beville Grenville had his head-quarters
at Truro, but the great battle of Stamford Hill, May 16th, 1643, was
fought but eight miles from Stowe, and on the night preceding it Sir
Beville Grenville slept in his house at Broom Hill. The battle was
desperate, the Royalist soldiers being outnumbered, and attacked;
amidst them was Anthony Payne, mounted on his sturdy cob
Samson, rallying his troopers and terrorizing the enemy, who fled. At
the next pitched battle at Lansdown, near Bath, the forces of the
King were defeated and Sir Beville was killed. Anthony Payne, having
mounted John Grenville, then a youth of sixteen, on his father's
horse, had led on the Grenville troops to the fight. The Rev. R. S.
Hawker gives a letter from the giant to Lady Grace Grenville,
conveying to her the news of the death of her husband; but it is
more than doubtful whether this be genuine. He says of it: "It still
survives. It breathes, in the quaint language of the day, a noble
strain of sympathy and homage." It does not exist except in Mr.
Hawker's book, and is almost certainly a fabrication by him.
ANTONY PAYNE
From the picture by Sir Godfrey Kneller

At the Restoration, Sir John Grenville was created Earl of Bath, and
was made governor of the garrison of Plymouth, and he then
appointed Payne halberdier of the guns. The King, who held Payne
in great favour, made him a yeoman of his guards, and Sir Godfrey
Kneller, the Court artist, was employed to paint his portrait.
Whilst in Plymouth garrison an incident occurred that has been
recorded by Hawker. At the mess-table of the regiment, during the
reign of William and Mary, on the anniversary of the day when