Generative AI with Python and PyTorch, Second Edition

This is the code repository for Generative AI with Python and PyTorch, Second Edition, published by Packt.

Hands-on projects and cutting-edge techniques using generative adversarial networks and LLMs

Joseph Babcock, Raghav Bali

Free PDF | Graphic Bundle | Amazon

About the book


Become an expert in generative AI through practical projects that leverage cutting-edge models for Natural Language Processing (NLP) and computer vision. Generative AI with Python and PyTorch, Second Edition, is your comprehensive guide to creating advanced AI applications. Leveraging Python, this book provides a detailed exploration of the latest generative AI technologies.

From NLP to image generation, this edition dives into practical applications and the underlying theories that enable these technologies. By integrating the latest advancements and applications of large language models, this book prepares you to design and implement powerful AI systems that transform data into actionable insights.

You’ll build your LLM toolbox by learning about various models, tools, and techniques, including GPT-4, LangChain, RLHF, LoRA, and retrieval-augmented generation (RAG); a minimal LoRA sketch appears at the end of this overview. This deep learning book shows you how to generate images and apply style transfer using GANs, before implementing CLIP and diffusion models.

Whether you’re creating dynamic content or developing complex AI-driven solutions, Generative AI with Python and PyTorch, Second Edition, equips you with the knowledge to use Python and AI to their full potential.
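As a flavor of the parameter-efficient fine-tuning techniques in the LLM toolbox, the sketch below wraps a small causal language model with LoRA adapters using the Hugging Face peft library. It is a minimal illustration rather than code from the book's notebooks: the base model (gpt2), rank, and target modules (c_attn, the attention projection in GPT-2-style models) are assumptions chosen for brevity.

```python
# Minimal LoRA setup (sketch): wrap a small causal LM with low-rank adapters
# so that only a tiny fraction of the weights is trainable.
# Assumes `pip install transformers peft torch`.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("gpt2")  # illustrative base model

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the adapter output
    lora_dropout=0.05,
    target_modules=["c_attn"],  # attention projection in GPT-2-style models
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of parameters are trainable
```

From here, the wrapped model can be fine-tuned with a standard PyTorch or transformers Trainer loop; only the adapter weights are updated.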

Key Learnings

  • Understand the core concepts behind large language models and their capabilities
  • Craft effective prompts using chain-of-thought, ReAct, and prompt query language to guide LLMs toward your desired outputs
  • Learn how attention and transformers have changed NLP
  • Optimize your diffusion models by combining them with VAEs
  • Build several text generation pipelines based on LSTMs and LLMs (a minimal pipeline sketch follows this list)
  • Leverage the power of open-source LLMs, such as Llama and Mistral, for various tasks
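The sketch below shows the smallest possible version of such a pipeline: loading GPT-2 through the Hugging Face transformers library and generating a continuation for a prompt. It is a minimal sketch assuming transformers and torch are installed; the prompt and generation settings are illustrative only.

```python
# Minimal text-generation pipeline (sketch) using Hugging Face transformers.
# GPT-2 is used as a small, freely available example model; the chapter
# notebooks build the same idea out with larger, instruction-tuned LLMs.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Generative models learn to"
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=1)

print(outputs[0]["generated_text"])
```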

Chapters

Each chapter's notebooks are listed below; in the repository, every notebook carries badges to open it directly in Colab, Kaggle, Gradient, or Studio Lab.

Chapter 1: An Introduction to Generative AI: "Drawing" Data from Models
Chapter 2: Building Blocks of Deep Neural Networks
Chapter 3: The Rise of Methods for Text Generation
  • 01_word2vec.ipynb
  • 02_fasttext.ipynb
  • 03_character_language_model.ipynb
Chapter 4: NLP 2.0: Using Transformers to Generate Text
  • 01_positional_encodings.ipynb
  • 02_encoder_transformers_nlp_tasks.ipynb
  • 03_gpt2_headlines_generator.ipynb
Chapter 5: Foundations of LLMs
  • 01_instruction_tuning.ipynb
  • 02_RLHF_gpt2_positive_reviewer.ipynb
Chapter 6: Open Source LLMs
Chapter 7: Prompt Engineering
  • 01_prompt_engineering.ipynb
Chapter 8: LLM Toolbox/Ecosystem
  • Chapter8.ipynb
Chapter 9: LLM Optimisation Techniques
  • 01_llm_training_and_scaling.ipynb
  • 02_pretraining_optimizations.ipynb
  • 03_finetuning_optimizations.ipynb
  • 04_instruction_tuning_llama_t2sql.ipynb
Chapter 10: Emerging Applications in Generative AI
Chapter 11: Neural Networks Using VAEs
  • Chapter11.ipynb
Chapter 12: Image Generation with GANs
  • 01_vanilla_gan.ipynb
  • 02_deep_convolutional_gan.ipynb
  • 03_conditional_gan.ipynb
  • 04_progressive_gan.ipynb
Chapter 13: Style Transfer with GANs
  • cyclegan.ipynb
  • pix2pix.ipynb
Chapter 14: Deepfakes with GANs
  • 01_dlib_facial_landmarks_demo.ipynb
  • 02_face_recognition_demo.ipynb
  • 03_reenactment_pix2pix_training.ipynb
  • 04_reactment_pix2pix.ipynb
Chapter 15: Diffusion Models and AI Art

Requirements for this book

A basic understanding of Python syntax and some programming experience will help you follow the majority of the code base. An intermediate-level understanding of machine learning and deep learning concepts will also help you appreciate the more complex generative models and techniques discussed throughout the book. A quick setup guide follows:

Hardware (Minimum)

  • 512 GB HDD
  • 32 GB RAM
  • Intel Core i5 processor or better / Apple Silicon M1 or better
  • Access to a 32 GB graphics card or better (T4 or better)

Software

  • Python 3.11 or above
  • PyTorch 2.5.x or above (a quick version check follows below)
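To confirm that a local environment meets these versions, a quick check like the one below can be run first (a minimal sketch that assumes only the standard torch package is installed):

```python
# Quick environment check for the software versions recommended above.
# Assumes `pip install torch`.
import sys

import torch

assert sys.version_info >= (3, 11), f"Python 3.11+ recommended, found {sys.version.split()[0]}"

torch_major, torch_minor = (int(part) for part in torch.__version__.split(".")[:2])
assert (torch_major, torch_minor) >= (2, 5), f"PyTorch 2.5+ recommended, found {torch.__version__}"

# A GPU is optional but recommended for the GAN, diffusion, and LLM chapters.
print("CUDA available:", torch.cuda.is_available())
print("Apple MPS available:", torch.backends.mps.is_available())
```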

Browser

  • Chrome, Safari, or Firefox for executing code directly via Google Colab or other cloud services

Get to know the authors

Joseph Babcock has spent over a decade working with big data and AI in the e-commerce, digital streaming, and quantitative finance domains. Throughout his career, he has worked on recommender systems, petabyte-scale cloud data pipelines, A/B testing, causal inference, and time series analysis. He completed his PhD studies at Johns Hopkins University, applying machine learning to drug discovery and genomics.

Raghav Bali is a Staff Data Scientist at Delivery Hero, a leading food delivery service headquartered in Berlin, Germany. With over 12 years of expertise, he specializes in the research and development of enterprise-level solutions that leverage machine learning, deep learning, natural language processing, and recommendation engines for practical business applications. Beyond his professional work, Raghav is an esteemed mentor and an accomplished public speaker. He has contributed to multiple peer-reviewed papers, authored several well-received books, and holds co-inventor credits on multiple patents in healthcare, machine learning, deep learning, and natural language processing.
