Building Transformer Models with PyTorch 2.0

NLP, computer vision, and speech processing with PyTorch and Hugging Face (English Edition)

By Prem Timsina

eBook - ePub: English, mobile friendly, available on iOS & Android

About this book

Your key to transformer-based NLP, vision, speech, and multimodal applications

Key Features
● Transformer architecture for different modalities and multimodalities.
● Practical guidelines to build and fine-tune transformer models.
● Comprehensive code samples with detailed documentation.

Description
This book covers transformer architecture for various applications, including NLP, computer vision, speech processing, and predictive modeling with tabular data. It is a valuable resource for anyone looking to harness the power of transformer architecture in their machine learning projects.

The book provides a step-by-step guide to building transformer models from scratch and fine-tuning pre-trained open-source models. It explores foundational model architectures, including GPT, ViT, Whisper, TabTransformer, and Stable Diffusion, along with the core principles for solving various problems with transformers. The book also covers transfer learning, model training, and fine-tuning, and discusses how to utilize recent models from Hugging Face. Additionally, it explores advanced topics such as model benchmarking, multimodal learning, reinforcement learning, and deploying and serving transformer models.

In conclusion, this book offers a comprehensive and thorough guide to transformer models and their various applications.
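
To give a flavor of the workflow described above, here is a minimal sketch (not taken from the book's code) of loading a pretrained Hugging Face model and running one fine-tuning step with PyTorch 2.0's torch.compile; the checkpoint name, toy sentences, and labels are illustrative assumptions.

    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    # Illustrative checkpoint; any sequence-classification model from the Hub works similarly.
    model_name = "distilbert-base-uncased"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

    # PyTorch 2.0: compile the model for faster training and inference.
    model = torch.compile(model)

    # One fine-tuning step on toy data.
    optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
    batch = tokenizer(["I loved this book.", "Not my cup of tea."],
                      padding=True, return_tensors="pt")
    labels = torch.tensor([1, 0])

    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()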

What you will learn
● Understand the core architecture of various foundational models, both single-modality and multimodal.
● Follow a step-by-step approach to developing transformer-based machine learning models.
● Utilize various open-source models to solve your business problems.
● Train and fine-tune various open-source models using PyTorch 2.0 and the Hugging Face ecosystem.
● Deploy and serve transformer models (see the sketch after this list).
● Apply best practices and guidelines for building transformer-based models.
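
As a taste of the deployment topics, below is a minimal sketch (not the book's code) that exports a Hugging Face classifier to ONNX so it can be served outside Python; the checkpoint name, output file path, and axis names are illustrative assumptions.

    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    # Illustrative checkpoint; in practice this would be your fine-tuned model.
    model_name = "distilbert-base-uncased"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name)
    model.config.return_dict = False  # export a plain tuple of outputs
    model.eval()

    # Trace with a representative input; dynamic axes keep batch and sequence length flexible.
    dummy = tokenizer("An example sentence.", return_tensors="pt")
    torch.onnx.export(
        model,
        (dummy["input_ids"], dummy["attention_mask"]),
        "model.onnx",
        input_names=["input_ids", "attention_mask"],
        output_names=["logits"],
        dynamic_axes={
            "input_ids": {0: "batch", 1: "sequence"},
            "attention_mask": {0: "batch", 1: "sequence"},
            "logits": {0: "batch"},
        },
    )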

Who this book is for
This book caters to data scientists, machine learning engineers, developers, and software architects interested in the world of generative AI.

Table of Contents
1. Transformer Architecture
2. Hugging Face Ecosystem
3. Transformer Model in PyTorch
4. Transfer Learning with PyTorch and Hugging Face
5. Large Language Models: BERT, GPT-3, and BART
6. NLP Tasks with Transformers
7. CV Model Anatomy: ViT, DETR, and DeiT
8. Computer Vision Tasks with Transformers
9. Speech Processing Model Anatomy: Whisper, SpeechT5, and Wav2Vec
10. Speech Tasks with Transformers
11. Transformer Architecture for Tabular Data Processing
12. Transformers for Tabular Data Regression and Classification
13. Multimodal Transformers, Architectures and Applications
14. Explore Reinforcement Learning for Transformer
15. Model Export, Serving, and Deployment
16. Transformer Model Interpretability, and Experimental Visualization
17. PyTorch Models: Best Practices and Debugging


Table of contents

  1. Cover
  2. Title Page
  3. Copyright Page
  4. Dedication Page
  5. About the Author
  6. About the Reviewer
  7. Acknowledgement
  8. Preface
  9. Table of Contents
  10. 1. Transformer Architecture
  11. 2. Hugging Face Ecosystem
  12. 3. Transformer Model in PyTorch
  13. 4. Transfer Learning with PyTorch and Hugging Face
  14. 5. Large Language Models: BERT, GPT-3, and BART
  15. 6. NLP Tasks with Transformers
  16. 7. CV Model Anatomy: ViT, DETR, and DeiT
  17. 8. Computer Vision Tasks with Transformers
  18. 9. Speech Processing Model Anatomy: Whisper, SpeechT5, and Wav2Vec
  19. 10. Speech Tasks with Transformers
  20. 11. Transformer Architecture for Tabular Data Processing
  21. 12. Transformers for Tabular Data Regression and Classification
  22. 13. Multimodal Transformers, Architectures and Applications
  23. 14. Explore Reinforcement Learning for Transformer
  24. 15. Model Export, Serving, and Deployment
  25. 16. Transformer Model Interpretability, and Experimental Visualization
  26. 17. PyTorch Models: Best Practices and Debugging
  27. Index