Transformers for Natural Language Processing
eBook - ePub

Denis Rothman

  1. 564 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS and Android

About this book

OpenAI's GPT-3, ChatGPT, GPT-4, and Hugging Face transformers for language tasks in one book. Get a taste of the future of transformers, including computer vision tasks and code writing and assistance. Purchase of the print or Kindle book includes a free eBook in PDF format.

Key Features

  • Improve your productivity with OpenAI's ChatGPT and GPT-4, from prompt engineering to creating and analyzing machine learning models
  • Pretrain a BERT-based model from scratch using Hugging Face
  • Fine-tune powerful transformer models, including OpenAI's GPT-3, to learn the logic of your data (a sketch of this workflow follows the list)
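
For a flavor of that GPT-3 fine-tuning workflow, here is a minimal, hypothetical sketch using the legacy openai Python client (pre-v1.0, as it looked around this book's publication). The API key and the training file train.jsonl are placeholders, not listings from the book.

```python
# Hypothetical sketch: fine-tuning GPT-3 with the legacy openai-python client.
# "train.jsonl" holds {"prompt": ..., "completion": ...} records (placeholder file).
import openai

openai.api_key = "sk-..."  # placeholder API key

# Upload the training file to OpenAI.
uploaded = openai.File.create(file=open("train.jsonl", "rb"), purpose="fine-tune")

# Launch a fine-tuning job on a base model and print its job ID.
job = openai.FineTune.create(training_file=uploaded["id"], model="davinci")
print(job["id"])
```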

Book Description

Transformers are...well...transforming the world of AI. There are many platforms and models out there, but which ones best suit your needs? Transformers for Natural Language Processing, 2nd Edition, guides you through the world of transformers, highlighting the strengths of different models and platforms while teaching you the problem-solving skills you need to tackle model weaknesses.

You'll use Hugging Face to pretrain a RoBERTa model from scratch, from building the dataset to defining the data collator to training the model. If you're looking to fine-tune a pretrained model, including GPT-3, then Transformers for Natural Language Processing, 2nd Edition, shows you how with step-by-step guides.

The book investigates machine translation, speech-to-text, text-to-speech, question answering, and many more NLP tasks. It provides techniques to solve hard language problems and may even help with fake news anxiety (read Chapter 13 for more details).

You'll see how cutting-edge platforms, such as OpenAI, have taken transformers beyond language into computer vision tasks and code creation using DALL-E 2, ChatGPT, and GPT-4. By the end of this book, you'll know how transformers work, how to implement them, and how to resolve issues like an AI detective.
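
As a hedged illustration of that pretrain-from-scratch pipeline (dataset, data collator, training), here is a minimal sketch with the Hugging Face transformers library. The corpus path corpus.txt, the output directory, and the model sizes are placeholder assumptions, and a real run would train its own tokenizer rather than borrow the roberta-base one.

```python
# Minimal sketch: pretraining a RoBERTa-style model from scratch with Hugging Face.
from transformers import (
    DataCollatorForLanguageModeling, LineByLineTextDataset, RobertaConfig,
    RobertaForMaskedLM, RobertaTokenizerFast, Trainer, TrainingArguments,
)

# Borrow an existing tokenizer for brevity (a real project trains its own).
tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")

# A small configuration; the model weights are randomly initialized.
config = RobertaConfig(vocab_size=tokenizer.vocab_size,
                       num_hidden_layers=6, num_attention_heads=12)
model = RobertaForMaskedLM(config)

# Build the dataset from a plain-text corpus (placeholder path) and the
# data collator that applies dynamic masked-language-model masking.
dataset = LineByLineTextDataset(tokenizer=tokenizer,
                                file_path="corpus.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                           mlm=True, mlm_probability=0.15)

# Train with the Trainer API.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="./roberta-from-scratch",
                           num_train_epochs=1, per_device_train_batch_size=16),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()
```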

What you will learn

  • Discover new techniques to investigate complex language problems
  • Compare and contrast the results of GPT-3 against T5, GPT-2, and BERT-based transformers
  • Carry out sentiment analysis, text summarization, casual speech analysis, machine translation, and more using TensorFlow, PyTorch, and GPT-3 (a minimal example follows this list)
  • Find out how ViT and CLIP label images (including blurry ones!) and create images from a sentence using DALL-E
  • Learn the mechanics of advanced prompt engineering for ChatGPT and GPT-4
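
As a taste of the sentiment-analysis bullet above, here is a tiny illustrative snippet (not a listing from the book) using Hugging Face's high-level pipeline API, which downloads a default English sentiment model on first use:

```python
# Illustrative snippet: sentiment analysis with the Hugging Face pipeline API.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Transformers are...well...transforming the world of AI.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```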

Who this book is for

If you want to learn about and apply transformers to your natural language (and image) data, this book is for you. You'll need a good understanding of Python and deep learning and a basic understanding of NLP to get the most from this book. Many of the platforms covered provide interactive user interfaces, which allow readers with a general interest in NLP and AI to follow several chapters. And don't worry if you get stuck or have questions; this book gives you direct access to our AI/ML community to help guide you on your transformers journey!


Frequently asked questions

How do I cancel my subscription?
Simply head over to the account section in settings and click on "Cancel subscription". It's as simple as that! After you cancel, your subscription will stay active for the rest of the period you've paid for. Find out more here.
Can I/how do I download books?
At the moment, all of our mobile-friendly ePub books are available to download via the app. Most of our PDFs are also available to download, and the rest will be downloadable very soon. Find out more here.
What is the difference between the pricing plans?
Both plans give you full access to the library and all of Perlego's features. The only differences are the price and the subscription period: with the annual plan you'll save around 30% compared to 12 months on the monthly plan.
What is Perlego?
We are an online academic-book subscription service, where you can get access to an entire library for less than the price of a single book per month. With over 1 million books across more than 1,000 topics, we've got you covered! Find out more here.
Do you support text-to-speech?
Look out for the Listen symbol on your next book to see if you can listen to it. The Listen tool reads text aloud for you, highlighting the passage as it is being read. You can pause it, speed it up, or slow it down. Find out more here.
Is Transformers for Natural Language Processing an online PDF/ePUB?
Yes, you can access Transformers for Natural Language Processing by Denis Rothman in PDF and/or ePUB format, as well as other popular books in Computer Science and Desktop Applications. We have over one million books available in our catalogue for you to explore.

Information

Year
2022
ISBN
9781803243481

Index

Symbols
345M-parameter GPT-2 model
  downloading 475, 476
A
accuracy score 125
Allen Institute for AI
  reference link 257
AllenNLP 343
  URL 343
Amazon Web Services (AWS) 1, 12, 392
artificial intelligence, properties
  computing power 6
  data 5
  model architecture 5
  prompt engineering 6
attention heads 459
attention masks
  creating 76
Automated Machine Learning (AutoML) 120
automatic question generation 303, 304
B
BERT-based transformer
  architecture 258
  basic samples 261-267
  difficult samples 267-273
  running 258
  SRL experiments 259, 260
BERT-base multilingual model 324, 325
BERT model
  architecture 62
  attention masks, creating 76
  batch size, selecting 77
  BERT tokenizer, activating 75
  BERT tokens, adding 75
  configuration 78, 80
  CUDA, specifying as device for torch 72
  data, converting into torch tensors 77
  data, processing 76
  dataset, loading 73-75
  data, splitting into training set 76
  data, splitting into validation set 76
  encoder stack 62-65
  fine-tuning 68-70
  hardware constraints 71
  holdout dataset, used for evaluating 86, 87
  holdout dataset, used for predicting 86, 87
  Hugging Face BERT uncased base model, loading 80-82
  Hugging Face PyTorch interface, installing 71
  hyperparameters for training loop 83
  iterator, creating 77
  key features 68
  label lists, creating 75
  Matthews Correlation...

Table of Contents