
Transfer Learning for Natural Language Processing

- 272 pages
- English
- ePUB (mobile friendly)
- Available on iOS & Android

About this book
Build custom NLP models in record time by adapting pretrained machine learning models to solve specialized problems.

Summary

In Transfer Learning for Natural Language Processing you will learn:

- Fine-tuning pretrained models with new domain data
- Picking the right model to reduce resource usage
- Transfer learning for neural network architectures
- Generating text with generative pretrained transformers
- Cross-lingual transfer learning with BERT
- Foundations for exploring NLP academic literature

Training deep learning NLP models from scratch is costly, time-consuming, and requires massive amounts of data. In Transfer Learning for Natural Language Processing, DARPA researcher Paul Azunre reveals cutting-edge transfer learning techniques that apply customizable pretrained models to your own NLP architectures. You'll learn how to use transfer learning to deliver state-of-the-art results for language comprehension, even when working with limited labeled data. Best of all, you'll save on training time and computational costs. Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.

About the technology
Build custom NLP models in record time, even with limited datasets! Transfer learning is a machine learning technique for adapting pretrained machine learning models to solve specialized problems. This powerful approach has revolutionized natural language processing, driving improvements in machine translation, business analytics, and natural language generation.

About the book
Transfer Learning for Natural Language Processing teaches you to create powerful NLP solutions quickly by building on existing pretrained models. This instantly useful book provides crystal-clear explanations of the concepts you need to grok transfer learning, along with hands-on examples so you can practice your new skills immediately. As you go, you'll apply state-of-the-art transfer learning methods to create a spam email classifier, a fact checker, and other real-world applications.

What's inside

- Fine-tuning pretrained models with new domain data
- Picking the right model to reduce resource use
- Transfer learning for neural network architectures
- Generating text with pretrained transformers

About the reader
For machine learning engineers and data scientists with some experience in NLP.

About the author
Paul Azunre holds a PhD in Computer Science from MIT and has served as a Principal Investigator on several DARPA research programs.

Table of Contents
PART 1 INTRODUCTION AND OVERVIEW
1 What is transfer learning?
2 Getting started with baselines: Data preprocessing
3 Getting started with baselines: Benchmarking and optimization
PART 2 SHALLOW TRANSFER LEARNING AND DEEP TRANSFER LEARNING WITH RECURRENT NEURAL NETWORKS (RNNS)
4 Shallow transfer learning for NLP
5 Preprocessing data for recurrent neural network deep transfer learning experiments
6 Deep transfer learning for NLP with recurrent neural networks
PART 3 DEEP TRANSFER LEARNING WITH TRANSFORMERS AND ADAPTATION STRATEGIES
7 Deep transfer learning for NLP with the transformer and GPT
8 Deep transfer learning for NLP with BERT and multilingual BERT
9 ULMFiT and knowledge distillation adaptation strategies
10 ALBERT, adapters, and multitask adaptation strategies
11 Conclusions
appendix A Kaggle primer
appendix B Introduction to fundamental deep learning tools