
- 352 pages
- English
- ePUB (mobile friendly)
- Available on iOS & Android
Ensemble Methods for Machine Learning
About this book
Ensemble machine learning combines the power of multiple machine learning approaches, working together to deliver models that are highly performant and highly accurate. Inside Ensemble Methods for Machine Learning you will find:
- Methods for classification, regression, and recommendations
- Sophisticated off-the-shelf ensemble implementations
- Random forests, boosting, and gradient boosting
- Feature engineering and ensemble diversity
- Interpretability and explainability for ensemble methods
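The core idea above, training a diverse group of models and aggregating their outputs, can be sketched with scikit-learn's `VotingClassifier`. This is not code from the book; the dataset and base-learner choices here are illustrative assumptions:

```python
# A minimal sketch of a voting ensemble: three diverse base learners
# whose class predictions are combined by majority vote.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Diversity matters: each learner makes different kinds of errors,
# so the majority vote can be more robust than any one of them.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=5000)),
        ("tree", DecisionTreeClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
    ],
    voting="hard",  # majority vote over predicted class labels
)
ensemble.fit(X_train, y_train)
print(f"test accuracy: {ensemble.score(X_test, y_test):.3f}")
```

Swapping `voting="hard"` for `voting="soft"` averages the learners' predicted probabilities instead of their labels, which is another common aggregation strategy.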
Ensemble machine learning trains a diverse group of machine learning models to work together, aggregating their outputs to deliver richer results than any single model. In Ensemble Methods for Machine Learning you'll discover core ensemble methods with proven track records in both data science competitions and real-world applications. Hands-on case studies show you how each algorithm works in production. By the time you're done, you'll know the benefits and limitations of ensemble machine learning, have practical methods for applying it to real-world data, and be ready to build more explainable ML systems.

About the Technology

Automatically compare, contrast, and blend the output from multiple models to squeeze the best results from your data. Ensemble machine learning applies a "wisdom of crowds" approach that dodges the inaccuracies and limitations of any single model. By basing responses on multiple perspectives, this innovative approach can deliver robust predictions even without massive datasets.

About the Book

Ensemble Methods for Machine Learning teaches you practical techniques for applying multiple ML approaches simultaneously. Each chapter contains a unique case study that demonstrates a fully functional ensemble method, with examples including medical diagnosis, sentiment analysis, handwriting classification, and more. There's no complex math or theory: you'll learn in a visuals-first manner, with ample code for easy experimentation!

What's Inside
- Bagging, boosting, and gradient boosting
- Methods for classification, regression, and retrieval
- Interpretability and explainability for ensemble methods
- Feature engineering and ensemble diversity
About the Reader

For Python programmers with machine learning experience.

About the Author

Gautam Kunapuli has over 15 years of experience in academia and the machine learning industry.

Table of Contents

PART 1 - THE BASICS OF ENSEMBLES
1 Ensemble methods: Hype or hallelujah?
PART 2 - ESSENTIAL ENSEMBLE METHODS
2 Homogeneous parallel ensembles: Bagging and random forests
3 Heterogeneous parallel ensembles: Combining strong learners
4 Sequential ensembles: Adaptive boosting
5 Sequential ensembles: Gradient boosting
6 Sequential ensembles: Newton boosting
PART 3 - ENSEMBLES IN THE WILD: ADAPTING ENSEMBLE METHODS TO YOUR DATA
7 Learning with continuous and count labels
8 Learning with categorical features
9 Explaining your ensembles
Table of contents
- inside front cover
- Ensemble Methods for Machine Learning
- Copyright
- dedication
- contents
- front matter
- Part 1 The basics of ensembles
- 1 Ensemble methods: Hype or hallelujah?
- Part 2 Essential ensemble methods
- 2 Homogeneous parallel ensembles: Bagging and random forests
- 3 Heterogeneous parallel ensembles: Combining strong learners
- 4 Sequential ensembles: Adaptive boosting
- 5 Sequential ensembles: Gradient boosting
- 6 Sequential ensembles: Newton boosting
- Part 3 Ensembles in the wild: Adapting ensemble methods to your data
- 7 Learning with continuous and count labels
- 8 Learning with categorical features
- 9 Explaining your ensembles
- epilogue
- index
- inside back cover