
- 1,062 pages
- English
- ePUB
About this book
This tutorial text gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches, which are based on optimization techniques, together with the Bayesian inference approach, whose essence lies in the use of a hierarchy of probabilistic models. The book presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing, and computer science. Focusing on the physical reasoning behind the mathematics, all the various methods and techniques are explained in depth, supported by examples and problems, giving the student and researcher an invaluable resource for understanding and applying machine learning concepts.

The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, and statistical/Bayesian learning, as well as short courses on sparse modeling, deep learning, and probabilistic graphical models.

- All major classical techniques: mean/least-squares regression and filtering, Kalman filtering, stochastic approximation and online learning, Bayesian classification, decision trees, logistic regression, and boosting methods.
- The latest trends: sparsity, convex analysis and optimization, online distributed algorithms, learning in RKH spaces, Bayesian inference, graphical and hidden Markov models, particle filtering, deep learning, dictionary learning, and latent variable modeling.
- Case studies (protein folding prediction, optical character recognition, text authorship identification, fMRI data analysis, change point detection, hyperspectral image unmixing, target localization, channel equalization, and echo cancellation) show how the theory can be applied.
- MATLAB code for all the main algorithms is available on an accompanying website, enabling the reader to experiment with the code.
Table of contents
- Cover image
- Title page
- Table of Contents
- Copyright
- Preface
- Acknowledgments
- Notation
- Dedication
- Chapter 1: Introduction
- Chapter 2: Probability and Stochastic Processes
- Chapter 3: Learning in Parametric Modeling: Basic Concepts and Directions
- Chapter 4: Mean-Square Error Linear Estimation
- Chapter 5: Stochastic Gradient Descent: The LMS Algorithm and its Family
- Chapter 6: The Least-Squares Family
- Chapter 7: Classification: A Tour of the Classics
- Chapter 8: Parameter Learning: A Convex Analytic Path
- Chapter 9: Sparsity-Aware Learning: Concepts and Theoretical Foundations
- Chapter 10: Sparsity-Aware Learning: Algorithms and Applications
- Chapter 11: Learning in Reproducing Kernel Hilbert Spaces
- Chapter 12: Bayesian Learning: Inference and the EM Algorithm
- Chapter 13: Bayesian Learning: Approximate Inference and Nonparametric Models
- Chapter 14: Monte Carlo Methods
- Chapter 15: Probabilistic Graphical Models: Part I
- Chapter 16: Probabilistic Graphical Models: Part II
- Chapter 17: Particle Filtering
- Chapter 18: Neural Networks and Deep Learning
- Chapter 19: Dimensionality Reduction and Latent Variables Modeling
- Appendix A: Linear Algebra
- Appendix B: Probability Theory and Statistics
- Appendix C: Hints on Constrained Optimization
- Index