Bayesian Analysis with Python
eBook - ePub

Osvaldo Martin

  1. 282 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS and Android

Unleash the power and flexibility of the Bayesian framework

About This Book

  • Simplify the Bayesian approach to solving complex statistical problems using Python
  • A tutorial guide that takes you through the journey of Bayesian analysis with the help of sample problems and practice exercises
  • Learn how and when to use Bayesian analysis in your applications with this guide

Who This Book Is For

Students, researchers, and data scientists who wish to learn Bayesian data analysis with Python and implement probabilistic models in their day-to-day projects. Programming experience with Python is essential. No previous statistical knowledge is assumed.

What You Will Learn

  • Understand the essential Bayesian concepts from a practical point of view
  • Learn how to build probabilistic models using the Python library PyMC3
  • Acquire the skills to sanity-check your models and modify them if necessary
  • Add structure to your models and get the advantages of hierarchical models
  • Find out how different models can be used to answer different data analysis questions
  • When in doubt, learn to choose between alternative models
  • Predict continuous target outcomes using regression analysis or assign classes using logistic and softmax regression (a minimal sketch follows this list)
  • Learn how to think probabilistically and unleash the power and flexibility of the Bayesian framework
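
As a taste of the regression material mentioned above, the following is a minimal PyMC3 sketch of a Bayesian simple linear regression. The synthetic data, the prior choices, and the number of posterior draws are illustrative assumptions, not listings taken from the book.

    import numpy as np
    import pymc3 as pm

    # Synthetic data, invented purely for illustration
    np.random.seed(42)
    x = np.random.normal(size=100)
    y = 2.5 * x + 0.5 + np.random.normal(scale=1.0, size=100)

    with pm.Model() as linear_model:
        # Weakly informative priors for the intercept, slope, and noise scale
        alpha = pm.Normal('alpha', mu=0, sd=10)
        beta = pm.Normal('beta', mu=0, sd=10)
        epsilon = pm.HalfCauchy('epsilon', beta=5)

        # Linear predictor and Gaussian likelihood
        mu = alpha + beta * x
        y_obs = pm.Normal('y_obs', mu=mu, sd=epsilon, observed=y)

        # Draw posterior samples with PyMC3's default sampler (NUTS)
        trace = pm.sample(1000)

    # Posterior summary for the intercept, slope, and noise scale
    print(pm.summary(trace))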

In Detail

The purpose of this book is to teach the main concepts of Bayesian data analysis. We will learn how to effectively use PyMC3, a Python library for probabilistic programming, to perform Bayesian parameter estimation and to check and validate models. This book begins by presenting the key concepts of the Bayesian framework and the main advantages of this approach from a practical point of view. Moving on, we will explore the power and flexibility of generalized linear models and how to adapt them to a wide array of problems, including regression and classification. We will also look into mixture models and clustering data, and we will finish with advanced topics such as non-parametric models and Gaussian processes. With the help of Python and PyMC3, you will learn to implement, check, and expand Bayesian models to solve data analysis problems.
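
To make this workflow concrete, here is a minimal sketch of the kind of model the book opens with, the coin-flipping (Beta-Bernoulli) problem, written with PyMC3. The simulated data and the number of posterior draws are assumptions made purely for illustration.

    import numpy as np
    import pymc3 as pm

    # Simulated coin flips, invented purely for illustration
    np.random.seed(123)
    data = np.random.binomial(n=1, p=0.35, size=100)

    with pm.Model() as coin_model:
        # Beta(1, 1) prior over the probability of heads (uniform on [0, 1])
        theta = pm.Beta('theta', alpha=1.0, beta=1.0)

        # Bernoulli likelihood for the observed flips
        y = pm.Bernoulli('y', p=theta, observed=data)

        # "Push the inference button": draw posterior samples
        trace = pm.sample(1000)

    # Posterior summary for theta
    print(pm.summary(trace))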

Style and approach

Bayesian methods are widely used in statistics, machine learning, artificial intelligence, and data mining. This is a practical guide that enables readers to use Bayesian methods for statistical modelling and analysis with Python.

Information

Year
2016
ISBN
9781785883804

Bayesian Analysis with Python


Table of Contents

Bayesian Analysis with Python
Credits
About the Author
About the Reviewer
www.PacktPub.com
eBooks, discount offers, and more
Why subscribe?
Preface
What this book covers
What you need for this book
Who this book is for
Conventions
Reader feedback
Customer support
Downloading the example code
Downloading the color images of this book
Errata
Piracy
Questions
1. Thinking Probabilistically – A Bayesian Inference Primer
Statistics as a form of modeling
Exploratory data analysis
Inferential statistics
Probabilities and uncertainty
Probability distributions
Bayes' theorem and statistical inference
Single parameter inference
The coin-flipping problem
The general model
Choosing the likelihood
Choosing the prior
Getting the posterior
Computing and plotting the posterior
Influence of the prior and how to choose one
Communicating a Bayesian analysis
Model notation and visualization
Summarizing the posterior
Highest posterior density
Posterior predictive checks
Installing the necessary Python packages
Summary
Exercises
2. Programming Probabilistically – A PyMC3 Primer
Probabilistic programming
Inference engines
Non-Markovian methods
Grid computing
Quadratic method
Variational methods
Markovian methods
Monte Carlo
Markov chain
Metropolis-Hastings
Hamiltonian Monte Carlo/NUTS
Other MCMC methods
PyMC3 introduction
Coin-flipping, the computational approach
Model specification
Pushing the inference button
Diagnosing the sampling process
Convergence
Autocorrelation
Effective size
Summarizing the posterior
Posterior-based decisions
ROPE
Loss functions
Summary
Keep reading
Exercises
3. Juggling with Multi-Parametric and Hierarchical Models
Nuisance parameters and marginalized distributions
Gaussians, Gaussians, Gaussians everywhere
Gaussian inferences
Robust inferences
Student's t-distribution
Comparing groups
The tips dataset
Cohen's d
Probability of superiority
Hierarchical models
Shrinkage
Summary
Keep reading
Exercises
4. Understanding and Predicting Data with Linear Regression Models
Simple linear regression
The machine learning connection
The core of linear regression models
Linear models and high autocorrelation
Modifying the data before running
Changing the sampling method
Interpreting and visualizing the posterior
Pearson correlation coefficient
Pearson coefficient from a multivariate Gaussian
Robust linear regression
Hierarchical linear regression
Correlation, causation, and the messiness of life
Polynomial regression
Interpreting the parameters of a polynomial regression
Polynomial regression – the ultimate model?
Multiple linear regression
Confounding variables and redundant variables
Multicollinearity or when the correlation is too high
Masking effect variables
Adding interactions
The GLM module
Summary
Keep reading
Exercises
5. Classifying Outcomes with Logistic Regression
Logistic regression
The logistic model
The iris dataset
The logistic model applied to the iris dataset
Making predictions
Multiple logistic regression
The boundary decision
Implementing the model
Dealing with correlated variables
Dealing with unbalanced classes
How do we solve this problem?
Interpreting the coefficients of a logistic regression
Generalized linear models
Softmax regression or multinomial logistic regression
Discriminative and generative models
Summary
Keep reading
Exercises
6. Model Comparison
Occam's razor – simplicity and accuracy
Too many parameters leads to overfitting
Too few parameters leads to underfitting
The balance between simplicity and accuracy
Regularizing priors
Regularizing priors and hierarchical models
Predictive accuracy measures
Cross-validation
Information criteria
The log-likelihood and the deviance
Akaike information criterion
Deviance information criterion
Widely available information criterion
Pareto smoothed importance sampling leave-one-out cross-validation
Bayesian information criterion
Computing information criteria with PyMC3
A note on the reliability of WAIC and LOO computations
Interpreting and using information criteria measures
Posterior predictive checks
Bayes factors
Analogy with information criteria
Computing Bayes factors
Common problems computing Bayes factors
Bayes factors and information criteria
Summary
Keep reading
Exercises
7. Mixture Models
Mixture models
How to build mixture models
Marginalized Gaussian mixture model
Mixture models and count data
The Poisson distribution
The Zero-Inflated Poisson model
Poisson regression and ZIP regression
Robust logistic regression
Model-based clustering
Fixed component clustering
Non-fixed component clustering
Continuous mixtures
Beta-binomial and negative binomial
The Student's t-distribution
Summary
Keep reading
Exercises
8. Gaussian Processes
Non-parametric statistics
Kernel-based models
The Gaussian kernel
Kernelized linear regression
Overfitting and priors
Gaussian process...
