- Level: Professional
- Duration: 22 hours
- Course by: University of California, Santa Cruz
About
Bayesian Statistics: Mixture Models introduces you to an important class of statistical models. The course is organized in five modules, each of which contains lecture videos, short quizzes, background reading, discussion prompts, and one or more peer-reviewed assignments. Statistics is best learned by doing, not just by watching videos, so the course is structured to help you learn through application. Some exercises require R, a freely available statistical software package. A brief tutorial is provided, but we encourage you to take advantage of the many other online resources for learning R. This is an intermediate-level course, designed to be the third in UC Santa Cruz's series on Bayesian statistics, after Herbie Lee's "Bayesian Statistics: From Concept to Data Analysis" and Matthew Heiner's "Bayesian Statistics: Techniques and Models." To succeed in the course, you should have some knowledge of, and comfort with, calculus-based probability, the principles of maximum-likelihood estimation, and Bayesian estimation.
Modules
Introduction
1 Video
- Welcome to Bayesian Statistics: Mixture Models
The R Environment for Statistical Computing
1 Video
- Installing and using R
1 Reading
- An Introduction to R
Definition of mixture models
4 Assignments
- Basic definitions
- Mixtures of Gaussians
- Zero-inflated distributions
- Definition of Mixture Models
1 Discussion
- When are mixture models helpful?
3 Videos
- Basic definitions
- Mixtures of Gaussians
- Zero-inflated mixtures
5 Readings
- Example of a bimodal mixture of Gaussians
- Example of a unimodal and skewed mixture of Gaussians
- Example of a unimodal, symmetric and heavy tailed mixture of Gaussians
- Example of a zero-inflated negative binomial distribution
- Example of a zero-inflated log Gaussian distribution
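The readings above walk through bimodal, skewed, and heavy-tailed mixtures of Gaussians. As a minimal sketch of the basic definition (written in Python for illustration; the course's own examples use R, and the function names here are my own), a finite mixture density is simply a weighted sum of component densities:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def mixture_pdf(x, weights, mus, sigmas):
    """Density of a finite Gaussian mixture: sum_k w_k * N(x; mu_k, sigma_k^2)."""
    return sum(w * normal_pdf(x, m, s)
               for w, m, s in zip(weights, mus, sigmas))

# Equal weights and well-separated means give a bimodal density,
# as in the first reading above.
val = mixture_pdf(0.0, [0.5, 0.5], [-2.0, 2.0], [1.0, 1.0])
```

Shrinking the separation between the means (or making the weights unequal and the variances different) produces the unimodal skewed and heavy-tailed shapes covered in the other readings.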
Likelihood function for mixture models
3 Assignments
- The likelihood function
- Identifiability
- Likelihood function for mixture models
4 Videos
- Hierarchical representations
- Sampling from a mixture model
- The likelihood function
- Parameter identifiability
1 Reading
- Sample code for simulating from a Mixture Model
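The hierarchical representation covered in this module suggests a direct simulation scheme: draw a component indicator first, then draw from that component. A rough Python sketch of the idea (the course's sample code is in R; names here are illustrative):

```python
import random

def sample_mixture(n, weights, mus, sigmas, seed=0):
    """Simulate n draws from a Gaussian mixture via its hierarchical
    representation: c ~ Categorical(weights), then x | c ~ N(mu_c, sigma_c^2)."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        # Latent component indicator.
        c = rng.choices(range(len(weights)), weights=weights)[0]
        # Observation drawn from the selected component.
        draws.append(rng.gauss(mus[c], sigmas[c]))
    return draws
```

The same two-stage recipe works for any mixture, including the zero-inflated models from the previous module, by swapping in the appropriate component samplers.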
Advanced simulation and likelihood function
2 Peer Reviews
- Simulating from a Mixture Model
- Likelihood function for mixture models
The EM algorithm for Mixture Models
1 Discussion
- Mixtures of log-Gaussians
4 Videos
- EM for general mixtures
- EM for location mixtures of Gaussians
- EM example 1
- EM example 2
2 Readings
- Sample code for EM example 1
- Sample code for EM example 2
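To make the E-step/M-step cycle concrete, here is a stripped-down sketch of EM for a two-component location mixture of Gaussians (in Python rather than the course's R, with the weights and variance held fixed for simplicity, so only the locations are updated; this is my own illustrative simplification, not the course's code):

```python
import math

def em_location_mixture(data, mu_init, n_iter=50, sigma=1.0, w=(0.5, 0.5)):
    """EM for w1*N(mu1, sigma^2) + w2*N(mu2, sigma^2) with fixed w and sigma."""
    mu1, mu2 = mu_init
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each observation.
        r1 = []
        for x in data:
            a = w[0] * math.exp(-0.5 * ((x - mu1) / sigma) ** 2)
            b = w[1] * math.exp(-0.5 * ((x - mu2) / sigma) ** 2)
            r1.append(a / (a + b))
        # M-step: each location becomes a responsibility-weighted mean.
        s1 = sum(r1)
        mu1 = sum(r * x for r, x in zip(r1, data)) / s1
        mu2 = sum((1 - r) * x for r, x in zip(r1, data)) / (len(data) - s1)
    return mu1, mu2
```

The full algorithm in the lectures also updates the weights and variances in the M-step, but the alternating structure is exactly the one shown here.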
Advanced EM algorithm
2 Peer Reviews
- The EM algorithm for zero-inflated mixtures
- The EM algorithm for Mixture Models
Markov chain Monte Carlo algorithms for Mixture Models
6 Videos
- Markov Chain Monte Carlo algorithms part 1
- Markov Chain Monte Carlo algorithms, part 2
- MCMC for location mixtures of normals Part 1
- MCMC for location mixtures of normals Part 2
- MCMC Example 1
- MCMC Example 2
2 Readings
- Sample code for MCMC example 1
- Sample code for MCMC example 2
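A minimal sketch of the MCMC approach for a location mixture of normals, written in Python for illustration (the course's sample code is in R; I fix the weights at 1/2 and the variance at sigma^2, and place conjugate normal priors on the means). A Gibbs sampler alternates between sampling the component indicators given the means and sampling each mean given the current allocation:

```python
import math
import random

def gibbs_location_mixture(data, n_iter=500, sigma=1.0, prior_sd=10.0, seed=0):
    """Gibbs sampler for 0.5*N(mu1, sigma^2) + 0.5*N(mu2, sigma^2)
    with N(0, prior_sd^2) priors on mu1 and mu2."""
    rng = random.Random(seed)
    mu = [min(data), max(data)]  # crude initialisation
    keep = []
    for it in range(n_iter):
        # Step 1: sample each indicator given the current means.
        z = []
        for x in data:
            p1 = math.exp(-0.5 * ((x - mu[0]) / sigma) ** 2)
            p2 = math.exp(-0.5 * ((x - mu[1]) / sigma) ** 2)
            z.append(0 if rng.random() < p1 / (p1 + p2) else 1)
        # Step 2: sample each mean from its conjugate normal full conditional.
        for k in (0, 1):
            xs = [x for x, zi in zip(data, z) if zi == k]
            prec = 1.0 / prior_sd ** 2 + len(xs) / sigma ** 2
            mean = (sum(xs) / sigma ** 2) / prec
            mu[k] = rng.gauss(mean, math.sqrt(1.0 / prec))
        if it >= n_iter // 2:  # discard the first half as burn-in
            keep.append(tuple(mu))
    return keep
```

Unlike EM, which returns a point estimate, the retained draws approximate the full posterior over the component means; the lectures extend this scheme to unknown weights and variances.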
Advanced MCMC
2 Peer Reviews
- The MCMC algorithm for zero-inflated mixtures
- Markov chain Monte Carlo algorithms for Mixture Models
Density estimation
2 Videos
- Density estimation using Mixture Models
- Density Estimation Example
1 Reading
- Sample code for density estimation problems
Clustering
2 Videos
- Mixture Models for Clustering
- Clustering example
1 Reading
- Sample EM algorithm for clustering problems
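Once a mixture has been fitted, clustering reduces to computing each observation's posterior component probabilities (the responsibilities) and assigning the point to the most probable component. A small Python illustration of that step, assuming a fitted Gaussian mixture (the course works in R; the function name is mine):

```python
import math

def cluster_assignments(data, weights, mus, sigmas):
    """Hard clustering from a fitted Gaussian mixture: assign each point to
    the component with the highest posterior probability (responsibility)."""
    labels = []
    for x in data:
        # Unnormalised posterior weight of each component for observation x.
        post = [w * math.exp(-0.5 * ((x - m) / s) ** 2) / s
                for w, m, s in zip(weights, mus, sigmas)]
        labels.append(max(range(len(post)), key=lambda k: post[k]))
    return labels
```

Keeping the full responsibility vector instead of the argmax gives the "soft" clustering discussed in the lectures.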
Classification
3 Videos
- Mixture Models and naive Bayes classifiers
- Linear and quadratic discriminant analysis in the context of Mixture Models
- Classification example
1 Reading
- Sample EM algorithm for classification problems
Advanced Density Estimation and Classification
3 Peer Reviews
- The EM algorithm and density estimation
- MCMC algorithms and density estimation
- Classification
Computational considerations for Mixture Models
1 Assignment
- Computational considerations for Mixture Models
2 Videos
- Numerical stability
- Computational issues associated with multimodality
3 Readings
- Sample code to illustrate numerical stability issues
- Sample code to illustrate multimodality issues 1
- Sample code to illustrate multimodality issues 2
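The numerical-stability issue this module addresses arises because mixture likelihoods multiply many small component densities, which underflow to zero in floating point. The standard remedy is to work on the log scale with the log-sum-exp trick, sketched here in Python (the course's code is in R):

```python
import math

def log_mixture_density(x, log_weights, mus, sigmas):
    """Numerically stable log-density of a Gaussian mixture: compute each
    component's log term, subtract the max before exponentiating, so that
    observations far from every component do not underflow to log(0)."""
    terms = [lw - 0.5 * ((x - m) / s) ** 2 - math.log(s * math.sqrt(2.0 * math.pi))
             for lw, m, s in zip(log_weights, mus, sigmas)]
    t_max = max(terms)
    return t_max + math.log(sum(math.exp(t - t_max) for t in terms))
```

A naive `log(sum(w_k * pdf_k(x)))` returns `-inf` for extreme `x`; the shifted version stays finite because the largest exponent is factored out before summation.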
Determining the number of components in a Mixture Model
3 Assignments
- Bayesian Information Criteria (BIC)
- Estimating the number of components in Bayesian settings
- Estimating the partition structure in Bayesian models
1 Discussion
- Simplifying Binder's expected loss function
5 Videos
- Bayesian Information Criteria (BIC)
- Bayesian Information Criteria Example
- Estimating the number of components in Bayesian settings
- Estimating the full partition structure in Bayesian settings
- Example: Bayesian inference for the partition structure
2 Readings
- Sample code: Bayesian Information Criteria
- Sample code for estimating the number of components and the partition structure in Bayesian models
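The BIC criterion in this module scores each candidate number of components by penalising the maximised log-likelihood with a complexity term; lower BIC is better. A tiny Python helper illustrating the formula (illustrative only; for a K-component univariate Gaussian mixture the parameter count is (K-1) weights + K means + K variances = 3K - 1):

```python
import math

def bic(log_likelihood, n_params, n_obs):
    """Bayesian Information Criterion: -2*loglik + n_params*log(n_obs).
    Fit mixtures with K = 1, 2, 3, ... components and keep the K
    that minimises this score."""
    return -2.0 * log_likelihood + n_params * math.log(n_obs)
```

When two fits achieve the same log-likelihood, the penalty term makes BIC prefer the model with fewer components, which is what guards against overfitting the number of components.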
Advanced BIC
1 Peer Review
- BIC for zero-inflated mixtures
Auto Summary
Bayesian Statistics: Mixture Models is an intermediate course in Data Science & AI offered on Coursera and taught by UC Santa Cruz. Designed for professionals, it covers mixture models through five modules of lectures, quizzes, readings, and peer-reviewed assignments. Learners apply concepts in R, building skills in Bayesian estimation and calculus-based probability over roughly 22 hours of material. Ideal for those with foundational knowledge of Bayesian statistics, the course is the third in a series and is available with a Starter subscription.

Instructor: Abel Rodriguez