- Level: Professional
- Duration: 22 hours
- Offered by: University of California, Santa Cruz
About
This course is for practicing and aspiring data scientists and statisticians. It is the fourth of a four-course sequence introducing the fundamentals of Bayesian statistics, and it builds on the earlier courses Bayesian Statistics: From Concept to Data Analysis, Techniques and Models, and Mixture Models. Time series analysis is concerned with modeling the dependence among elements of a sequence of temporally related variables. To succeed in this course, you should be familiar with calculus-based probability, the principles of maximum likelihood estimation, and Bayesian inference. You will learn how to build models that describe temporal dependence and how to perform Bayesian inference and forecasting with those models. You will apply what you learn using the open-source, freely available software R and sample datasets. Your instructor, Raquel Prado, will take you from basic concepts for modeling temporally dependent data to the implementation of specific classes of models.
Modules
Introduction
1 Assignment
- Objectives of the course
1 Video
- Welcome to Bayesian Statistics: Time Series
2 Readings
- Introduction to R
- List of References
Stationarity, the ACF and the PACF
1 Assignment
- Stationarity, the ACF and the PACF
3 Videos
- Stationarity
- The autocorrelation function (ACF)
- ACF, PACF, Differencing and Smoothing: Examples
4 Readings
- The partial autocorrelation function (PACF)
- Differencing and Smoothing
- R Code: Differencing and filtering via moving averages
- R Code: Simulate data from a white noise process
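The R code readings in this module cover differencing, filtering via moving averages, and simulating white noise. As a rough illustration of those steps (a minimal base-R sketch, not the course's own scripts):

```r
# Simulate white noise and inspect its sample ACF and PACF (base R only).
set.seed(42)
y <- rnorm(200)                      # 200 iid N(0, 1) draws

acf(y, main = "Sample ACF")          # near zero at all lags > 0 for white noise
pacf(y, main = "Sample PACF")

# Differencing and smoothing via a centered moving average
y_diff <- diff(y)                                   # first differences
y_ma   <- stats::filter(y, rep(1/5, 5), sides = 2)  # 5-point moving average
```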
The AR(1) process: Definition and properties
1 Assignment
- The AR(1) definitions and properties
2 Videos
- The AR(1)
- Simulating from an AR(1) process
2 Readings
- The PACF of the AR(1) process
- R Code: Sample data from AR(1) processes
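The videos in this module define the AR(1) process y_t = phi * y_{t-1} + e_t and show how to simulate from it. A minimal sketch of that simulation, with illustrative values of phi and the innovation variance (not the course's scripts):

```r
# Simulate an AR(1) process y_t = phi * y_{t-1} + e_t, e_t ~ N(0, v), two ways.
set.seed(1)
phi <- 0.9; v <- 1; n <- 300

# 1) Base R helper
y1 <- arima.sim(model = list(ar = phi), n = n, sd = sqrt(v))

# 2) Direct recursion, starting from the stationary distribution N(0, v / (1 - phi^2))
y2 <- numeric(n)
y2[1] <- rnorm(1, 0, sqrt(v / (1 - phi^2)))
for (t in 2:n) y2[t] <- phi * y2[t - 1] + rnorm(1, 0, sqrt(v))

acf(y2)   # the theoretical ACF of a stationary AR(1) decays as phi^h
```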
The AR(1): Maximum likelihood estimation and Bayesian inference
1 Assignment
- MLE and Bayesian inference in the AR(1)
1 Peer Review
- MLE and Bayesian inference in the AR(1)
3 Videos
- Maximum likelihood estimation in the AR(1)
- Bayesian inference in the AR(1)
- Bayesian inference in the AR(1): Conditional likelihood example
4 Readings
- Review of maximum likelihood and Bayesian inference in regression
- R Code: MLE for the AR(1), examples
- R Code: AR(1) Bayesian inference, conditional likelihood example
- Bayesian inference in the AR(1), full likelihood example
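This module develops maximum likelihood and Bayesian inference for the AR(1) under the conditional likelihood. A minimal base-R sketch of that analysis, assuming a reference prior p(phi, v) proportional to 1/v and conditioning on the first observation (illustrative, not the course's scripts):

```r
# Conditional-likelihood MLE and reference-prior Bayesian inference for an AR(1).
set.seed(2)
phi_true <- 0.8; n <- 500
y <- arima.sim(model = list(ar = phi_true), n = n)

# Conditional likelihood: regress y_t on y_{t-1} (no intercept)
yt   <- y[2:n]; ytm1 <- y[1:(n - 1)]
phi_hat <- sum(yt * ytm1) / sum(ytm1^2)      # conditional MLE of phi
S       <- sum((yt - phi_hat * ytm1)^2)      # residual sum of squares

# Reference prior: v | y ~ Inverse-Gamma((n-2)/2, S/2),
#                  phi | v, y ~ N(phi_hat, v / sum(ytm1^2))
M <- 5000
v_post   <- 1 / rgamma(M, (n - 2) / 2, S / 2)
phi_post <- rnorm(M, phi_hat, sqrt(v_post / sum(ytm1^2)))
quantile(phi_post, c(0.025, 0.5, 0.975))     # posterior summary for phi

# Full-likelihood MLE for comparison (base R)
arima(y, order = c(1, 0, 0), include.mean = FALSE, method = "ML")
```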
The general AR(p) process
1 Assignment
- Properties of AR processes
4 Videos
- Definition and state-space representation
- Examples
- ACF of the AR(p)
- Simulating data from an AR(p)
3 Readings
- R Code: Computing the roots of the AR polynomial
- R Code: Simulating data from an AR(p)
- The AR(p): Review
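The readings show how to assess an AR(p) through the roots of its characteristic polynomial and how to simulate AR(p) data. A small base-R sketch along those lines, with illustrative AR(2) coefficients:

```r
# Roots of the AR characteristic polynomial and simulation from an AR(2).
phi   <- c(0.5, 0.3)              # illustrative AR(2) coefficients
roots <- polyroot(c(1, -phi))     # roots of 1 - phi1*z - phi2*z^2
Mod(roots)                        # all moduli > 1  <=>  stationary process
1 / roots                         # reciprocal roots; complex pairs imply quasi-periodic behavior

set.seed(3)
y <- arima.sim(model = list(ar = phi), n = 500)
acf(y)
```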
Bayesian inference in the AR(p)
1 Assignment
- Spectral representation of the AR(p)
1 Peer Review
- Bayesian analysis of an EEG dataset using an AR(p)
5 Videos
- Bayesian inference in the AR(p): Reference prior, conditional likelihood
- Model order selection
- Example: Bayesian inference in the AR(p), conditional likelihood
- Spectral representation of the AR(p)
- Spectral representation of the AR(p): Example
5 Readings
- R Code: Maximum likelihood estimation, AR(p), conditional likelihood
- R Code: Bayesian inference, AR(p), conditional likelihood
- R Code: Model order selection
- R Code: Spectral density of AR(p)
- ARIMA processes
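As a rough companion to the model order selection and spectral representation material, the sketch below uses base R's AR fitting and AR-based spectrum estimate; note that it selects the order by AIC rather than the Bayesian criteria developed in this module:

```r
# AR order selection by AIC and the spectral density implied by the fitted AR (base R).
set.seed(4)
y <- arima.sim(model = list(ar = c(0.6, -0.4)), n = 1000)

fit <- ar(y, order.max = 10, aic = TRUE, method = "mle")   # picks the order minimizing AIC
fit$order                                                   # selected order p
fit$aic                                                     # AIC differences across candidate orders

spec.ar(y, order = fit$order, main = "AR-based spectral density estimate")
```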
The Normal Dynamic Linear Model: Definition, model classes, and the superposition principle
1 Assignment
- The Normal Dynamic Linear Model
4 Videos
- NDLM: Definition
- Polynomial trend models
- Regression models
- The superposition principle
2 Readings
- Summary of polynomial trend and regression models
- Superposition principle: General case
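To illustrate the superposition principle in code, the following sketch uses the contributed dlm package (introduced later in the course) to combine a second-order polynomial trend with a seasonal component; the variance values are placeholders:

```r
# Superposition of NDLM components with the 'dlm' package (illustrative values).
# install.packages("dlm")  # if not already installed
library(dlm)

trend  <- dlmModPoly(order = 2, dV = 1, dW = c(0.1, 0.01))  # linear growth model
season <- dlmModSeas(frequency = 12, dV = 0)                # monthly seasonal effects

mod <- trend + season   # '+' combines the components into a single NDLM
FF(mod)                 # combined observation matrix
GG(mod)                 # block-diagonal evolution matrix
```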
Bayesian inference in the NDLM: Part I
1 Peer Review
- NDLM: sensitivity to the model parameters
6 Videos
- Filtering
- Filtering in the NDLM: Example
- Smoothing and forecasting
- Smoothing in the NDLM: Example
- Second order polynomial: Filtering and smoothing example
- Using the dlm package in R
5 Readings
- Summary of the filtering distributions
- R Code: Filtering in the NDLM: Example
- Summary of the smoothing and forecasting distributions
- R Code: Smoothing in the NDLM, Example
- R Code: Using the dlm package in R
1 Assignment
- NDLM, Part I: Review
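The filtering, smoothing, and forecasting material in this module is worked through in R with the dlm package. A minimal sketch of that workflow for a first-order polynomial model applied to R's built-in Nile series, with illustrative variance values:

```r
# Kalman filtering, smoothing, and forecasting for a local level NDLM ('dlm' package).
library(dlm)

mod  <- dlmModPoly(order = 1, dV = 15100, dW = 1470)  # illustrative variances
filt <- dlmFilter(Nile, mod)                  # filtering distributions (m_t, C_t)
smo  <- dlmSmooth(filt)                       # smoothing distributions given all the data
fc   <- dlmForecast(filt, nAhead = 5)         # 5-step-ahead forecasts

plot(Nile, col = "grey")
lines(dropFirst(filt$m), col = "blue")        # filtered level
lines(dropFirst(smo$s),  col = "red")         # smoothed level
```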
Seasonal NDLMs
1 Assignment
- Seasonal Models and Superposition
2 Videos
- Fourier representation
- Building NDLMs with multiple components: Examples
2 Readings
- Fourier Representation: Example 1
- Summary: DLM Fourier representation
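A small sketch of the Fourier representation of a seasonal NDLM, again using the dlm package with placeholder variances (period 12, two harmonics, combined with a trend by superposition):

```r
# Fourier-form seasonal NDLM combined with a polynomial trend ('dlm' package).
library(dlm)

season <- dlmModTrig(s = 12, q = 2, dV = 0, dW = 0.1)    # first 2 harmonics of period 12
trend  <- dlmModPoly(order = 2, dV = 100, dW = c(1, 0.1))

mod <- trend + season
GG(mod)   # evolution matrix: trend block plus 2x2 harmonic rotation blocks
```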
Bayesian inference in the NDLM: Part II
1 Peer Review
- NDLM data analysis
3 Videos
- Filtering, Smoothing and Forecasting: Unknown observational variance
- Specifying the system covariance matrix via discount factors
- NDLM, Unknown Observational Variance: Example
2 Readings
- Summary of Filtering, Smoothing and Forecasting Distributions, NDLM unknown observational variance
- R Code: NDLM, Unknown Observational Variance Example
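The sketch below illustrates the kind of recursions covered here, in the simplest setting: a first-order polynomial NDLM with unknown constant observational variance and the system variance specified through a discount factor. It is written from the standard filtering equations as an assumption-laden illustration, not the course's code:

```r
# Filtering for a local level NDLM with unknown observational variance
# and a discount factor 'delta' on the system variance.
set.seed(5)
y <- cumsum(rnorm(100, 0, 0.5)) + rnorm(100)   # synthetic local-level data

delta <- 0.95                 # discount factor for the level
m <- 0; C <- 10               # prior mean and scale for the level
n <- 1; s <- 1                # prior degrees of freedom and estimate of the obs. variance

for (t in seq_along(y)) {
  a <- m;          R <- C / delta         # prior at time t (discounted)
  f <- a;          q <- R + s             # one-step forecast mean and scale
  e <- y[t] - f;   A <- R / q             # forecast error and adaptive coefficient
  m <- a + A * e                          # posterior mean of the level
  n <- n + 1
  s_new <- s + (s / n) * (e^2 / q - 1)    # updated estimate of the obs. variance
  C <- (s_new / s) * (R - A^2 * q)        # posterior scale of the level
  s <- s_new
}
c(level = m, obs_var = s)
```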
Case studies
2 Videos
- EEG data
- Google trends
1 Assignment
- NDLM, Part II
Data Analysis Project
1 Peer Review
- Data Analysis Project
Instructor
Raquel Prado