- Level: Expert
- Duration: 53 hours
- Offered by: Imperial College London

About
Welcome to this course on Probabilistic Deep Learning with TensorFlow! This course builds on the foundational concepts and skills for TensorFlow taught in the first two courses in this specialisation, and focuses on the probabilistic approach to deep learning. This is an increasingly important area of deep learning that aims to quantify the noise and uncertainty that is often present in real-world datasets. This is a crucial aspect when using deep learning models in applications such as autonomous vehicles or medical diagnoses; we need the model to know what it doesn't know.

You will learn how to develop probabilistic models with TensorFlow, making particular use of the TensorFlow Probability library, which is designed to make it easy to combine probabilistic models with deep learning. As such, this course can also be viewed as an introduction to the TensorFlow Probability library. You will learn how probability distributions can be represented and incorporated into deep learning models in TensorFlow, including Bayesian neural networks, normalising flows and variational autoencoders. You will learn how to develop models for uncertainty quantification, as well as generative models that can create new samples similar to those in the dataset, such as images of celebrity faces.

You will put the concepts you learn about into practice straight away in practical, hands-on coding tutorials, which you will be guided through by a graduate teaching assistant. In addition, there is a series of automatically graded programming assignments for you to consolidate your skills. At the end of the course, you will bring many of the concepts together in a Capstone Project, where you will develop a variational autoencoder algorithm to produce a generative model of a synthetic image dataset that you will create yourself.

This course follows on from the previous two courses in the specialisation, Getting Started with TensorFlow 2 and Customising Your Models with TensorFlow 2.
The additional prerequisite knowledge required in order to be successful in this course is a solid foundation in probability and statistics. In particular, it is assumed that you are familiar with standard probability distributions, probability density functions, and concepts such as maximum likelihood estimation, the change of variables formula for random variables, and the evidence lower bound (ELBO) used in variational inference.

Modules
Introduction to the course
1 Assignment
- [Knowledge check] Standard distributions
1 Discussion
- Introduce yourself
3 Videos
- Welcome to Probabilistic Deep Learning with TensorFlow 2
- Interview with Paige Bailey
- The TensorFlow Probability library
4 Readings
- About Imperial College & the team
- How to be successful in this course
- Grading policy
- Additional readings & helpful references
Univariate distributions
1 Lab
- [Coding tutorial] Univariate distributions
2 Videos
- Univariate distributions
- [Coding tutorial] Univariate distributions
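As a flavour of what this module covers: a univariate distribution object exposes a log-density. A minimal sketch in plain Python (TensorFlow Probability would use `tfd.Normal(loc, scale).log_prob(x)`; the formula below is the same quantity):

```python
import math

# Log-density of a univariate Normal -- what a Normal distribution's
# log_prob method computes, written out by hand.
def normal_log_prob(x, loc=0.0, scale=1.0):
    z = (x - loc) / scale
    return -0.5 * z * z - math.log(scale) - 0.5 * math.log(2.0 * math.pi)

lp = normal_log_prob(0.0)  # standard Normal log-density at its mean
```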
Multivariate distributions
2 Labs
- [Coding tutorial] Multivariate distributions
- [Reading] Multivariate Gaussian with full covariance
2 Videos
- Multivariate distributions
- [Coding tutorial] Multivariate distributions
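The full-covariance Gaussian from this module's reading can be sketched in NumPy (TFP's MultivariateNormal distributions compute the same log-density):

```python
import numpy as np

# Log-density of a multivariate Gaussian with a full covariance matrix.
def mvn_log_prob(x, mu, cov):
    k = len(mu)
    diff = x - mu
    _, logdet = np.linalg.slogdet(cov)        # log|cov|, computed stably
    maha = diff @ np.linalg.solve(cov, diff)  # squared Mahalanobis distance
    return -0.5 * (k * np.log(2.0 * np.pi) + logdet + maha)

mu = np.zeros(2)
cov = np.array([[1.0, 0.5],
                [0.5, 2.0]])
lp = mvn_log_prob(mu, mu, cov)  # the density is highest at the mean
```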
The Independent distribution
1 Lab
- [Coding tutorial] The Independent distribution
2 Videos
- The Independent distribution
- [Coding tutorial] The Independent distribution
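The key idea behind Independent, sketched without TFP: reinterpreting a batch of distributions as one joint distribution means the joint log-probability is just the sum of the per-component log-probabilities.

```python
import math

def normal_log_prob(x, loc, scale):
    z = (x - loc) / scale
    return -0.5 * z * z - math.log(scale) - 0.5 * math.log(2.0 * math.pi)

# Three independent Normals, treated as one 3-dimensional distribution:
x = [0.2, -1.0, 0.5]
locs = [0.0, 0.0, 1.0]
scales = [1.0, 2.0, 0.5]

per_dim = [normal_log_prob(xi, m, s) for xi, m, s in zip(x, locs, scales)]
joint = sum(per_dim)  # what an Independent-style joint log_prob returns
```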
Broadcasting rules
1 Lab
- [Reading] Broadcasting rules
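Distribution parameters in TFP broadcast against each other following the same rules as NumPy arrays, which the reading covers. A quick NumPy illustration:

```python
import numpy as np

# Trailing dimensions are matched right-to-left, and size-1 dimensions
# are stretched to fit.
locs = np.array([[0.0], [1.0], [2.0]])   # shape (3, 1)
scales = np.array([1.0, 2.0, 3.0, 4.0])  # shape (4,)

# (3, 1) against (4,) broadcasts to (3, 4) -- the same rule that would
# give a (3, 4) batch of distributions from these parameters.
grid = locs + scales
```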
Sampling and log probs
1 Lab
- [Coding tutorial] Sampling and log probs
2 Videos
- Sampling and log probs
- [Coding tutorial] Sampling and log probs
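Sampling and log-probability are the two core operations every distribution object provides. A stdlib-only sketch for an Exponential distribution, using inverse-CDF sampling (TFP handles both behind `sample` and `log_prob`):

```python
import math
import random

random.seed(0)
rate = 2.0

def sample_exponential(rate):
    u = random.random()               # u ~ Uniform(0, 1)
    return -math.log(1.0 - u) / rate  # inverse CDF: F^{-1}(u)

def exponential_log_prob(x, rate):
    return math.log(rate) - rate * x

samples = [sample_exponential(rate) for _ in range(10_000)]
sample_mean = sum(samples) / len(samples)  # approaches 1 / rate = 0.5
```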
Trainable distributions
1 Lab
- [Coding tutorial] Trainable distributions
2 Videos
- Trainable distributions
- [Coding tutorial] Trainable distributions
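A trainable distribution is one whose parameters are fitted by gradient descent on the negative log-likelihood. Here is the idea written out by hand on toy data (TFP would use trainable variables and automatic differentiation instead of these hand-derived gradients):

```python
import math

data = [1.8, 2.2, 2.0, 2.4, 1.6]  # toy data with mean 2.0
loc, log_scale = 0.0, 0.0         # train log(scale) so scale stays positive
lr = 0.1
n = len(data)

for _ in range(500):
    scale = math.exp(log_scale)
    # gradients of the average negative log-likelihood
    g_loc = -sum(x - loc for x in data) / scale ** 2
    g_log_scale = sum(1.0 - ((x - loc) / scale) ** 2 for x in data)
    loc -= lr * g_loc / n
    log_scale -= lr * g_log_scale / n

# loc and exp(log_scale) converge to the maximum likelihood estimates.
```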
Programming Assignment: Naive Bayes and logistic regression
1 Lab
- Naive Bayes and logistic regression
1 Video
- Wrap up and introduction to the programming assignment
Introduction to the week
1 Assignment
- Sources of uncertainty
2 Videos
- Welcome to week 2 - Probabilistic layers and Bayesian neural networks
- The need for uncertainty in deep learning models
Maximum likelihood estimation
1 Lab
- [Reading] Maximum likelihood estimation
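For a Normal distribution, the maximum likelihood estimates covered in this reading have a well-known closed form: the sample mean and the (biased) sample standard deviation.

```python
import math

data = [1.8, 2.2, 2.0, 2.4, 1.6]
n = len(data)

# Closed-form MLE for a Normal fitted to the data above.
mu_hat = sum(data) / n
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in data) / n)
```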
The DistributionLambda layer
1 Lab
- [Coding tutorial] The DistributionLambda layer
2 Videos
- The DistributionLambda layer
- [Coding tutorial] The DistributionLambda layer
Probabilistic layers
1 Lab
- [Coding tutorial] Probabilistic layers
2 Videos
- Probabilistic layers
- [Coding tutorial] Probabilistic layers
Bayes by backprop
1 Lab
- [Reading] Bayes by backprop
The DenseVariational layer
1 Lab
- [Coding tutorial] The DenseVariational layer
2 Videos
- The DenseVariational layer
- [Coding tutorial] The DenseVariational layer
Reparameterization layers
1 Lab
- [Coding tutorial] Reparameterization layers
2 Videos
- Reparameterization layers
- [Coding tutorial] Reparameterization layers
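The trick these layers rely on, in its simplest form: a sample from N(mu, sigma²) is rewritten as a deterministic function of the parameters plus independent noise, so gradients can flow through the sampling step. A stdlib-only sketch:

```python
import random

random.seed(1)

# Reparameterization: z = mu + sigma * eps with eps ~ N(0, 1), so z is a
# deterministic (and hence differentiable) function of mu and sigma.
def reparameterized_sample(mu, sigma):
    eps = random.gauss(0.0, 1.0)
    return mu + sigma * eps

mu, sigma = 2.0, 0.5
samples = [reparameterized_sample(mu, sigma) for _ in range(20_000)]
sample_mean = sum(samples) / len(samples)  # close to mu
```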
Programming Assignment: Bayesian convolutional neural network
1 Lab
- Bayesian convolutional neural network
1 Video
- Wrap up and introduction to the programming assignment
Introduction to the week
1 Assignment
- Change of variables formula
1 Lab
- [Reading] Change of variables formula
2 Videos
- Welcome to week 3 - Bijectors and normalising flows
- Interview with Doug Kelly
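The change of variables formula from this week's reading, checked numerically: if Y = g(X) with g invertible, then p_Y(y) = p_X(g⁻¹(y)) · |d g⁻¹/dy|. Taking g = exp and a standard Normal X gives a log-normal Y:

```python
import math

def normal_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def pushforward_pdf(y):
    x = math.log(y)           # g^{-1}(y)
    return normal_pdf(x) / y  # |d g^{-1}/dy| = 1 / y

# Sanity check: the transformed density still integrates to (almost) 1.
step = 0.01
total = sum(pushforward_pdf(0.001 + i * step) * step for i in range(2000))
```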
Bijectors
1 Lab
- [Coding tutorial] Bijectors
2 Videos
- Bijectors
- [Coding tutorial] Bijectors
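A bijector bundles three operations: a forward transformation, its inverse, and the log-determinant of the Jacobian. A toy plain-Python class mirroring that interface (the real `tfp.bijectors` objects work on tensors and compose):

```python
import math

# Toy bijector y = scale * x + shift, with the forward / inverse /
# log-det-Jacobian trio that bijector objects expose.
class ScaleShift:
    def __init__(self, scale, shift):
        self.scale, self.shift = scale, shift

    def forward(self, x):
        return self.scale * x + self.shift

    def inverse(self, y):
        return (y - self.shift) / self.scale

    def forward_log_det_jacobian(self, x):
        return math.log(abs(self.scale))  # constant for an affine map

b = ScaleShift(scale=2.0, shift=1.0)
y = b.forward(3.0)
x = b.inverse(y)  # round-trips back to the input
```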
Scale bijectors and LinearOperator
1 Lab
- [Reading] Scale bijectors and LinearOperator
The TransformedDistribution class
1 Lab
- [Coding tutorial] The TransformedDistribution class
2 Videos
- The TransformedDistribution class
- [Coding tutorial] The TransformedDistribution class
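TransformedDistribution pairs a base distribution with a bijector: the transformed log-density is the base log-density at the inverse image plus the inverse log-det-Jacobian. Written out for a standard Normal base and the affine bijector y = 2x + 1, which yields N(1, 2²):

```python
import math

def base_log_prob(x):  # standard Normal base distribution
    return -0.5 * x * x - 0.5 * math.log(2.0 * math.pi)

scale, shift = 2.0, 1.0

def transformed_log_prob(y):
    x = (y - shift) / scale  # bijector inverse
    ildj = -math.log(scale)  # inverse log-det-Jacobian
    return base_log_prob(x) + ildj

lp_mode = transformed_log_prob(1.0)  # log-density of N(1, 4) at its mean
```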
Subclassing bijectors
1 Lab
- [Coding tutorial] Subclassing bijectors
2 Videos
- Subclassing bijectors
- [Coding tutorial] Subclassing bijectors
Normalising flows
2 Labs
- [Reading] Autoregressive flows and RealNVP
- [Coding tutorial] Normalising flows
3 Videos
- Autoregressive flows
- RealNVP
- [Coding tutorial] Normalising flows
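The core building block of RealNVP is the affine coupling layer: half of the input passes through unchanged and parameterises an invertible scale-and-shift of the other half. A NumPy sketch, with `tanh` and a linear map standing in for the learned networks:

```python
import numpy as np

s = lambda h: np.tanh(h)  # "scale network" (stand-in for a trained net)
t = lambda h: 0.5 * h     # "shift network" (stand-in for a trained net)

def coupling_forward(x):
    d = len(x) // 2
    x1, x2 = x[:d], x[d:]
    y2 = x2 * np.exp(s(x1)) + t(x1)  # scale-and-shift conditioned on x1
    return np.concatenate([x1, y2])

def coupling_inverse(y):
    d = len(y) // 2
    y1, y2 = y[:d], y[d:]
    x2 = (y2 - t(y1)) * np.exp(-s(y1))
    return np.concatenate([y1, x2])

x = np.array([0.3, -1.2, 0.7, 2.0])
y = coupling_forward(x)
x_back = coupling_inverse(y)  # inverts exactly, whatever s and t are
```

Because x1 is left unchanged, the inverse never needs to invert s or t themselves, which is what makes the layer cheap to invert.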
Programming Assignment: RealNVP
1 Lab
- RealNVP
1 Video
- Wrap up and introduction to the programming assignment
Introduction to the week
1 Assignment
- Variational autoencoders
1 Lab
- [Reading] Variational autoencoders
1 Video
- Welcome to week 4 - Variational autoencoders
Encoders and decoders
1 Lab
- [Coding tutorial] Encoders and decoders
2 Videos
- Encoders and decoders
- [Coding tutorial] Encoders and decoders
Kullback-Leibler divergence
3 Labs
- [Reading] Kullback-Leibler divergence
- [Coding tutorial] Minimising KL divergence
- [Reading] Full covariance Gaussian approximation
2 Videos
- Minimising KL divergence
- [Coding tutorial] Minimising KL divergence
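For two univariate Gaussians the KL divergence minimised in this module has a closed form, which makes a convenient reference when checking a Monte Carlo estimate:

```python
import math

# Closed-form KL( N(mu1, s1^2) || N(mu2, s2^2) ).
def kl_normal(mu1, s1, mu2, s2):
    return (math.log(s2 / s1)
            + (s1 ** 2 + (mu1 - mu2) ** 2) / (2.0 * s2 ** 2)
            - 0.5)

kl_self = kl_normal(0.0, 1.0, 0.0, 1.0)  # a distribution to itself: 0
kl = kl_normal(1.0, 1.0, 0.0, 2.0)
kl_rev = kl_normal(0.0, 2.0, 1.0, 1.0)   # KL is not symmetric
```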
Maximising the ELBO
1 Lab
- [Coding tutorial] Maximising the ELBO
2 Videos
- Maximising the ELBO
- [Coding tutorial] Maximising the ELBO
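The ELBO lower-bounds the log evidence, with equality when the variational distribution equals the true posterior. On a toy discrete model this can be verified exactly:

```python
import math

# A tiny model: latent z in {0, 1}, uniform prior, and a likelihood
# p(x | z) for one observed x. Small enough to compute log p(x) exactly.
p_z = [0.5, 0.5]
p_x_given_z = [0.2, 0.9]

evidence = sum(pz * px for pz, px in zip(p_z, p_x_given_z))
log_evidence = math.log(evidence)

def elbo(q):
    # ELBO(q) = E_q[log p(x, z)] - E_q[log q(z)]
    return sum(
        qi * (math.log(p_z[i] * p_x_given_z[i]) - math.log(qi))
        for i, qi in enumerate(q) if qi > 0.0
    )

posterior = [pz * px / evidence for pz, px in zip(p_z, p_x_given_z)]
gap = log_evidence - elbo([0.5, 0.5])    # positive: bound is loose here
tight = log_evidence - elbo(posterior)   # zero: equality at the posterior
```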
KL divergence layers
1 Lab
- [Coding tutorial] KL divergence layers
2 Videos
- KL divergence layers
- [Coding tutorial] KL divergence layers
Programming Assignment: Variational autoencoder for Celeb-A
1 Lab
- Variational autoencoder for Celeb-A
1 Video
- Wrap up and introduction to the programming assignment
Probabilistic generative models
1 Peer Review
- Capstone Project
1 Lab
- Capstone Project
2 Videos
- Welcome to the Capstone Project
- Goodbye video
Auto Summary
Explore probabilistic deep learning with TensorFlow 2 in this expert-level course from Imperial College London, delivered on Coursera. Delve into models that quantify uncertainty in real-world datasets, essential for applications like autonomous vehicles and medical diagnoses. Learn to develop probabilistic models using TensorFlow Probability, including Bayesian neural networks, normalising flows, and variational autoencoders. Engage in hands-on coding tutorials and graded assignments, culminating in a Capstone Project. Ideal for learners with a solid foundation in probability and statistics.

Instructor: Dr Kevin Webster