- Level: Foundation
- Duration: 26 hours
- Course by DeepLearning.AI
About
After completing this course, learners will be able to:
- Analytically optimize different types of functions commonly used in machine learning using properties of derivatives and gradients
- Approximately optimize different types of functions commonly used in machine learning using first-order (gradient descent) and second-order (Newton's method) iterative methods
- Visually interpret differentiation of different types of functions commonly used in machine learning
- Perform gradient descent in neural networks with different activation and cost functions

Mathematics for Machine Learning and Data Science is a foundational online program created by DeepLearning.AI and taught by Luis Serrano. In this beginner-friendly program you'll master the fundamental mathematics toolkit of machine learning. Many machine learning engineers and data scientists struggle with mathematics, and even experienced practitioners can feel held back by a lack of math skills. This Specialization uses innovative pedagogy to help you learn quickly and intuitively, with courses that use easy-to-follow plugins and visualizations to show how the math behind machine learning actually works. Upon completion, you'll understand the mathematics behind the most common algorithms and data analysis techniques, plus the know-how to incorporate them into your machine learning career. The recommended background is at least high school mathematics. Basic familiarity with Python is also recommended, as the labs use Python and Jupyter Notebooks to demonstrate the learning objectives in the environment where they are most applicable to machine learning and data science.

Modules
Lesson 1 - Derivatives
1 Assignment
- Derivatives
21 Videos
- Course Introduction
- A note on programming experience
- Machine Learning Motivation
- Motivation to Derivatives - Part I
- Derivatives and Tangents
- Slopes, maxima and minima
- Derivatives and their notation
- Some common derivatives - Lines
- Some common Derivatives - Quadratics
- Some common derivatives - Higher degree polynomials
- Some common derivatives - Other power functions
- The inverse function and its derivative
- Derivative of trigonometric functions
- Meaning of the Exponential (e)
- The derivative of e^x
- The derivative of log(x)
- Existence of the derivative
- Properties of the derivative: Multiplication by scalars
- Properties of the derivative: The sum rule
- Properties of the derivative: The product rule
- Properties of the derivative: The chain rule
3 Readings
- Learning Python: Recommended Resources
- [IMPORTANT] Have questions, issues or ideas? Join our Forum!
- Approximation of Derivatives
1 Ungraded Lab
- Differentiation in Python: Symbolic, Numerical and Automatic
1 Reading
- (Optional) Downloading your Notebook and Refreshing your Workspace
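For orientation, here is a minimal sketch of the three differentiation approaches named in the "Differentiation in Python: Symbolic, Numerical and Automatic" lab listed above. The example function, the step size, and the use of SymPy and JAX are illustrative assumptions, not taken from the course materials.

```python
# Three ways to differentiate f(x) = x**2 + sin(x) at x = 1.0.
# The example function is arbitrary; the course lab may use different ones.

import numpy as np
import sympy as sp
import jax.numpy as jnp
from jax import grad

# 1. Symbolic differentiation with SymPy: an exact formula for f'(x).
x = sp.symbols("x")
f_sym = x**2 + sp.sin(x)
df_sym = sp.diff(f_sym, x)              # 2*x + cos(x)
print(df_sym.subs(x, 1.0))

# 2. Numerical differentiation: central finite difference.
def f(z):
    return z**2 + np.sin(z)

h = 1e-5
df_num = (f(1.0 + h) - f(1.0 - h)) / (2 * h)
print(df_num)

# 3. Automatic differentiation (JAX used here only as an illustration).
def f_jax(z):
    return z**2 + jnp.sin(z)

df_auto = grad(f_jax)(1.0)
print(df_auto)
```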
Lesson 2 - Optimization
1 Assignment
- Derivatives and Optimization
6 Videos
- Introduction to optimization
- Optimization of squared loss - The one powerline problem
- Optimization of squared loss - The two powerline problem
- Optimization of squared loss - The three powerline problem
- Optimization of log-loss - Part 1
- Optimization of log-loss - Part 2
1 Programming Assignment
- Optimizing Functions of One Variable: Cost Minimization
2 Readings
- (Optional) Assignment Troubleshooting Tips
- (Optional) Partial Grading for Assignments
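The "powerline" videos and the "Optimizing Functions of One Variable: Cost Minimization" assignment above revolve around minimizing a one-variable squared-loss cost. A minimal sketch of that idea with made-up data points: the analytic minimizer of a sum of squared distances is the mean, which a numerical minimizer confirms.

```python
# Minimal sketch of one-variable squared-loss minimization (made-up data).
# Cost: C(x) = sum_i (x - a_i)**2.  Setting C'(x) = 2 * sum_i (x - a_i) = 0
# gives x* = mean(a); we confirm this with a numerical minimizer.

import numpy as np
from scipy.optimize import minimize_scalar

a = np.array([1.0, 2.5, 4.0])           # hypothetical observation points

def cost(x):
    return np.sum((x - a) ** 2)

analytic_min = a.mean()                  # from C'(x) = 0
numeric_min = minimize_scalar(cost).x    # Brent's method

print(analytic_min, numeric_min)         # both approximately 2.5
```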
Week 1 Wrap Up
1 Video
- Week 1 - Conclusion
1 Reading
- Week 1 - Slides
Lesson 1 - Gradients
1 Assignment
- Partial Derivatives and Gradient
7 Videos
- Introduction to Tangent planes
- Partial derivatives - Part 1
- Partial derivatives - Part 2
- Gradients
- Gradients and maxima/minima
- Optimization with gradients: An example
- Optimization using gradients - Analytical method
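The analytical method covered in this lesson amounts to computing the partial derivatives, forming the gradient, and solving gradient = 0. A minimal sketch with an arbitrary two-variable function (not one from the course):

```python
# Minimal sketch of the analytical method: compute the gradient of a
# two-variable function and solve grad f = 0 for its critical point.
# The function f(x, y) = x**2 + 3*y**2 + 2*x*y + x is an arbitrary example.

import sympy as sp

x, y = sp.symbols("x y")
f = x**2 + 3*y**2 + 2*x*y + x

grad_f = [sp.diff(f, x), sp.diff(f, y)]         # partial derivatives
critical = sp.solve(grad_f, [x, y], dict=True)  # solve grad f = 0

print(grad_f)     # [2*x + 2*y + 1, 2*x + 6*y]
print(critical)   # [{x: -3/4, y: 1/4}]
```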
Lesson 2 - Gradient Descent
- Optimization Using Gradient Descent: Linear Regression
1 Assignment
- Partial Derivatives and Gradient Descent
2 Labs
- Optimization Using Gradient Descent in One Variable
- Optimization Using Gradient Descent in Two Variables
7 Videos
- Optimization using Gradient Descent in one variable - Part 1
- Optimization using Gradient Descent in one variable - Part 2
- Optimization using Gradient Descent in one variable - Part 3
- Optimization using Gradient Descent in two variables - Part 1
- Optimization using Gradient Descent in two variables - Part 2
- Optimization using Gradient Descent - Least squares
- Optimization using Gradient Descent - Least squares with multiple observations
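The least-squares videos above lead to gradient descent on a cost built from multiple observations. A minimal sketch of gradient descent for simple linear regression, with synthetic data and an arbitrary learning rate:

```python
# Minimal sketch of gradient descent for least-squares linear regression
# y = m*x + b, with made-up data and an arbitrary learning rate.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.normal(size=50)   # "true" slope 2, intercept 1

m, b = 0.0, 0.0
lr = 0.1
for _ in range(2000):
    err = m * x + b - y                  # residuals
    grad_m = 2 * np.mean(err * x)        # dL/dm for L = mean squared error
    grad_b = 2 * np.mean(err)            # dL/db
    m -= lr * grad_m
    b -= lr * grad_b

print(m, b)                              # approximately 2.0 and 1.0
```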
Week 2 Wrap Up
1 Video
- Week 2 - Conclusion
1 Reading
- Week 2 - Slides
Lesson 1 - Optimization in Neural Networks
1 Assignment
- Optimization in Neural Networks
2 Labs
- Regression with Perceptron
- Classification with Perceptron
10 Videos
- Regression with a perceptron
- Regression with a perceptron - Loss function
- Regression with a perceptron - Gradient Descent
- Classification with Perceptron
- Classification with Perceptron - The sigmoid function
- Classification with Perceptron - Gradient Descent
- Classification with Perceptron - Calculating the derivatives
- Classification with a Neural Network
- Classification with a Neural Network - Minimizing log-loss
- Gradient Descent and Backpropagation
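The classification videos above combine a single perceptron, the sigmoid activation, log-loss, and gradient descent. A minimal sketch of that pipeline on a made-up, linearly separable dataset (the data, learning rate, and iteration count are arbitrary):

```python
# Minimal sketch of a single-perceptron classifier: sigmoid output,
# log-loss, and plain gradient descent on made-up data.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)   # toy, linearly separable labels

w = np.zeros(2)
b = 0.0
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(500):
    p = sigmoid(X @ w + b)                  # predicted probabilities
    # Gradients of the mean log-loss with respect to w and b.
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(w, b, accuracy)
```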
Lesson 2 - Newton's Method
- Neural Network with Two Layers
1 Assignment
- Optimization in Neural Networks and Newton's Method
1 Lab
- Optimization Using Newton's Method
6 Videos
- Newton's Method
- Newton's Method: An example
- The second derivative
- The Hessian
- Hessians and concavity
- Newton's Method for two variables
1 Reading
- [IMPORTANT] Reminder about end of access to Lab Notebooks
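Newton's method for two variables, as covered in this lesson, replaces the learning-rate step of gradient descent with a step through the inverse Hessian. A minimal sketch with an arbitrary example function and starting point (not taken from the course):

```python
# Minimal sketch of Newton's method in two variables:
# v_new = v - H(v)^{-1} grad f(v), for f(x, y) = x**4 + y**2 + x*y.

import numpy as np

def gradient(v):
    x, y = v
    return np.array([4 * x**3 + y, 2 * y + x])

def hessian(v):
    x, y = v
    return np.array([[12 * x**2, 1.0],
                     [1.0,       2.0]])

v = np.array([1.0, 1.0])                 # arbitrary starting point
for _ in range(10):
    v = v - np.linalg.solve(hessian(v), gradient(v))

print(v, gradient(v))                    # gradient is approximately 0 here
```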
Week 3 Wrap Up
1 Video
- Week 3 - Conclusion
1 Reading
- Week 3 - Slides
Acknowledgments & Course Resources
2 Readings
- Acknowledgments
- (Optional) Opportunity to Mentor Other Learners
Auto Summary
Join "Calculus for Machine Learning and Data Science," a foundational course by DeepLearning.AI and instructor Luis Serrano. Perfect for those with basic to intermediate Python skills, this program dives into optimizing machine learning functions using derivatives, gradients, and iterative methods. With hands-on Python labs, you'll visually interpret and perform gradient descent in neural networks. Ideal for aspiring or current machine learning engineers and data scientists needing a math boost. Accessible via Coursera with Starter and Professional subscription options, this 1560-hour course ensures a comprehensive learning experience.

Luis Serrano