- Level: Professional
- Duration: 24 hours
- Offered by DeepLearning.AI

About
In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will learn the best practices to train and develop test sets and analyze bias/variance for building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence; and implement a neural network in TensorFlow.

The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.

Modules
Setting up your Machine Learning Application
3 Videos
- Train / Dev / Test sets
- Bias / Variance
- Basic Recipe for Machine Learning
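
To make the train/dev/test idea concrete, here is a minimal NumPy sketch (my own illustration, not course material) that shuffles a dataset and carves out small dev and test sets, along the lines of the large-dataset ratios discussed in the lectures; the 1%/1% split and the synthetic data are assumptions for the example.

```python
import numpy as np

def train_dev_test_split(X, Y, dev_frac=0.01, test_frac=0.01, seed=0):
    """Shuffle examples and split into train/dev/test sets.
    X: (m, n_features), Y: (m,) -- illustrative shapes, not the course's."""
    m = X.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.permutation(m)
    n_dev, n_test = int(m * dev_frac), int(m * test_frac)
    dev_idx = idx[:n_dev]
    test_idx = idx[n_dev:n_dev + n_test]
    train_idx = idx[n_dev + n_test:]
    return (X[train_idx], Y[train_idx]), (X[dev_idx], Y[dev_idx]), (X[test_idx], Y[test_idx])

# Example: 10,000 synthetic examples with 20 features
X = np.random.randn(10_000, 20)
Y = (X[:, 0] > 0).astype(int)
train, dev, test = train_dev_test_split(X, Y)
print([split[0].shape[0] for split in (train, dev, test)])  # [9800, 100, 100]
```
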
Connect with your Mentors and Fellow Learners on our Forum!
1 Reading
- [IMPORTANT] Have questions, issues or ideas? Join our Forum!
Regularizing your Neural Network
5 Videos
- Regularization
- Why Regularization Reduces Overfitting?
- Dropout Regularization
- Understanding Dropout
- Other Regularization Methods
2 Readings
- Clarification about Upcoming Regularization Video
- Clarification about Upcoming Understanding Dropout Video
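
As a rough sketch of the two techniques covered in this module (illustrative only, not the course's assignment code): the L2 penalty adds lambda/(2m) times the sum of squared weights to the cost, and inverted dropout zeroes activations with probability 1 − keep_prob and rescales by keep_prob so the expected activation is unchanged. The shapes and keep_prob value below are assumptions.

```python
import numpy as np

def l2_cost_term(weights, lambd, m):
    """L2 penalty: (lambd / (2*m)) * sum of squared weights over all layers."""
    return (lambd / (2 * m)) * sum(np.sum(np.square(W)) for W in weights)

def inverted_dropout(A, keep_prob, rng):
    """Inverted dropout: zero units with prob (1 - keep_prob), then rescale by keep_prob."""
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return (A * mask) / keep_prob, mask  # the mask is reused in backprop

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 4))                 # activations: (units, batch) layout
A_drop, mask = inverted_dropout(A, keep_prob=0.8, rng=rng)
W1, W2 = rng.standard_normal((5, 3)), rng.standard_normal((1, 5))
print(l2_cost_term([W1, W2], lambd=0.7, m=4))
```
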
Setting Up your Optimization Problem
6 Videos
- Normalizing Inputs
- Vanishing / Exploding Gradients
- Weight Initialization for Deep Networks
- Numerical Approximation of Gradients
- Gradient Checking
- Gradient Checking Implementation Notes
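
Gradient checking can be summarized in a few lines: approximate each partial derivative with the two-sided difference (J(θ+ε) − J(θ−ε)) / (2ε) and compare against the backprop gradient with a relative-difference norm. The sketch below is my own minimal version, with a toy cost function standing in for a real network.

```python
import numpy as np

def gradient_check(J, theta, grad, epsilon=1e-7):
    """Compare an analytic gradient `grad` against a two-sided numerical
    approximation of dJ/dtheta, returning the relative difference."""
    approx = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += epsilon
        minus[i] -= epsilon
        approx[i] = (J(plus) - J(minus)) / (2 * epsilon)
    num = np.linalg.norm(approx - grad)
    denom = np.linalg.norm(approx) + np.linalg.norm(grad)
    return num / denom  # roughly < 1e-7 suggests the analytic gradient is correct

# Example: J(theta) = sum(theta**2), whose analytic gradient is 2*theta
theta = np.array([1.0, -2.0, 3.0])
print(gradient_check(lambda t: np.sum(t ** 2), theta, 2 * theta))
```
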
Lecture Notes (Optional)
1 Reading
- Lecture Notes W1
Quiz
1 Assignment
- Practical aspects of Deep Learning
Programming Assignments
- Initialization
- Regularization
- Gradient Checking
1 Reading
- (Optional) Downloading your Notebook, Downloading your Workspace and Refreshing your Workspace
Heroes of Deep Learning (Optional)
1 Video
- Yoshua Bengio Interview
Optimization Algorithms
10 Videos
- Mini-batch Gradient Descent
- Understanding Mini-batch Gradient Descent
- Exponentially Weighted Averages
- Understanding Exponentially Weighted Averages
- Bias Correction in Exponentially Weighted Averages
- Gradient Descent with Momentum
- RMSprop
- Adam Optimization Algorithm
- Learning Rate Decay
- The Problem of Local Optima
2 Readings
- Clarification about Upcoming Adam Optimization Video
- Clarification about Learning Rate Decay Video
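
A compact way to see how these pieces fit together is a single Adam update, which combines a momentum-style exponentially weighted average of the gradients with an RMSprop-style average of the squared gradients, plus bias correction. The sketch below (my own toy example, not the course's assignment) applies it to a noisy quadratic standing in for mini-batch gradients.

```python
import numpy as np

def adam_update(w, dw, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step on a single parameter array.
    v: momentum term (EWA of gradients), s: RMSprop term (EWA of squared gradients),
    t: 1-based step count used for bias correction."""
    v = beta1 * v + (1 - beta1) * dw
    s = beta2 * s + (1 - beta2) * (dw ** 2)
    v_hat = v / (1 - beta1 ** t)          # bias correction
    s_hat = s / (1 - beta2 ** t)
    w = w - lr * v_hat / (np.sqrt(s_hat) + eps)
    return w, v, s

# Toy use: minimize f(w) = ||w||^2 with mini-batch-style noisy gradients
rng = np.random.default_rng(0)
w = rng.standard_normal(3)
v, s = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 501):
    dw = 2 * w + 0.01 * rng.standard_normal(3)   # noisy gradient of ||w||^2
    w, v, s = adam_update(w, dw, v, s, t, lr=0.05)
print(w)  # close to zero
```
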
Lecture Notes (Optional)
1 Reading
- Lecture Notes W2
Quiz
1 Assignment
- Optimization Algorithms
Programming Assignment
- Optimization Methods
Heroes of Deep Learning (Optional)
1 Video
- Yuanqing Lin Interview
Hyperparameter Tuning
3 Videos
- Tuning Process
- Using an Appropriate Scale to pick Hyperparameters
- Hyperparameters Tuning in Practice: Pandas vs. Caviar
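
The "appropriate scale" idea can be shown in a few lines: sample the learning rate uniformly in log space (for instance between 1e-4 and 1), and sample beta for exponentially weighted averages uniformly in the log space of 1 − beta. The ranges below are illustrative assumptions, not prescriptions from the course.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_learning_rate(low_exp=-4, high_exp=0):
    """Sample alpha uniformly on a log scale, e.g. between 1e-4 and 1."""
    r = rng.uniform(low_exp, high_exp)
    return 10.0 ** r

def sample_beta(low=0.9, high=0.999):
    """Sample a momentum/EWA beta uniformly in the log space of (1 - beta)."""
    r = rng.uniform(np.log10(1 - high), np.log10(1 - low))  # here: [-3, -1]
    return 1 - 10.0 ** r

print([round(sample_learning_rate(), 5) for _ in range(3)])
print([round(sample_beta(), 4) for _ in range(3)])
```
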
Batch Normalization
4 Videos
- Normalizing Activations in a Network
- Fitting Batch Norm into a Neural Network
- Why does Batch Norm work?
- Batch Norm at Test Time
1 Reading
- Clarification about Upcoming Normalizing Activations in a Network Video
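
A minimal forward-pass sketch of batch normalization at training time (my own illustration, not the course's code): normalize each unit's pre-activations over the mini-batch, then scale and shift with the learnable parameters gamma and beta. At test time, running averages of the mini-batch mean and variance would be used instead; that part is omitted here, and the (units, examples) array layout is an assumption.

```python
import numpy as np

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    """Batch norm at training time: normalize each unit over the mini-batch,
    then scale and shift with learnable gamma and beta. Z: (n_units, batch_size)."""
    mu = np.mean(Z, axis=1, keepdims=True)
    var = np.var(Z, axis=1, keepdims=True)
    Z_norm = (Z - mu) / np.sqrt(var + eps)
    return gamma * Z_norm + beta

rng = np.random.default_rng(0)
Z = rng.standard_normal((4, 32)) * 5 + 3          # pre-activations with shifted scale
gamma, beta = np.ones((4, 1)), np.zeros((4, 1))
Z_tilde = batchnorm_forward(Z, gamma, beta)
print(Z_tilde.mean(axis=1).round(3), Z_tilde.std(axis=1).round(3))  # ~0 and ~1 per unit
```
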
Multi-class Classification
2 Videos
- Softmax Regression
- Training a Softmax Classifier
1 Reading
- Clarifications about Upcoming Softmax Video
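
The two videos above boil down to a numerically stable softmax over the class dimension and a cross-entropy loss against one-hot labels. Below is a minimal NumPy sketch (illustrative only; the classes-by-examples array layout and toy inputs are assumptions).

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the class dimension (rows = classes)."""
    z_shift = z - np.max(z, axis=0, keepdims=True)
    expz = np.exp(z_shift)
    return expz / np.sum(expz, axis=0, keepdims=True)

def cross_entropy(probs, y_onehot):
    """Mean cross-entropy loss: -(1/m) * sum over classes and examples of y * log(p)."""
    m = y_onehot.shape[1]
    return -np.sum(y_onehot * np.log(probs + 1e-12)) / m

z = np.array([[2.0, 0.5], [1.0, 0.1], [0.1, 3.0]])   # 3 classes, 2 examples
y = np.array([[1, 0], [0, 0], [0, 1]])               # one-hot labels
p = softmax(z)
print(p.sum(axis=0))          # each column sums to 1
print(cross_entropy(p, y))
```
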
Introduction to Programming Frameworks
2 Videos
- Deep Learning Frameworks
- TensorFlow
1 Reading
- (Optional) Learn about Gradient Tape and More
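
As a hint of what the TensorFlow material looks like in practice, here is a minimal TensorFlow 2 sketch (my own toy example, not the course's notebook) that uses tf.GradientTape to differentiate a simple cost and an Adam optimizer to minimize it; the cost (w − 5)², the learning rate, and the step count are arbitrary choices for illustration.

```python
import tensorflow as tf

# A single trainable parameter and a simple cost: (w - 5)^2
w = tf.Variable(0.0, dtype=tf.float32)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.1)

def cost_fn():
    return (w - 5.0) ** 2

for step in range(200):
    with tf.GradientTape() as tape:      # records ops for automatic differentiation
        cost = cost_fn()
    grads = tape.gradient(cost, [w])     # dcost/dw
    optimizer.apply_gradients(zip(grads, [w]))

print(w.numpy())   # approaches 5.0
```
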
Lecture Notes (Optional)
1 Reading
- Lecture Notes W3
Quiz
1 Assignment
- Hyperparameter tuning, Batch Normalization, Programming Frameworks
End of access to Lab Notebooks
1 Reading
- [IMPORTANT] Reminder about end of access to Lab Notebooks
Programming Assignment
- TensorFlow Introduction
References & Acknowledgments
2 Readings
- References
- Acknowledgments
Auto Summary
Unlock the secrets of deep learning with this comprehensive course on hyperparameter tuning, regularization, and optimization. Led by expert instructors from DeepLearning.AI, this professional-level program dives into neural network techniques, optimization algorithms, and practical implementation in TensorFlow. Perfect for data science and AI enthusiasts, the course spans 24 hours and is available with a Starter subscription. Join now to elevate your technical skills and contribute to cutting-edge AI technology.

Andrew Ng

Kian Katanforoosh

Younes Bensouda Mourri