- Level: Professional
- Duration: 31 hours
- Offered by DeepLearning.AI

About
In Course 2 of the Natural Language Processing Specialization, you will: a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming, b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics, c) Write a better auto-complete algorithm using an N-gram language model, and d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Modules
Lecture: Autocorrect and Minimum Edit Distance
2 Labs
- Lecture notebook: Building the vocabulary
- Lecture notebook: Candidates from edits
11 Videos
- Intro to Course 2
- Week Introduction
- Overview
- Autocorrect
- Building the model
- Building the model II
- Minimum edit distance
- Minimum edit distance algorithm
- Minimum edit distance algorithm II
- Minimum edit distance algorithm III
- Week Conclusion
9 Readings
- Overview
- Autocorrect
- Building the model
- Building the model II
- Minimum edit distance
- Minimum edit distance algorithm
- Minimum edit distance algorithm II
- Minimum edit distance algorithm III
- [IMPORTANT] Have questions, issues or ideas? Join our Forum!
Lecture Notes (Optional)
1 Reading
- Lecture Notes W1
Quiz: Auto-correct and Minimum Edit Distance
1 Assignment
- Auto-correct and Minimum Edit Distance
Assignment: Autocorrect
- Autocorrect
1 Reading
- (Optional) Downloading your Notebook, Downloading your Workspace and Refreshing your Workspace
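As a companion to this module, here is a minimal sketch of minimum edit distance computed with dynamic programming, the technique behind the auto-correct assignment. The function name and the costs (insert 1, delete 1, replace 2) are illustrative assumptions, not necessarily the assignment's exact specification.

```python
# Minimal sketch of minimum edit distance via dynamic programming.
# The costs below (insert = 1, delete = 1, replace = 2) are assumptions
# for illustration, not necessarily the assignment's exact settings.
def min_edit_distance(source: str, target: str,
                      ins_cost: int = 1, del_cost: int = 1,
                      rep_cost: int = 2) -> int:
    m, n = len(source), len(target)
    # D[i][j] = minimum cost to convert source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):          # delete everything in source
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):          # insert everything into source
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            r = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,   # delete
                          D[i][j - 1] + ins_cost,   # insert
                          D[i - 1][j - 1] + r)      # replace or keep
    return D[m][n]

print(min_edit_distance("play", "stay"))  # 4 under these costs
```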
Lecture: Part of Speech Tagging
2 Labs
- Lecture Notebook - Working with text files
- Lecture Notebook - Working with tags and Numpy
13 Videos
- Week Introduction
- Part of Speech Tagging
- Markov Chains
- Markov Chains and POS Tags
- Hidden Markov Models
- Calculating Probabilities
- Populating the Transition Matrix
- Populating the Emission Matrix
- The Viterbi Algorithm
- Viterbi: Initialization
- Viterbi: Forward Pass
- Viterbi: Backward Pass
- Week Conclusion
11 Readings
- Part of Speech Tagging
- Markov Chains
- Markov Chains and POS Tags
- Hidden Markov Models
- Calculating Probabilities
- Populating the Transition Matrix
- Populating the Emission Matrix
- The Viterbi Algorithm
- Viterbi Initialization
- Viterbi: Forward Pass
- Viterbi: Backward Pass
Lecture Notes (Optional)
1 Reading
- Lecture Notes W2
Practice Quiz
1 Assignment
- Part of Speech Tagging
Assignment: Part of Speech Tagging
- Part of Speech Tagging
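To make the Viterbi material concrete, below is a toy sketch of the algorithm's three stages covered in this module (initialization, forward pass, backward pass) on a hypothetical two-tag hidden Markov model. The tags, transition and emission probabilities, and vocabulary are invented for illustration, not taken from the assignment.

```python
import numpy as np

# Toy HMM: hypothetical tags and probabilities for illustration only.
tags = ["NN", "VB"]
pi = np.array([0.6, 0.4])            # initial tag distribution
A = np.array([[0.7, 0.3],            # A[i, j] = P(tag_j | tag_i)
              [0.5, 0.5]])
vocab = {"time": 0, "flies": 1}
B = np.array([[0.6, 0.4],            # B[i, k] = P(word_k | tag_i)
              [0.3, 0.7]])

def viterbi(words):
    idx = [vocab[w] for w in words]
    T, N = len(words), len(tags)
    best = np.zeros((N, T))              # best log-prob per tag per step
    back = np.zeros((N, T), dtype=int)   # backpointers for backward pass
    best[:, 0] = np.log(pi) + np.log(B[:, idx[0]])    # initialization
    for t in range(1, T):                              # forward pass
        for j in range(N):
            scores = best[:, t - 1] + np.log(A[:, j]) + np.log(B[j, idx[t]])
            back[j, t] = np.argmax(scores)
            best[j, t] = np.max(scores)
    path = [int(np.argmax(best[:, -1]))]               # backward pass
    for t in range(T - 1, 0, -1):
        path.append(back[path[-1], t])
    return [tags[i] for i in reversed(path)]

print(viterbi(["time", "flies"]))
```

Working in log space, as here, avoids the numerical underflow that comes from multiplying many small probabilities.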
Lecture: Autocomplete
3 Labs
- Lecture notebook: Corpus preprocessing for N-grams
- Lecture notebook: Building the language model
- Lecture notebook: Language model generalization
11 Videos
- Week Introduction
- N-Grams: Overview
- N-grams and Probabilities
- Sequence Probabilities
- Starting and Ending Sentences
- The N-gram Language Model
- Language Model Evaluation
- Out of Vocabulary Words
- Smoothing
- Week Summary
- Week Conclusion
9 Readings
- N-Grams Overview
- N-grams and Probabilities
- Sequence Probabilities
- Starting and Ending Sentences
- The N-gram Language Model
- Language Model Evaluation
- Out of Vocabulary Words
- Smoothing
- Week Summary
Lecture Notes (Optional)
1 Reading
- Lecture Notes W3
Practice Quiz
1 Assignment
- Autocomplete
Assignment: Autocomplete
- Autocomplete
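The following is a minimal sketch of the bigram case of the N-gram language model this module builds, including sentence-boundary markers and add-k smoothing. The `<s>`/`</s>` tokens, k = 1, and the toy corpus are assumptions for illustration, not the assignment's exact setup.

```python
from collections import Counter

# Minimal bigram language model with add-k smoothing.
corpus = [["i", "like", "nlp"], ["i", "like", "dogs"]]
sents = [["<s>"] + s + ["</s>"] for s in corpus]   # sentence boundaries

unigram_counts = Counter(w for s in sents for w in s)
bigram_counts = Counter((s[i], s[i + 1]) for s in sents
                        for i in range(len(s) - 1))
V = len(unigram_counts)   # vocabulary size, including boundary markers
k = 1                     # add-k smoothing constant (assumed k = 1)

def prob(word, prev):
    """P(word | prev) with add-k smoothing."""
    return (bigram_counts[(prev, word)] + k) / (unigram_counts[prev] + k * V)

# Autocomplete: rank candidate next words after "like"
candidates = sorted(unigram_counts, key=lambda w: prob(w, "like"), reverse=True)
print(candidates[:3])
```

Smoothing gives every bigram a nonzero probability, so the model can still score sequences containing word pairs it never saw in training.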
Lecture: Word Embeddings
5 Labs
- Lecture Notebook - Data Preparation
- Lecture Notebook - Intro to CBOW model
- Lecture Notebook - Training the CBOW model
- Lecture Notebook - Word Embeddings
- Lecture notebook: Word embeddings step by step
22 Videos
- Week Introduction
- Overview
- Basic Word Representations
- Word Embeddings
- How to Create Word Embeddings
- Word Embedding Methods
- Continuous Bag-of-Words Model
- Cleaning and Tokenization
- Sliding Window of Words in Python
- Transforming Words into Vectors
- Architecture of the CBOW Model
- Architecture of the CBOW Model: Dimensions
- Architecture of the CBOW Model: Dimensions 2
- Architecture of the CBOW Model: Activation Functions
- Training a CBOW Model: Cost Function
- Training a CBOW Model: Forward Propagation
- Training a CBOW Model: Backpropagation and Gradient Descent
- Extracting Word Embedding Vectors
- Evaluating Word Embeddings: Intrinsic Evaluation
- Evaluating Word Embeddings: Extrinsic Evaluation
- Conclusion
- Week Conclusion
20 Readings
- Overview
- Basic Word Representations
- Word Embeddings
- How to Create Word Embeddings?
- Word Embedding Methods
- Continuous Bag of Words Model
- Cleaning and Tokenization
- Sliding Window of words in Python
- Transforming Words into Vectors
- Architecture for the CBOW Model
- Architecture of the CBOW Model: Dimensions
- Architecture of the CBOW Model: Dimensions 2
- Architecture of the CBOW Model: Activation Functions
- Training a CBOW Model: Cost Function
- Training a CBOW Model: Forward Propagation
- Training a CBOW Model: Backpropagation and Gradient Descent
- Extracting Word Embedding Vectors
- Evaluating Word Embeddings: Intrinsic Evaluation
- Evaluating Word Embeddings: Extrinsic Evaluation
- Conclusion
Lecture Notes (Optional)
1 Reading
- Lecture Notes W4
Practice Quiz
1 Assignment
- Word Embeddings
End of access to Lab Notebooks
1 Reading
- [IMPORTANT] Reminder about end of access to Lab Notebooks
Assignment: Word Embeddings
- Word Embeddings
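Below is a bare-bones sketch of a CBOW training loop in the spirit of this module: average one-hot context vectors in, a ReLU hidden layer, softmax output, cross-entropy loss, plain gradient descent. The toy corpus, dimensions, learning rate, and the choice to average W1 and W2 for the final embeddings are illustrative assumptions, not the assignment's reference implementation.

```python
import numpy as np

# Bare-bones CBOW sketch: predict the center word from averaged context.
corpus = "i like learning nlp because nlp is fun".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, N, C = len(vocab), 10, 2      # vocab size, embedding dim, half-window

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(N, V)) * 0.1, np.zeros((N, 1))
W2, b2 = rng.normal(size=(V, N)) * 0.1, np.zeros((V, 1))

def one_hot(i):
    v = np.zeros((V, 1)); v[i] = 1.0; return v

lr = 0.05
for _ in range(200):
    for c in range(C, len(corpus) - C):
        context = corpus[c - C:c] + corpus[c + 1:c + C + 1]
        x = np.mean([one_hot(w2i[w]) for w in context], axis=0)  # avg context
        y = one_hot(w2i[corpus[c]])                              # center word
        h = np.maximum(0, W1 @ x + b1)                           # ReLU
        z = W2 @ h + b2
        yhat = np.exp(z - z.max()); yhat /= yhat.sum()           # softmax
        dz = yhat - y                       # gradient of cross-entropy loss
        dh = (W2.T @ dz) * (h > 0)
        W2 -= lr * dz @ h.T; b2 -= lr * dz
        W1 -= lr * dh @ x.T; b1 -= lr * dh

# One common choice: average the input and output weight matrices
embeddings = (W1.T + W2) / 2
print(embeddings[w2i["nlp"]][:5])
```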
Acknowledgments
1 Reading
- Acknowledgments
Auto Summary
Explore "Natural Language Processing with Probabilistic Models," a professional-level course in Data Science & AI. Taught by Stanford's Younes Bensouda Mourri and Google's Łukasz Kaiser, this Coursera offering covers auto-correct algorithms, the Viterbi Algorithm for POS tagging, N-gram language models, and Word2Vec. Ideal for learners aiming to master NLP applications like question-answering and sentiment analysis. Available through Starter, Professional, and Paid subscriptions, the course spans approximately 31 hours.

Instructors

Younes Bensouda Mourri

Łukasz Kaiser