- Level: Professional
- Duration: 35 hours
- Offered by: DeepLearning.AI

About
In Course 4 of the Natural Language Processing Specialization, you will: a) translate complete English sentences into German using an encoder-decoder attention model, b) build a Transformer model to summarize text, c) use T5 and BERT models to perform question-answering, and d) build a chatbot using a Reformer model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Modules
Neural Machine Translation
1 External Tool
- [IMPORTANT] Have questions, issues or ideas? Join our Community!
3 Labs
- Ungraded Lab: Basic Attention
- Ungraded Lab: Scaled Dot-Product Attention (see the sketch after this list)
- Ungraded Lab: BLEU Score
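For orientation, the sketch below shows the computation at the heart of the Scaled Dot-Product Attention lab: Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. The course builds this in TensorFlow/Trax; this standalone NumPy version, and its toy shapes, are illustrative assumptions rather than course code.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)  # query-key similarities
    if mask is not None:
        scores = np.where(mask, scores, -1e9)       # blocked positions get ~0 weight
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ v                              # weighted sum of values

# Toy example: 2 queries attending over 3 key/value pairs of depth 4.
rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
print(scaled_dot_product_attention(q, k, v).shape)  # (2, 4)
```

The 1/√d_k scaling keeps dot products from growing with depth, which would otherwise push the softmax toward near-one-hot weights.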
14 Videos
- Course 4 Introduction
- Week Introduction
- Seq2seq
- Seq2seq Model with Attention
- Queries, Keys, Values, and Attention
- Setup for Machine Translation
- Teacher Forcing
- NMT Model with Attention
- BLEU Score
- ROUGE-N Score
- Sampling and Decoding
- Beam Search (see the sketch after this list)
- Minimum Bayes Risk
- Week Conclusion
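The decoding videos (Sampling and Decoding, Beam Search, Minimum Bayes Risk) compare ways of turning the model's per-step probabilities into an output sequence. Below is a toy beam search, hedged accordingly: it scores prefixes against a fixed log-probability table, whereas a real NMT decoder re-runs the model at each step conditioned on the prefix decoded so far.

```python
import numpy as np

def beam_search(step_log_probs, beam_width=2):
    """Keep the beam_width highest-scoring prefixes at every step.

    step_log_probs[t] is a (vocab,) array of log-probabilities for step t;
    a real decoder would condition these on the running prefix.
    """
    beams = [([], 0.0)]  # (token sequence, cumulative log-probability)
    for logps in step_log_probs:
        candidates = [(seq + [tok], score + lp)
                      for seq, score in beams
                      for tok, lp in enumerate(logps)]
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams

# Made-up 3-step distribution over a 3-token vocabulary.
table = np.log([[0.6, 0.3, 0.1],
                [0.2, 0.5, 0.3],
                [0.1, 0.1, 0.8]])
for seq, score in beam_search(table):
    print(seq, round(float(score), 3))
```

With beam_width=1 this degenerates to greedy decoding; widening the beam trades compute for a better chance of finding a high-probability sequence.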
2 Readings
- Background on seq2seq
- Content Resource
Lecture Notes (Optional)
1 Reading
- Lecture Notes W1
Practice Quiz
1 Assignment
- Neural Machine Translation
Assignment
- NMT with Attention (TensorFlow)
1 Reading
- (Optional) Downloading your Notebook, Downloading your Workspace and Refreshing your Workspace
Heroes of NLP: Oren Etzioni
1 Video
- Andrew Ng with Oren Etzioni
Text Summarization
3 Labs
- Attention
- Masking
- Positional encoding (see the sketch after this list)
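The Masking and Positional encoding labs implement two small but essential Transformer ingredients. Here is a minimal NumPy sketch of both, assuming the sinusoidal encodings from "Attention Is All You Need" and a standard look-ahead mask; the course's own Trax implementations differ in detail.

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """PE[pos, 2i] = sin(pos / 10000^(2i/d)),  PE[pos, 2i+1] = cos(...)."""
    pos = np.arange(max_len)[:, None]     # (max_len, 1)
    i = np.arange(d_model // 2)[None, :]  # (1, d_model/2)
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)          # even dimensions
    pe[:, 1::2] = np.cos(angles)          # odd dimensions
    return pe

def causal_mask(n):
    """Look-ahead mask: position i may only attend to positions <= i."""
    return np.tril(np.ones((n, n), dtype=bool))

print(positional_encoding(50, 16).shape)  # (50, 16)
print(causal_mask(4).astype(int))         # lower-triangular 0/1 matrix
```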
10 Videos
- Week Introduction
- Transformers vs RNNs
- Transformers overview
- Transformer Applications
- Scaled and Dot-Product Attention
- Masked Self Attention
- Multi-head Attention
- Transformer Decoder
- Transformer Summarizer
- Week Conclusion
5 Readings
- Transformers vs RNNs
- Transformer Applications
- Multi-head Attention
- Transformer Decoder
- Content Resource
Lecture Notes (Optional)
1 Reading
- Lecture Notes W2
Practice Quiz
1 Assignment
- Text Summarization
Assignment
- Transformer Summarizer
Question Answering
1 Lab
- SentencePiece and BPE (see the sketch after this list)
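The lab contrasts SentencePiece with byte-pair encoding (BPE). As a rough illustration of the BPE training loop (repeatedly merge the corpus's most frequent adjacent symbol pair), here is a self-contained toy in plain Python; the three-word corpus and the number of merges are made-up examples.

```python
from collections import Counter

def most_frequent_pair(vocab):
    """Count adjacent symbol pairs across a {word-as-tuple: frequency} vocab."""
    pairs = Counter()
    for word, freq in vocab.items():
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(pair, vocab):
    """Rewrite every word, replacing occurrences of `pair` with one symbol."""
    merged = {}
    for word, freq in vocab.items():
        out, i = [], 0
        while i < len(word):
            if i + 1 < len(word) and (word[i], word[i + 1]) == pair:
                out.append(word[i] + word[i + 1])
                i += 2
            else:
                out.append(word[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Words start as character sequences with an end-of-word marker.
vocab = {tuple("low") + ("</w>",): 5,
         tuple("lower") + ("</w>",): 2,
         tuple("newest") + ("</w>",): 6}
for _ in range(3):  # learn 3 merges
    pair = most_frequent_pair(vocab)
    vocab = merge_pair(pair, vocab)
    print("merged:", pair)
```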
10 Videos
- Week Introduction
- Week 3 Overview
- Transfer Learning in NLP
- ELMo, GPT, BERT, T5
- Bidirectional Encoder Representations from Transformers (BERT)
- BERT Objective (see the sketch after this list)
- Fine tuning BERT
- Transformer: T5
- Multi-Task Training Strategy
- GLUE Benchmark
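The BERT Objective videos cover masked language modeling (MLM). The sketch below illustrates the standard corruption recipe: roughly 15% of tokens become prediction targets, and of those, 80% are replaced with [MASK], 10% with a random token, and 10% left unchanged. The token IDs, mask ID, and vocabulary size here are arbitrary placeholders, not values from the course.

```python
import numpy as np

def mask_tokens(token_ids, mask_id, vocab_size, p=0.15, seed=0):
    """BERT-style MLM corruption with the 80/10/10 split."""
    rng = np.random.default_rng(seed)
    ids = np.array(token_ids)
    labels = np.full_like(ids, -100)      # -100 = position ignored by the loss
    targets = rng.random(ids.shape) < p   # ~15% of positions become targets
    labels[targets] = ids[targets]        # the model must recover these
    roll = rng.random(ids.shape)
    ids[targets & (roll < 0.8)] = mask_id              # 80% -> [MASK]
    rand = targets & (roll >= 0.8) & (roll < 0.9)      # 10% -> random token
    ids[rand] = rng.integers(0, vocab_size, rand.sum())
    return ids, labels                    # remaining 10% stay unchanged

ids, labels = mask_tokens([12, 7, 99, 3, 42, 8, 15, 23],
                          mask_id=103, vocab_size=30522)
print(ids)
print(labels)
```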
9 Readings
- Week 3 Overview
- Transfer Learning in NLP
- ELMo, GPT, BERT, T5
- Bidirectional Encoder Representations from Transformers (BERT)
- BERT Objective
- Fine tuning BERT
- Transformer T5
- Multi-Task Training Strategy
- GLUE Benchmark
Hugging Face
2 Labs
- Question Answering with HuggingFace - Using a base model (see the sketch after this list)
- Question Answering with HuggingFace 2 - Fine-tuning a model
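Both labs use the Hugging Face transformers library. A minimal usage sketch of its question-answering pipeline follows; running it downloads a default extractive-QA checkpoint, and the context and question are illustrative rather than the lab's data.

```python
# Requires: pip install transformers (plus a backend such as PyTorch).
from transformers import pipeline

# With no model specified, pipeline() loads a default SQuAD-style QA model.
qa = pipeline("question-answering")

context = ("The Transformer architecture was introduced in the 2017 paper "
           "'Attention Is All You Need' by researchers at Google.")
result = qa(question="When was the Transformer introduced?", context=context)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': '2017'}
```

Fine-tuning, the second lab's topic, swaps the default checkpoint for one you train yourself on your own question-answer data.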
5 Videos
- Hugging Face Introduction
- Hugging Face I
- Hugging Face II
- Hugging Face III
- Week Conclusion
2 Readings
- Welcome to Hugging Face 🤗
- Content Resource
Lecture Notes (Optional)
1 Reading
- Lecture Notes W3
Practice Quiz
1 Assignment
- Question Answering
Assignment
- Question Answering
Heroes of NLP: Quoc Le
1 Video
- Andrew Ng with Quoc Le
Acknowledgments & Course Resources
3 Readings
- Acknowledgments
- References
- (Optional) Opportunity to Mentor Other Learners
Auto Summary
"Natural Language Processing with Attention Models" is an advanced course in Data Science and AI that delves into the intricacies of NLP using cutting-edge attention models. This course is the fourth installment in the Natural Language Processing Specialization and is expertly designed to help learners: - Translate English sentences into German using an encoder-decoder attention model. - Construct a Transformer model for text summarization. - Utilize T5 and BERT models for question-answering tasks. - Develop a chatbot with a Reformer model. Upon completion, participants will have mastered the creation of sophisticated NLP applications, including tools for question-answering, sentiment analysis, language translation, text summarization, and chatbot development. The course is taught by two distinguished experts: Younes Bensouda Mourri, an AI Instructor at Stanford University, and Łukasz Kaiser, a Staff Research Scientist at Google Brain and co-author of significant contributions to the field like TensorFlow and the Transformer paper. Ideal for professionals with a solid foundation in machine learning, intermediate Python skills, and knowledge of deep learning frameworks (such as TensorFlow or Keras), this course also requires proficiency in calculus, linear algebra, and statistics. Completion of the preceding course, "Natural Language Processing with Sequence Models," is recommended. Offered through Coursera, this comprehensive program spans 2100 minutes of immersive learning and is available under a Starter subscription plan. It is tailored for those ready to elevate their expertise in NLP to a professional level.

Instructors

Younes Bensouda Mourri
Łukasz Kaiser