Lecture slides: https://web.stanford.edu/class/archive/cs/cs224n/cs224n.1184/syllabus.html
- Lecture 1 – Introduction and Word Vectors
- Lecture 2 – Word Vectors and Word Senses
- Lecture 3 – Neural Networks
- Lecture 4 – Backpropagation
- Lecture 5 – Dependency Parsing
- Lecture 6 – Language Models and RNNs
- Lecture 7 – Vanishing Gradients, Fancy RNNs
- Lecture 8 – Translation, Seq2Seq, Attention
- Lecture 9 – Practical Tips for Projects
- Lecture 10 – Question Answering
- Lecture 11 – Convolutional Networks for NLP
- Lecture 12 – Subword Models
- Lecture 13 – Contextual Word Embeddings
- Lecture 14 – Transformers and Self-Attention
- Lecture 15 – Natural Language Generation
- Lecture 16 – Coreference Resolution
- Lecture 17 – Multitask Learning
- Lecture 18 – Constituency Parsing, TreeRNNs
- Lecture 19 – Bias in AI
- Lecture 20 – Future of NLP + Deep Learning