Schedule
Note: The future schedule is subject to minor change.
Assignments
Representation of text
Predicting sequences
Week 4 (Sep 29). Language models: n-gram LM, neural LM, perplexity
Week 5 (Oct 6). Sequence labeling: log-linear models, decoding, POS tagging
Week 6 (Oct 13). Hidden Markov models: HMM, EM
Week 7 (Oct 20). Midterm.
Predicting trees
Week 8 (Oct 27). Context-free parsing: PCFG, CYK, neural parser
Week 9 (Nov 3). Semantic parsing: logical semantics, learning from logical forms / denotations
Deep learning for NLP
Week 10 (Nov 10). Neural sequence modeling: seq2seq, attention, copy mechanism, text generation
Week 11 (Nov 17). Representation learning: transformers, contextualized word embeddings, pre-training and fine-tuning, autoencoders
Beyond text
Week 12 (Nov 24). Language grounding: language + vision/robotics, pragmatics, RL agents
Conclusion
Week 13 (Dec 1). Guest lecture by Nasrin Mostafazadeh: “How far have we come in giving our NLU systems common sense?”
Week 14 (Dec 8). Project presentations