Schedule

Note: The future schedule is subject to minor changes. Please refer to Gradescope for HW due dates.

Introduction

Representation of text

  • Week 2 (Sep 15). Text classification: bag-of-words, naive Bayes models, logistic regression

  • Week 3 (Sep 22). Distributed representation: vector space models, Brown clusters, neural word embeddings

Predicting sequences

  • Week 4 (Sep 29). Language models: n-gram LM, neural LM, perplexity

  • Week 5 (Oct 6). Sequence labeling: log-linear models, decoding, POS tagging

  • Week 6 (Oct 13). Hidden Markov models: HMM, EM

  • Week 7 (Oct 20). Midterm.

Predicting trees

Deep learning for NLP

Beyond text

  • Week 12 (Nov 24). Language grounding: language+vision/robotics, pragmatics, RL agents

Conclusion