Natural Language Processing
Table Of Contents
  • Schedule
  • Coursework
  • Lecture notes
    • 1. Overview
    • 2. Basic machine learning
    • 3. Text classification
    • 4. Distributed word representations
    • 5. Language models
    • 6. Sequence labeling

Lecture notes

  • 1. Overview
    • 1.1. A brief history
    • 1.2. Challenges in NLP
    • 1.3. Course overview
    • 1.4. Additional readings
  • 2. Basic machine learning
    • 2.1. Modeling, learning, inference
    • 2.2. Loss functions and optimization
    • 2.3. Summary
  • 3. Text classification
    • 3.1. An intuitive approach
    • 3.2. Naive Bayes model
    • 3.3. Maximum likelihood estimation
    • 3.4. Logistic regression
    • 3.5. Bag-of-words (BoW) representation
    • 3.6. Feature extractor
    • 3.7. Evaluation
    • 3.8. Additional readings
  • 4. Distributed word representations
    • 4.1. Vector-space models
    • 4.2. Learning word embeddings
    • 4.3. Brown clusters
    • 4.4. Evaluation
    • 4.5. Additional readings
  • 5. Language models
    • 5.1. N-gram language models
    • 5.2. Neural language models
    • 5.3. Evaluation
    • 5.4. Additional reading
  • 6. Sequence labeling
    • 6.1. A multiclass classification approach
    • 6.2. Structured prediction
    • 6.3. Neural sequence labeling
    • 6.4. Applications
    • 6.5. Additional reading