Rababalkhalifa / nlp_course

YSDA course in Natural Language Processing


YSDA Natural Language Processing course

  • This is the 2019 version. For previous years' course materials, go to this branch
  • Lecture and seminar materials for each week are in ./week* folders
  • YSDA homework deadlines will be listed in Anytask (read more).
  • For technical issues, bugs in course materials, or contribution ideas, open an issue
  • Installing libraries and troubleshooting: this thread.

Syllabus

  • week01 Embeddings

    • Lecture: Word embeddings. Distributional semantics, LSA, Word2Vec, GloVe. Why and when we need them.
    • Seminar: Playing with word and sentence embeddings (a pretrained-embedding sketch appears after the syllabus).
  • week02 Text classification

    • Lecture: Text classification. Classical approaches for text representation: BOW, TF-IDF. Neural approaches: embeddings, convolutions, RNNs (a TF-IDF baseline sketch appears after the syllabus).
    • Seminar: Salary prediction with convolutional neural networks; explaining network predictions.
  • week03 Language Models

    • Lecture: Language models: N-gram and neural approaches; visualizing trained models (a bigram-counting sketch appears after the syllabus).
    • Seminar: Generating ArXiv papers with language models
  • week04 Seq2seq/Attention

    • Lecture: Seq2seq: encoder-decoder framework. Attention: Bahdanau model. Self-attention, Transformer. Analysis of attention heads in Transformer (a scaled dot-product attention sketch appears after the syllabus).
    • Seminar: Machine translation of hotel and hostel descriptions
  • week05 Expectation-Maximization

    • Lecture: Expectation-Maximization and Hidden Markov Models (a forward-algorithm sketch appears after the syllabus).
    • Seminar: Implementing expectation maximization
  • week06 Machine Translation

    • Lecture: Word Alignment Models, Noisy Channel, Machine Translation (an IBM Model 1 sketch appears after the syllabus).
    • Seminar: Introduction to word alignment assignment.
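
For week01, a minimal sketch of playing with pretrained word embeddings. It assumes the gensim library and one of its downloadable GloVe models; the model name and the toy queries are illustrative, not part of the course materials:

```python
import gensim.downloader as api

# Downloads the pretrained GloVe vectors on first use; "glove-wiki-gigaword-100"
# is one of gensim's standard pretrained sets.
glove = api.load("glove-wiki-gigaword-100")

# Nearest neighbours in embedding space and pairwise cosine similarity.
print(glove.most_similar("king", topn=5))
print(glove.similarity("cat", "dog"))
```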
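
For week02, a minimal TF-IDF + linear classifier baseline using scikit-learn. The toy texts and labels are placeholders, not the salary-prediction data from the seminar:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus standing in for real labelled texts.
texts = ["cheap flights book now", "deep learning engineer wanted",
         "win a free prize today", "senior python developer position"]
labels = [0, 1, 0, 1]  # 0 = spam-like, 1 = job ad (illustrative labels)

# TF-IDF bag-of-words features followed by a logistic regression classifier.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
clf.fit(texts, labels)
print(clf.predict(["machine learning job opening"]))
```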
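
For week03, a minimal bigram language model with add-alpha smoothing, written in plain Python as a sketch of the counting approach; function and variable names are illustrative:

```python
from collections import Counter

def train_bigram_lm(sentences, alpha=1.0):
    """Return P(word | prev) estimated from tokenized sentences with add-alpha smoothing."""
    unigrams, bigrams = Counter(), Counter()
    vocab = {"</s>"}
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens[:-1])                  # counts of contexts
        bigrams.update(zip(tokens[:-1], tokens[1:]))  # counts of (prev, word) pairs
        vocab.update(sent)

    def prob(prev, word):
        return (bigrams[(prev, word)] + alpha) / (unigrams[prev] + alpha * len(vocab))

    return prob

prob = train_bigram_lm([["the", "cat", "sat"], ["the", "dog", "sat"]])
print(prob("the", "cat"), prob("the", "fish"))
```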
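
For week04, a minimal NumPy sketch of scaled dot-product attention, the building block behind the Transformer's self-attention; shapes and names are illustrative:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v) -> (n_q, d_v)."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])         # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                              # weighted average of values

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(2, 4)), rng.normal(size=(3, 4)), rng.normal(size=(3, 5))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 5)
```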
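
For week05, a minimal forward-algorithm sketch computing the likelihood of an observation sequence under a hidden Markov model. It omits the scaling needed for long sequences, and all matrices are toy values:

```python
import numpy as np

def hmm_likelihood(obs, pi, A, B):
    """P(observations) under an HMM.
    obs: observation indices; pi: (K,) initial distribution;
    A: (K, K) transitions, A[i, j] = P(j | i); B: (K, V) emissions, B[i, o] = P(o | i)."""
    alpha = pi * B[:, obs[0]]          # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # alpha_t(j) = sum_i alpha_{t-1}(i) * A[i, j] * b_j(o_t)
    return alpha.sum()

pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.5], [0.1, 0.9]])
print(hmm_likelihood([0, 1, 1], pi, A, B))
```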
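
For week06, a minimal EM sketch of IBM Model 1 word-alignment probabilities. It omits the NULL source word and any convergence check, and the toy parallel corpus is purely illustrative:

```python
from collections import defaultdict

def ibm_model1(corpus, n_iter=10):
    """corpus: list of (source_tokens, target_tokens); returns t[(f, e)] = P(f | e)."""
    src_vocab = {e for src, _ in corpus for e in src}
    tgt_vocab = {f for _, tgt in corpus for f in tgt}
    t = {(f, e): 1.0 / len(tgt_vocab) for f in tgt_vocab for e in src_vocab}  # uniform init
    for _ in range(n_iter):
        count, total = defaultdict(float), defaultdict(float)
        for src, tgt in corpus:
            for f in tgt:
                z = sum(t[(f, e)] for e in src)  # normalizer over possible alignments
                for e in src:
                    delta = t[(f, e)] / z        # E-step: expected alignment counts
                    count[(f, e)] += delta
                    total[e] += delta
        for (f, e) in t:                         # M-step: re-estimate P(f | e)
            if total[e] > 0:
                t[(f, e)] = count[(f, e)] / total[e]
    return t

corpus = [(["das", "haus"], ["the", "house"]), (["das", "buch"], ["the", "book"])]
t = ibm_model1(corpus)
print(round(t[("the", "das")], 3), round(t[("house", "haus")], 3))
```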

Contributors & course staff

Course materials and teaching performed by the YSDA course staff.


License: MIT License


Languages

  • Jupyter Notebook: 80.6%
  • Python: 12.7%
  • HTML: 6.2%
  • Dockerfile: 0.5%
  • Shell: 0.0%