bakarov / dissemantics

A course on distributional semantics at Novosibirsk State University (NSU), Spring 2019

Home Page: https://t.me/disemantics

Description

Distributional semantics is one of the most important notions in contemporary computational linguistics and NLP: representations of meaning built on the distributional semantics framework are used in almost every NLP system. A deep understanding of distributional semantics and of the models based on it is therefore crucial for tackling cutting-edge NLP tasks. The main goal of this course is to give the listener such an understanding.

It is important to understand that this course is not about NLP and/or semantics in general. Many aspects crucial for an NLP (or semantics) course will be omitted, and this course is not recommended for a listener looking for a good introductory NLP course. Instead, it gives an exhaustive (and unique) introduction to a much narrower NLP field called distributional semantics, in the context of modern NLP techniques and linguistic theories.
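
As a minimal, course-independent sketch of the underlying idea (not taken from the course materials; the toy corpus and all names are purely illustrative), the snippet below builds count-based co-occurrence vectors over a tiny corpus and compares words with cosine similarity — the kind of model covered in the early lectures:

```python
from collections import Counter, defaultdict
from math import sqrt

# Toy corpus: each word is represented by counts of the words
# that co-occur with it inside a small context window.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]
window = 2

# Build co-occurrence vectors: word -> Counter of context words.
cooc = defaultdict(Counter)
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                cooc[word][tokens[j]] += 1

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (Counters)."""
    dot = sum(u[k] * v[k] for k in u)
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

# Words occurring in similar contexts ("cat"/"dog") get similar vectors.
print(cosine(cooc["cat"], cooc["dog"]))  # relatively high
print(cosine(cooc["cat"], cooc["mat"]))  # lower
```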

Syllabus

  1. Introduction (22.02)

  2. Count-based Distributional Models (02.03)

  3. Topic Modeling (09.03)

  4. Prediction-based Distributional Models (16.03)

  5. Lexical-level and Morphological-level Extensions of Word2Vec (23.03)

  6. Evaluation of Distributional Semantic Models (30.03)

  7. Multi-sense Word Embeddings (06.04)

  8. From Words to Phrases, Sentences and Documents (13.04)

  9. Cross-language Word Embeddings (20.04)

  10. Recent Trends in Distributional Semantics (27.04)

  11. Compositional Distributional Semantics (04.05)

  12. Contextualized Word Embeddings (11.05)

  13. Bridging Distributional Semantics and Neuroscience (18.05)

  14. Conclusion (25.05)

Languages

Language: Jupyter Notebook 100.0%