Distributional semantics is one of the most important notions in contemporary computational linguistics and NLP: representations of meaning based on the distributional semantics framework are used in almost every NLP system. A deep understanding of distributional semantics and the models built on it is therefore crucial for tackling cutting-edge NLP tasks. The main goal of this course is to give the listener such an understanding.
It is important to understand that this course is not about NLP or semantics in general. Many aspects crucial to an NLP (or semantics) course will be omitted, and this course is not recommended for a listener seeking a good introductory NLP course. Instead, it gives an exhaustive (and unique) introduction to a much narrower NLP field called distributional semantics, in the context of modern NLP techniques and linguistic theories.
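To make the core idea concrete before the schedule: under the distributional hypothesis, words that appear in similar contexts receive similar vector representations. Below is a minimal, self-contained sketch (the toy corpus and all names are invented for illustration; the course covers such count-based models in depth) that builds sentence-level co-occurrence counts and compares words by cosine similarity:

```python
# Toy illustration of the distributional hypothesis: words occurring in
# similar contexts get similar count vectors. Corpus is invented.
from collections import Counter
import math

corpus = [
    "the cat drinks milk",
    "the dog drinks water",
    "the cat chases the dog",
    "the dog chases the cat",
]

# Build word-by-word co-occurrence counts within each sentence.
cooc = {}
for sentence in corpus:
    tokens = sentence.split()
    for i, w in enumerate(tokens):
        for j, c in enumerate(tokens):
            if i != j:
                cooc.setdefault(w, Counter())[c] += 1

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in set(u) & set(v))
    norm = lambda x: math.sqrt(sum(n * n for n in x.values()))
    return dot / (norm(u) * norm(v))

# "cat" and "dog" share most of their contexts (the, drinks, chases),
# so they come out more similar to each other than "cat" is to "milk".
print(cosine(cooc["cat"], cooc["dog"]))   # ≈ 0.857
print(cosine(cooc["cat"], cooc["milk"]))  # ≈ 0.585
```

Real models replace raw counts with weighting schemes (e.g. PMI) and dimensionality reduction, or learn the vectors directly, as covered in the lectures below.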
- Introduction (22.02)
- Count-based Distributional Models (02.03)
- Topic Modeling (09.03)
- Prediction-based Distributional Models (16.03)
- Lexical-level and Morphological-level Extensions of Word2Vec (23.03)
- Evaluation of Distributional Semantic Models (30.03)
- Multi-sense Word Embeddings (06.04)
- From Words to Phrases, Sentences and Documents (13.04)
- Cross-language Word Embeddings (20.04)
- Recent Trends in Distributional Semantics (27.04)
- Compositional Distributional Semantics (04.05)
- Contextualized Word Embeddings (11.05)
- Bridging Distributional Semantics and Neuroscience (18.05)
- Conclusion (25.05)