### Word2Vec

The Word2Vec model is a simple word embedding neural network, developed by Mikolov et al. (2013). Such continuous word embedding representations have been shown to carry semantic meaning and to be useful in various NLP tasks.
In this notebook, I have attempted to implement the three language models described in Le & Mikolov (2014)'s paper *Distributed Representations of Sentences and Documents*.
The implementations don't use any NLP libraries and consist of the simplest form of each algorithm, with little optimization. The aim of this notebook is simply to gain:
- an understanding of the language models' algorithms
- intuition about word embedding representations
- an understanding of the inner workings of neural networks
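To make the idea concrete, here is a minimal, self-contained sketch of skip-gram-style Word2Vec training (one-hot input → embedding → full softmax) on a toy corpus. This is a hypothetical illustration written for this introduction, not the notebook's actual implementation; the corpus, hyperparameters, and variable names are all assumptions.

```python
import numpy as np

# Toy corpus and vocabulary (hypothetical example, not from the notebook)
rng = np.random.default_rng(0)
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8          # vocabulary size, embedding dimension

W_in = rng.normal(scale=0.1, size=(V, D))   # input (embedding) matrix
W_out = rng.normal(scale=0.1, size=(D, V))  # output (context) matrix

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# (center, context) training pairs with a window size of 1
pairs = [(word2idx[corpus[i]], word2idx[corpus[j]])
         for i in range(len(corpus))
         for j in (i - 1, i + 1) if 0 <= j < len(corpus)]

lr = 0.1
losses = []
for epoch in range(50):
    loss = 0.0
    for c, o in pairs:
        h = W_in[c]                    # hidden layer = embedding of center word
        p = softmax(h @ W_out)         # predicted context-word distribution
        loss -= np.log(p[o])           # cross-entropy against the true context
        grad = p.copy()
        grad[o] -= 1.0                 # dL/dz for softmax + cross-entropy
        dh = W_out @ grad              # gradient w.r.t. the embedding
        W_out -= lr * np.outer(h, grad)
        W_in[c] -= lr * dh
    losses.append(loss / len(pairs))

# After training, each row of W_in is a learned word vector, e.g.:
fox_vector = W_in[word2idx["fox"]]
```

Even on this tiny corpus, the average loss falls over the epochs, and the rows of `W_in` become the learned word embeddings.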
The main notebook can be viewed here: Word2Vec Notebook