MarvinBertin / Word2Vec

NLP - word embedding neural network

Word2Vec Models for NLP

### The Word2Vec model is a simple word embedding neural network, developed by Mikolov et al. (2013)

Such continuous word embedding representations have been shown to carry semantic meaning and are useful in a variety of NLP tasks.
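
As a quick illustration of what "carrying semantic meaning" can look like in practice, here is a minimal NumPy sketch (not taken from the notebook) of the classic word-analogy test, where relations such as king - man + woman ≈ queen are recovered with cosine similarity:

```python
# Minimal sketch (illustrative, not the notebook's code): probing semantic
# relations in a trained embedding matrix with cosine similarity.
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def most_similar(query_vec, embeddings, vocab, exclude=()):
    """Return the vocabulary word whose embedding is closest to query_vec."""
    best_word, best_score = None, -np.inf
    for word, vec in zip(vocab, embeddings):
        if word in exclude:
            continue
        score = cosine_similarity(query_vec, vec)
        if score > best_score:
            best_word, best_score = word, score
    return best_word

# Hypothetical usage, assuming a trained matrix `embeddings` of shape
# (vocab_size, dim) and a parallel word list `vocab`:
#   analogy = (embeddings[vocab.index("king")]
#              - embeddings[vocab.index("man")]
#              + embeddings[vocab.index("woman")])
#   most_similar(analogy, embeddings, vocab, exclude={"king", "man", "woman"})
```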

In this notebook, I have attempted to implement the three language models described in Le & Mikolov's (2014) paper *Distributed Representations of Sentences and Documents*.

The implementations don't make use of any NLP libraries and consist of the simplest form of each algorithm, with little optimization (a bare-bones sketch in that spirit follows the list below). The aim of this notebook is simply to gain:

  • an understanding of the language models' algorithms
  • intuition about word embedding representations
  • an understanding of the inner workings of neural networks
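
For reference, the sketch below shows the kind of bare-bones implementation this describes: a single skip-gram training step with a full softmax, written in plain NumPy. It is an illustrative assumption, not the notebook's exact code, and the names and hyperparameters (learning rate, vocabulary size, embedding dimension) are arbitrary:

```python
# Minimal sketch (assumption, not the notebook's exact code) of one skip-gram
# training step with a full softmax, using plain NumPy and no NLP libraries.
import numpy as np

def skipgram_step(W_in, W_out, center_idx, context_idx, lr=0.05):
    """One gradient step: predict a context word from a center word.

    W_in  -- input embedding matrix, shape (vocab_size, dim)
    W_out -- output weight matrix,   shape (vocab_size, dim)
    """
    h = W_in[center_idx]                   # hidden layer = center word's embedding
    scores = W_out @ h                     # unnormalized score for every vocab word
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                   # softmax over the vocabulary
    loss = -np.log(probs[context_idx])     # negative log-likelihood of true context

    grad_scores = probs.copy()
    grad_scores[context_idx] -= 1.0        # d loss / d scores
    grad_W_out = np.outer(grad_scores, h)  # d loss / d W_out
    grad_h = W_out.T @ grad_scores         # d loss / d h

    W_out -= lr * grad_W_out               # update both weight matrices in place
    W_in[center_idx] -= lr * grad_h
    return loss

# Toy usage: random initialization, then repeated calls on (center, context)
# index pairs drawn from a corpus.
rng = np.random.default_rng(0)
vocab_size, dim = 10, 4
W_in = rng.normal(scale=0.1, size=(vocab_size, dim))
W_out = rng.normal(scale=0.1, size=(vocab_size, dim))
print(skipgram_step(W_in, W_out, center_idx=2, context_idx=5))
```

A full softmax over the vocabulary is kept here for clarity; the original papers use hierarchical softmax or negative sampling to make this step tractable on realistic vocabulary sizes.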

The main notebook can be viewed here: Word2Vec Notebook

Languages

Jupyter Notebook: 60.5% · Python: 39.5%