jroznerski / NLP_word_embeddings

In this exercise, I will add the use of word embeddings, i.e. vector representations of words, to the experiments. I will use the pretrained models Word2vec and GloVe. Word2vec learns from co-occurrences of words within a local context window, one window at a time, while GloVe is trained on global word co-occurrence statistics aggregated over the entire corpus.



Languages

Language:Jupyter Notebook 100.0%