Updating global variables without locking
Google Code Exporter commented
Some of the global variables are updated in the function TrainModelThread
without any locking. Here is one of them:
word_count_actual += word_count - last_word_count;
Can someone please explain how this works?
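For context, here is a minimal, self-contained sketch of the pattern I mean (my own illustration, not the actual word2vec.c code): several threads do a plain read-modify-write on a shared counter with no lock, so two threads can race and some increments get lost. My guess is that the race is tolerated because the counter only needs to be approximately right, but I would like confirmation.

/* Illustration only: a shared counter updated by several threads
 * without any lock, as in the line quoted above.  The unsynchronized
 * read-modify-write can lose increments when threads race, so the
 * final value is only approximately correct. */
#include <pthread.h>
#include <stdio.h>

#define NUM_THREADS 4

long long word_count_actual = 0;   /* shared, updated without locking */

void *TrainModelThread(void *id) {
  (void)id;
  long long word_count = 0, last_word_count = 0;
  for (long long i = 0; i < 1000000; i++) {
    word_count++;                          /* pretend we processed a word */
    if (word_count - last_word_count > 10000) {
      /* unsynchronized update: a race here only skews the count */
      word_count_actual += word_count - last_word_count;
      last_word_count = word_count;
    }
  }
  word_count_actual += word_count - last_word_count;
  return NULL;
}

int main(void) {
  pthread_t pt[NUM_THREADS];
  for (long a = 0; a < NUM_THREADS; a++)
    pthread_create(&pt[a], NULL, TrainModelThread, (void *)a);
  for (long a = 0; a < NUM_THREADS; a++) pthread_join(pt[a], NULL);
  /* Usually close to NUM_THREADS * 1000000, but it may fall short. */
  printf("word_count_actual = %lld\n", word_count_actual);
  return 0;
}

(Compile with gcc -pthread.) Is this kind of lost update considered harmless here, or am I missing some synchronization?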
Also, can someone please help me with the following question:
For negative sampling, why do we need a unigram table to draw the negative
samples from? Why can't we just pick words uniformly at random from the
vocabulary?
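To make the question concrete, here is a toy sketch of what I understand the table to do (hypothetical names and toy counts, assuming the usual count^0.75 weighting; not the actual word2vec.c code): each word gets a block of table slots proportional to count^0.75, so drawing a random slot picks frequent words more often than a uniform draw over the vocabulary would, but less often than sampling by raw frequency.

/* Toy sketch of a unigram table (hypothetical names, not word2vec.c).
 * Each word owns a block of slots proportional to count^0.75;
 * picking a random slot then samples words with that weighting. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define TABLE_SIZE 1000000
#define VOCAB_SIZE 4

static int table[TABLE_SIZE];
static const long long counts[VOCAB_SIZE] = {1000, 100, 10, 1};  /* toy corpus counts */

static void InitUnigramTable(void) {
  double total = 0.0, cumulative;
  int word = 0;
  for (int i = 0; i < VOCAB_SIZE; i++) total += pow((double)counts[i], 0.75);
  cumulative = pow((double)counts[0], 0.75) / total;
  for (int slot = 0; slot < TABLE_SIZE; slot++) {
    table[slot] = word;                     /* current word owns this slot */
    if ((double)slot / TABLE_SIZE > cumulative && word < VOCAB_SIZE - 1) {
      word++;                               /* move on to the next word's block */
      cumulative += pow((double)counts[word], 0.75) / total;
    }
  }
}

int main(void) {
  int hits[VOCAB_SIZE] = {0};
  InitUnigramTable();
  srand(1);
  for (int i = 0; i < 100000; i++) hits[table[rand() % TABLE_SIZE]]++;
  /* Frequent words are drawn much more often, but rare words still appear. */
  for (int w = 0; w < VOCAB_SIZE; w++) printf("word %d: %d draws\n", w, hits[w]);
  return 0;
}

(Compile with gcc -lm.) Running this, word 0 gets far more draws than word 3, whereas a uniform draw over the vocabulary would give every word the same chance. So my question is really whether that frequency weighting is the whole point of the table.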
Sincerely,
Vishal
Original issue reported on code.google.com by vahu...@gmail.com
on 21 Jul 2015 at 6:26