nfmcclure / tensorflow_cookbook

Code for the TensorFlow Machine Learning Cookbook

Home Page: https://www.packtpub.com/big-data-and-business-intelligence/tensorflow-machine-learning-cookbook-second-edition


05_Working_With_CBOW_Embeddings: loss is always nan

wuyongdec opened this issue · comments

When I run the demo, the output is:

print('Loss at step {} : {}'.format(i+1, loss_val))

Loading Data
Normalizing Text Data
Creating Dictionary
Creating Model
Starting Training
Loss at step 100 : nan
Loss at step 200 : nan
Loss at step 300 : nan
Loss at step 400 : nan
Loss at step 500 : nan
Loss at step 600 : nan
Loss at step 700 : nan
Loss at step 800 : nan
Loss at step 900 : nan
Loss at step 1000 : nan
commented

Thanks for bringing this to my attention. I'm currently working on updating this chapter this week. Hopefully a fix will come out soon.

commented

I'm about to push a fix. I'm not sure why this happens, but lowering the learning rate to 0.25 seems to make it work.

Change the learning rate:
model_learning_rate = 0.25

You may also consider adding seeds for reproducibility:
tf.set_random_seed(42)

np.random.seed(42)

I got it to run that way. Look for the code update later today.
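For intuition, here is a minimal toy sketch (plain Python, not the cookbook's CBOW code) of the failure mode: with too large a learning rate, gradient descent diverges, the weights overflow to inf, and the update produces inf - inf = NaN, after which every subsequent loss is NaN. A smaller rate converges.

```python
import math

# Toy illustration only: gradient descent on L(w) = w**2.
# With an over-large learning rate the iterates grow without bound,
# overflow to inf, and the update w - lr*grad becomes inf - inf = NaN --
# the same symptom as the NaN losses reported above.
def train(learning_rate, steps=5000, w0=1.0):
    w = w0
    for _ in range(steps):
        grad = 2.0 * w                 # dL/dw for L(w) = w**2
        w = w - learning_rate * grad   # gradient-descent update
    return w

print(math.isnan(train(1.1)))   # True: rate too high, weights blow up to NaN
print(abs(train(0.25)) < 1e-9)  # True: lowered rate converges toward w = 0
```

The real model has a more complex loss surface, but the mechanism is the same: once any weight or loss term overflows, NaN propagates through every later step, which is why the printed loss is NaN from step 100 onward.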