InvalidArgumentError: Found Inf or NaN gradient (global norm).
CodeXiaoLingYun opened this issue · comments
No, I am seeing the same error. I also used the same function (tf.clip_by_global_norm), but I found that the learning rate and the clipping function are not the root cause. When I generated the vocab I set its size to 4682, and vocab_size is also 4682 in train.py. Likewise, I do not know whether decreasing the batch size would help.
I saw an answer suggesting it might be related to vanishing/exploding gradients, but I have no method to fix it.
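For reference, here is a minimal NumPy sketch of the computation that `tf.clip_by_global_norm` performs (the function names and structure here are illustrative, not TensorFlow's actual implementation). It shows why clipping alone cannot fix this error: clipping rescales gradients whose global norm is large but finite, whereas an Inf or NaN anywhere in the gradients makes the global norm itself non-finite, which is exactly the condition the error message reports.

```python
import numpy as np

def clip_by_global_norm(grads, clip_norm):
    """Illustrative sketch of global-norm gradient clipping."""
    # Global norm = sqrt of the sum of squared entries across ALL gradients.
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))

    # If any gradient contains Inf or NaN, the global norm is non-finite
    # and clipping cannot repair it -- this is the reported error case.
    if not np.isfinite(global_norm):
        raise ValueError("Found Inf or NaN gradient (global norm).")

    # Scale all gradients down uniformly when the norm exceeds clip_norm;
    # leave them unchanged otherwise.
    scale = clip_norm / max(global_norm, clip_norm)
    return [g * scale for g in grads], global_norm

# Finite gradients: norm 5.0 is clipped down to clip_norm 1.0.
clipped, norm = clip_by_global_norm([np.array([3.0, 4.0])], clip_norm=1.0)
print(norm)                        # 5.0
print(np.linalg.norm(clipped[0]))  # 1.0
```

So if this error fires even with clipping enabled, the Inf/NaN is already present in the gradients before clipping, which points back to the loss computation or an exploding-gradient step rather than the clipping call itself.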
Have you solved this problem?