martin-gorner / tensorflow-mnist-tutorial

Sample code for "Tensorflow and deep learning, without a PhD" presentation and code lab.


Python 2 compatibility

martin-gorner opened this issue

Fix the division issue in the learning rate decay computation on Python 2.
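
For context, a minimal sketch of the pitfall, assuming the decay is computed with plain Python arithmetic (the variable names and values below are illustrative, not taken from the repo): on Python 2, / between two integers truncates, so the decay exponent can collapse to zero.

from __future__ import division  # one fix: make / behave as true division, as in Python 3
import math

lrmax, lrmin, decay_speed = 0.003, 0.0001, 2000  # illustrative values
i = 100  # current training step
# On Python 2 without the import above, i / decay_speed is integer division and
# truncates to 0 whenever i < decay_speed, so the learning rate never decays.
# An equivalent fix is to force a float: math.exp(-float(i) / decay_speed)
learning_rate = lrmin + (lrmax - lrmin) * math.exp(-i / decay_speed)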

Another possible solution would be to delegate the decay rate computation to TensorFlow:

# step counter, incremented by minimize() through the global_step argument
global_step = tf.Variable(0, trainable=False)
# decay the rate from 0.003 by a factor of 0.97 every 100 steps
learning_rate = tf.train.exponential_decay(0.003, global_step, 100, 0.97, staircase=True)
train_op = tf.train.AdamOptimizer(learning_rate).minimize(loss, global_step=global_step)
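
For reference, with staircase=True the op above multiplies the initial rate by 0.97 once every 100 steps, which you can check by hand:

# staircase schedule: decayed_rate = 0.003 * 0.97 ** (step // 100)
for step in (0, 100, 500, 1000):
    print(step, 0.003 * 0.97 ** (step // 100))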

Some results from the training phase (3.0 convolutional):

Epoch: 0001 - cost = 0.317358517 - learning rate = 0.002576
Epoch: 0002 - cost = 0.068440056 - learning rate = 0.002212
Epoch: 0003 - cost = 0.044457262 - learning rate = 0.001843
Epoch: 0004 - cost = 0.030760335 - learning rate = 0.001582
Epoch: 0005 - cost = 0.021672835 - learning rate = 0.001318
Epoch: 0006 - cost = 0.014874582 - learning rate = 0.001132
Epoch: 0007 - cost = 0.012011148 - learning rate = 0.000943
Epoch: 0008 - cost = 0.006284746 - learning rate = 0.000810
Epoch: 0009 - cost = 0.004536705 - learning rate = 0.000674
Epoch: 0010 - cost = 0.003143254 - learning rate = 0.000579
Epoch: 0011 - cost = 0.001789587 - learning rate = 0.000482
Epoch: 0012 - cost = 0.001161188 - learning rate = 0.000414
Epoch: 0013 - cost = 0.000715020 - learning rate = 0.000345
Epoch: 0014 - cost = 0.000526037 - learning rate = 0.000296
Epoch: 0015 - cost = 0.000266390 - learning rate = 0.000247
Epoch: 0016 - cost = 0.000154369 - learning rate = 0.000212
Epoch: 0017 - cost = 0.000115868 - learning rate = 0.000177

The only thing you can't reproduce (I mean, I didn't find how) is having a minimum learning rate.

I didn't find how [] to have a minimum learning rate

Hmm, I hear the ancients knew of a secret way. The knowledge is lost now; only an arcane symbol reached us through time, its meaning shrouded in mystery: +

As in:
learning_rate_with_min = min + tf.train.exponential_decay(...)

;-)
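
Spelled out, a sketch only (min_learning_rate and its value are illustrative, not from the repo):

import tensorflow as tf

min_learning_rate = 0.0001  # illustrative floor for the learning rate
global_step = tf.Variable(0, trainable=False)
# Subtracting the floor from the initial rate keeps the schedule starting at
# 0.003 and decaying toward min_learning_rate instead of toward zero.
learning_rate = min_learning_rate + tf.train.exponential_decay(
    0.003 - min_learning_rate, global_step, 100, 0.97, staircase=True)
# learning_rate then plugs into the same AdamOptimizer call shown above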

I was so focused on the API signature and searching the docs that I forgot the basic arithmetic... Good catch :)

Python 2 compatibility tested. Everything OK