How about tf.contrib.layers.batch_norm ?
huangh12 opened this issue · comments
Hello,
Thanks for your tutorial, it helped me a lot in clarifying the concept and usage of batch norm in TF. However, as you said,
In the present tf.layers API (TF1.3), there is no one-line syntax for a dense layer with batch norm and relu.
Have you ever considered tf.contrib.layers.batch_norm? It seems that this API can do everything in one call. But I am puzzled that not many people seem to use it...
What I meant is that you have to write two lines if you want a neural network layer with batch norm. I usually make a helper function and then it becomes just one line in my model.