nitishsrivastava / deepnet

Implementation of some deep learning algorithms.

bug using dropout with eigenmat

jormansa opened this issue · comments

There is a bug in the function "r4_uni" in the file "ziggurat.cc", at line 388:

```diff
- value = fmod ( 0.5 + ( float ) ( jsr_input + *jsr ) / 65536.0 / 65536.0, 1.0 );
+ value = (float) fmod ( 0.5 + ( double ) ( jsr_input + *jsr ) / 65536.0 / 65536.0, 1.0 );
```

The cast must be to double; otherwise "value" is always set to 0.5.
This bug affects the Python function "fill_with_rand", which is used in "neuralnet.py" at line 194 when dropout is enabled.