Leaky ReLU: Different defaults for alpha between darknet and TF
sjain-stanford opened this issue · comments
Defaults for leaky ReLU alpha:
- Darknet uses 0.1
- TensorFlow uses 0.2
DW2TF currently doesn't specify alpha, so the two would mismatch.
A PR is on the way for this fix and a few other enhancements.
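A minimal numpy sketch of the mismatch (a pure-Python stand-in for the Darknet and `tf.nn.leaky_relu` implementations, not the converter's actual code path):

```python
import numpy as np

def leaky_relu(x, alpha):
    # Leaky ReLU: pass positives through, scale negatives by alpha
    return np.where(x > 0, x, alpha * x)

x = np.array([-1.0, 0.5])
darknet_out = leaky_relu(x, alpha=0.1)  # Darknet default
tf_out = leaky_relu(x, alpha=0.2)       # tf.nn.leaky_relu default
print(darknet_out)  # [-0.1  0.5]
print(tf_out)       # [-0.2  0.5]
```

Same input, different negative-side slope, so converted weights would produce different activations unless alpha is set explicitly on the TF side.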
Same goes for BatchNorm's epsilon parameter:
- Darknet uses 1e-5
- TensorFlow uses 1e-3 by default
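The epsilon mismatch can also be seen with a small inference-time batch-norm sketch (again a numpy stand-in, not the actual converter code); the effect is largest when a channel's variance is near zero:

```python
import numpy as np

def batch_norm(x, mean, var, eps):
    # Inference-time batch norm: normalize with the given epsilon
    return (x - mean) / np.sqrt(var + eps)

x = np.array([1.0])
# With near-zero variance, the epsilon choice dominates the output scale
out_darknet = batch_norm(x, mean=0.0, var=0.0, eps=1e-5)  # Darknet default
out_tf = batch_norm(x, mean=0.0, var=0.0, eps=1e-3)       # TF default
print(out_darknet)  # ~316.2
print(out_tf)       # ~31.6
```

For well-conditioned channels the difference is tiny, but near-constant channels (common in quantized or pruned models) can diverge by an order of magnitude, so the converter should carry Darknet's 1e-5 through.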
Feel free to close after merging #4. Thanks.