example1 not enough training.
chriscamacho opened this issue · comments
With a value of 300 in the training loop I see this output:
Output for [0, 0] is 0.
Output for [0, 1] is 1.
Output for [1, 0] is 1.
Output for [1, 1] is 1.
Changing the loop to 350 gives:
Output for [0, 0] is 0.
Output for [0, 1] is 1.
Output for [1, 0] is 1.
Output for [1, 1] is 0.
Was this done on purpose to show some kind of limitation of backpropagation?
No. It's randomized, so it takes more loops sometimes. I'll increase it.
Actually, it's possible you got stuck in a local minimum. If you change it back to 300 iterations and run the program several times, how often does it work?
In 10 runs I only saw it get the right result 2 times (at 300 iterations).
Ok, thanks for the feedback. I'll look into it and tune things a bit when I have some free time.
Just as a point of interest: when it does "fail", the outputs for cases 2, 3, and 4 are always similar, around 0.6:
Output for [0, 0] is 0.07 (0).
Output for [0, 1] is 0.64 (1).
Output for [1, 0] is 0.65 (1).
Output for [1, 1] is 0.62 (1).
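The failure mode above, where three cases all converge near 0.6, is a classic XOR local minimum: the hidden units collapse onto similar features and the network settles on "output 1 whenever any input is 1". A minimal sketch of the experiment (not the repository's actual code; the 2-2-1 architecture, learning rate, and weight ranges are assumptions) that lets you measure how often a given iteration count succeeds across random restarts:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(iterations, rng, lr=0.5, hidden=2):
    """Train a tiny 2-hidden-unit MLP on XOR; return the final outputs."""
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    # Random initial weights -- each restart can land in a different minimum.
    W1 = rng.uniform(-1, 1, (2, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.uniform(-1, 1, (hidden, 1))
    b2 = np.zeros(1)
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    for _ in range(iterations):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass (squared error, sigmoid derivative s * (1 - s)).
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)
    return out

def success_rate(iterations, runs=10, seed=0):
    """Fraction of random restarts whose rounded outputs match XOR."""
    rng = np.random.default_rng(seed)
    target = np.array([[0], [1], [1], [0]])
    hits = sum(
        int(np.array_equal(np.round(train_xor(iterations, rng)), target))
        for _ in range(runs)
    )
    return hits / runs

if __name__ == "__main__":
    for n in (300, 350, 5000):
        print(f"{n} iterations: success rate {success_rate(n):.0%}")
```

Counting successes over many seeds, rather than eyeballing one run, separates "not enough iterations" (rate climbs as the loop count grows) from "stuck in a local minimum" (rate plateaus below 100% no matter how long you train).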