gverdian / neurolab

Automatically exported from code.google.com/p/neurolab

Linear Activation Leads to NaN minmax

GoogleCodeExporter opened this issue

A slightly modified version of the standard feed-forward example:

import numpy as np
import neurolab as nl

# Training data: one input column and one target column
x = np.linspace(-7, 7, 20)
y = np.sin(x) * .5
size = len(x)
inp = x.reshape(size, 1)
tar = y.reshape(size, 1)

# Both layers use the linear transfer function PureLin
net = nl.net.newff([[-7, 7]], [5, 1], transf=[nl.net.trans.PureLin()]*2)

This leads to an infinite minmax in core.py:

self.init()   # line 97, minmax = [[-inf inf]]

which in turn breaks the Nguyen-Widrow initialization in init.py, lines 129/130:

x = 2. / (minmax[:, 1] - minmax[:, 0])
y = 1. - minmax[:, 1] * x

With minmax = [-inf, inf], the difference in the first line is inf, so x becomes 0.; the product inf * 0. in the second line then produces the NaN from the title.
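The failure in those two lines can be reproduced with plain numpy, independent of neurolab (a minimal sketch):

import numpy as np

minmax = np.array([[-np.inf, np.inf]])
x = 2. / (minmax[:, 1] - minmax[:, 0])  # 2. / inf  ->  0.
y = 1. - minmax[:, 1] * x               # inf * 0.  ->  nan (with a RuntimeWarning)
print(x, y)                             # [0.] [nan]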

Original issue reported on code.google.com by MLotst...@gmail.com on 9 Jun 2014 at 10:00

Fixed in trunk. I replaced the -inf/inf bounds with -1/1, but Nguyen-Widrow may not be the best method for initializing layers with a linear activation function.
Thanks for your report.
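For illustration, a minimal sketch of that replacement (the helper name clamp_minmax and its placement are hypothetical; the actual fix lives in neurolab's initialization code):

import numpy as np

def clamp_minmax(minmax):
    # Hypothetical helper: substitute finite -1/1 bounds for -inf/inf
    # so the Nguyen-Widrow formulas quoted above stay finite.
    minmax = np.asarray(minmax, dtype=float).copy()
    minmax[np.isneginf(minmax)] = -1.0
    minmax[np.isposinf(minmax)] = 1.0
    return minmax

# clamp_minmax([[-np.inf, np.inf]]) returns array([[-1., 1.]])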

Original comment by zue...@gmail.com on 10 Jun 2014 at 6:06


  • Changed state: Fixed