gverdian / neurolab

Automatically exported from code.google.com/p/neurolab

competitive transfer function produces incorrect output

GoogleCodeExporter opened this issue

What steps will reproduce the problem?
1. Import neurolab as nl and create a simple vector n with three values, two of 
them < 0 and one > 0, e.g. n = (-0.5763, 0.8345, -0.1234)
2. Let f = nl.trans.Competitive()
3. a = f(n)

What is the expected output? What do you see instead?
I expect [0, 1, 0]; instead I see [1, 0, 0].
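
A minimal repro sketch (Python 2, matching the later snippets in this thread):

import neurolab as nl

n = [-0.5763, 0.8345, -0.1234]
f = nl.trans.Competitive()
print f(n)   # 0.1.0 prints [ 1.  0.  0.]: the '1' marks the minimum, not the maximum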

What version of the product are you using? On what operating system?
Version 0.1.0 on Ubuntu 11.04

Please provide any additional information below.


Original issue reported on code.google.com by ch...@trusc.net on 5 Aug 2011 at 7:21

See nl.trans.Competitive.__doc__:
    :Returns:
        y : ndarray
            may take the following values: 0, 1
            '1' if is a MINIMAL element of x, else '0'
    :Example:
        >>> f = Competitive()
        >>> f([-5, -0.1, 0, 0.1, 100])
        array([ 1.,  0.,  0.,  0.,  0.])

Original comment by zue...@gmail.com on 6 Aug 2011 at 7:36

Original comment by zue...@gmail.com on 7 Aug 2011 at 10:35

  • Changed state: Invalid
The MATLAB compet function returns '1' in the position of the maximal 
element of x.

Regards,

Chris de Villiers

Original comment by ch...@trusc.net on 8 Aug 2011 at 6:23

Thanks for your interest in neurolab.

Original comment by zue...@gmail.com on 8 Aug 2011 at 12:40

I find neurolab very useful. It was a trivial matter to change the code of the 
Competitive function so that it now gives the same output as the MATLAB compet 
function.
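
A minimal sketch of what such a max-based variant could look like (an
illustration only, not the code actually committed):

import numpy as np

def compet(x):
    # '1' at the position of the MAXIMAL element of x, as in MATLAB's compet
    x = np.asarray(x, dtype=float)
    y = np.zeros_like(x)
    y[np.argmax(x)] = 1.0
    return y

print compet([-0.5763, 0.8345, -0.1234])   # [ 0.  1.  0.]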

Original comment by ch...@trusc.net on 10 Aug 2011 at 6:05

Hello Zuev

Could you give me some idea of how to use neurolab to duplicate the 
pattern recognition demo in MATLAB, where the network is created as 
net = newff(alphabet, targets, 10, {'logsig','logsig'}), with alphabet a 
35 x 26 array and targets a 26 x 26 array? How would you set this up in 
neurolab?

Regards,

Chris de Villiers

Original comment by ch...@trusc.net on 23 Aug 2011 at 12:09

Hello Chris

I used an older version of the NNT (4.0.2 (R13)); the API may have changed since then.

If I understand you correctly, you need something like this:

import neurolab as nl
import numpy as np

# example patterns: 26 letters with 35 points on each letter
i = np.random.rand(26, 35)
t = np.random.rand(26, 26)
# create a network with 2 layers,
# 35 inputs (5*7) and 26 outputs
net = nl.net.newff([[0,1]]*35, [10, 26], [nl.trans.LogSig()]*2)
net.trainf = nl.train.train_bfgs

net.train(i, t, show=10, epochs=100)
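
A quick sanity check after training could run the whole pattern set back
through the net (an addition here, using the same sim call that comes up
later in this thread):

out = net.sim(i)                # shape (26, 26): one output row per input pattern
print np.argmax(out, axis=1)    # index of the most active output for each letter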

Original comment by zue...@gmail.com on 23 Aug 2011 at 2:23

Thanks for the feedback, I'll try it.

Original comment by ch...@trusc.net on 24 Aug 2011 at 9:12

Hello Zuev

I got it to work, but I can't simulate with only one letter. It would 
appear that I need to simulate with the entire alphabet; I get an error 
when I try to input one letter (a 35x1 vector). I expect the output to be 
a 26x1 vector with a 1 in the letter position and 0 everywhere else. Am I 
doing something wrong? This is possible in MATLAB.

Original comment by ch...@trusc.net on 24 Aug 2011 at 12:59

letter = np.empty(35)    # one 35-point pattern (fill with real data)

print net.sim([letter])  # sim expects a 2-D array: one row per sample
# or
print net.step(letter)   # step takes a single 1-D sample
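
To read off the recognized letter from that output, something like this
should work (a sketch assuming the 26 outputs are ordered A..Z, as in the
MATLAB demo):

out = net.sim([letter])        # shape (1, 26)
idx = int(np.argmax(out[0]))   # most active output neuron wins
print chr(ord('A') + idx)      # map index 0..25 to a letter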

Original comment by zue...@gmail.com on 24 Aug 2011 at 1:35

Thanks. I didn't realize I had to call sim([x]) instead of sim(x).

Any idea why I get an exp overflow warning with a large number of epochs 
(e.g. epochs=200)?
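
For reference, LogSig evaluates 1/(1 + exp(-x)), so the warning likely
just means some weighted sums grew large in magnitude during training;
numpy flags the overflow even though the result saturates harmlessly:

import numpy as np

x = np.float64(-1000.0)
print 1.0 / (1.0 + np.exp(-x))   # RuntimeWarning: overflow in exp; prints 0.0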

Original comment by ch...@trusc.net on 25 Aug 2011 at 6:09

I don't know. To help you I would need runnable code.

I have created a group: http://groups.google.com/group/py-neurolab
Please ask your questions there.

Original comment by zue...@gmail.com on 25 Aug 2011 at 5:25