ivan-vasilev / neuralnetworks

java deep learning algorithms and deep neural networks with gpu acceleration


Strange behavior when calculating Layers (probably Aparapi related)

wandgibaut opened this issue · comments

Hi!
First of all, I have no experience with GPU processing, and my personal computer doesn't even have a GPU.
I'm using your package inside a Cognitive Architecture project, and at some point it involves calculating some inputs in a loop. There I call a self-made method that uses some lines from `propagateForward`.

The method is:
```java
public void calculate(Matrix input) {
    Set<Layer> calculatedLayers = new UniqueList<Layer>();
    calculatedLayers.add(mlp.getInputLayer());
    activations.addValues(mlp.getInputLayer(), input);
    mlp.getLayerCalculator().calculate(mlp, mlp.getOutputLayer(), calculatedLayers, activations);
}
```
And I call it with `TrainingInputProvider.getNextInput().getInput()` as the parameter.

The problem is that after the first iteration (which seems to run without any issue), this `calculate` method throws an error:

Exception in thread "Thread-7" java.lang.UnsatisfiedLinkError: com.amd.aparapi.KernelRunner.runKernelJNI(JLcom/amd/aparapi/Range;ZI)I

and then the thread is gone.
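Since the thread just vanishes, it may help to make the failure visible. This is a minimal, library-independent sketch (the `UnsatisfiedLinkError` is simulated, not a real Aparapi call) showing how an `UncaughtExceptionHandler` can report an `Error` that would otherwise kill a worker thread silently:

```java
public class ThreadErrorDemo {

    // Runs a worker thread that dies from an UnsatisfiedLinkError and
    // returns the message captured by the uncaught-exception handler.
    static String runWorker() throws InterruptedException {
        final StringBuilder report = new StringBuilder();

        Thread worker = new Thread(() -> {
            // Simulates the native-link failure from the stack trace above.
            throw new UnsatisfiedLinkError("simulated Aparapi JNI failure");
        }, "calc-worker");

        // Without a handler, the Error is printed once (or swallowed by a
        // thread pool) and the thread simply disappears.
        worker.setUncaughtExceptionHandler((t, e) ->
            report.append("Thread ").append(t.getName())
                  .append(" died: ").append(e));

        worker.start();
        worker.join();  // join() guarantees the handler's write is visible
        return report.toString();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runWorker());
    }
}
```

Wrapping the real `calculate` call this way would at least show on which iteration and in which thread the JNI call starts failing.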

I feel that I'm doing something wrong, since I'd expect it either to work through the whole loop or not to work at all.

Can you help me with this?

On the lab computer I was able to run the Aparapi samples perfectly, but I didn't try them on my PC.
And yes, the problem occurs on that setup; otherwise, it runs fine without the GPU.
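One way to rule out the GPU/JNI path entirely: the `com.amd.aparapi` builds (the package shown in the stack trace) honor an execution-mode system property, so the kernels can be forced onto the Java thread pool instead of OpenCL. A sketch, assuming that build of Aparapi and a placeholder classpath/main class:

```shell
# Force Aparapi to skip OpenCL/JNI and run kernels on the Java thread pool (JTP).
# <your-classpath> and your.Main are placeholders for the actual setup.
java -Dcom.amd.aparapi.executionMode=JTP -cp <your-classpath> your.Main
```

If the loop runs cleanly in JTP mode, that would point at the native OpenCL/JNI layer rather than the calling code.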

Thanks for your help, Gary!