Strange behavior when calculating Layers (probably Aparapi related) #50
A few questions.

Are you able to run any of the sample Aparapi apps on this machine?

It is not clear from this whether your code is GPU-ish ;) Specifically, the code above is full of Java'isms that I would never expect to run on a GPU.

At the core of your code I assume you have a 'kernel' which you expect to run. Can you set the mode to 'JTP' or even 'SEQ' to see if the code runs on its own, sans GPU/OpenCL?

Do you know that OpenCL will run on this machine?

Gary
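For reference, a minimal sketch of forcing Aparapi into JTP (Java thread pool) or SEQ (sequential) mode via `setExecutionMode`, assuming the `com.amd.aparapi` package used in this project. The kernel body and class name here are hypothetical examples for illustration, not the poster's code:

```java
import com.amd.aparapi.Kernel;
import com.amd.aparapi.Range;

public class ModeCheck {
    public static void main(String[] args) {
        final float[] in = new float[1024];
        final float[] out = new float[1024];

        Kernel kernel = new Kernel() {
            @Override
            public void run() {
                int i = getGlobalId();
                out[i] = in[i] * 2f; // trivial hypothetical workload
            }
        };

        // Force pure-Java execution, bypassing OpenCL/JNI entirely.
        // If this runs but GPU mode does not, the problem is in the
        // native OpenCL layer rather than in the kernel logic.
        kernel.setExecutionMode(Kernel.EXECUTION_MODE.JTP); // or EXECUTION_MODE.SEQ
        kernel.execute(Range.create(in.length));
        kernel.dispose();
    }
}
```

The mode can also be set globally, without touching code, via the JVM property `-Dcom.amd.aparapi.executionMode=JTP`.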
…On Mon, Jun 19, 2017 at 6:26 PM, Wandemberg Gibaut wrote:

Hi!

First of all, I have no experience with GPU processing and my personal computer doesn't even have one.

I'm using your package inside a Cognitive Architecture project, and at some point it involves calculating some inputs in a loop. There I call a self-made method that uses some lines from "propagateForward".

The method is:

    public void calculate(Matrix input) {
        Set<Layer> calculatedLayers = new UniqueList<Layer>();
        calculatedLayers.add(mlp.getInputLayer());
        activations.addValues(mlp.getInputLayer(), input);
        mlp.getLayerCalculator().calculate(mlp, mlp.getOutputLayer(),
                calculatedLayers, activations);
    }

And I call it with "TrainingInputProvider.getNextInput().getInput()" as the parameter.

The problem is that after the first iteration (which seems to run without any issue) this "calculate" method throws an error:

    Exception in thread "Thread-7" java.lang.UnsatisfiedLinkError:
    com.amd.aparapi.KernelRunner.runKernelJNI(JLcom/amd/aparapi/Range;ZI)I

and then the Thread is gone.

I feel that I'm doing something wrong, as I think it should either work through the whole loop or not work at all.

Can you help me with this?
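As a general JNI note: an `UnsatisfiedLinkError` on `runKernelJNI` usually means the JVM could not bind Aparapi's native OpenCL bridge library. A hedged sketch of how one might run with the native library directory made explicit; the paths and main class below are placeholders, not from this project:

```shell
# Put the directory containing Aparapi's native library
# (e.g. libaparapi_x86_64.so on Linux, aparapi_x86_64.dll on Windows)
# on the JVM's native library path, and fall back to JTP mode to rule
# out the OpenCL layer. Paths and class names are placeholders.
java -Djava.library.path=/path/to/aparapi/native \
     -Dcom.amd.aparapi.executionMode=JTP \
     -cp myapp.jar com.example.Main
```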
In the Lab Computer I was able to run the Aparapi samples perfectly, but I didn't try on my PC. Thanks for your help, Gary!