
Maximum size of the data #10

Closed
flefeb opened this issue Jun 6, 2016 · 6 comments

Comments

@flefeb

flefeb commented Jun 6, 2016

Hi,
I am trying to train a multilayer perceptron with one hidden layer: 256 neurons in the input layer, 25 in the hidden layer and 2 in the output layer.
The perform_training function crashes in the dot function (levenberg_marquardt_algorithm.cpp):
JacobianT_dot_Jacobian = terms_Jacobian.calculate_transpose().dot(terms_Jacobian);
because it tries to allocate a vector of 6477*6477 values (6477 is parameters_number, roughly the total number of connections in the network).
My question is: is it possible to train a network with 256 inputs using OpenNN? If yes, how should the parameters be set to avoid this crash?
Thank you
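
For context, the matrix that fails to allocate grows with the square of the parameter count, and the 6477 figure follows directly from the architecture. A minimal stand-alone sketch of the arithmetic (plain C++, no OpenNN calls; the byte count assumes double-precision entries):

```cpp
#include <cstddef>
#include <iostream>

int main()
{
    // Architecture from this issue: 256 inputs, 25 hidden neurons, 2 outputs.
    const std::size_t inputs = 256, hidden = 25, outputs = 2;

    // Each neuron has one bias, so the parameter count is
    // (inputs + 1) * hidden + (hidden + 1) * outputs.
    const std::size_t parameters = (inputs + 1) * hidden + (hidden + 1) * outputs; // 6477

    // Levenberg-Marquardt builds JacobianT_dot_Jacobian, a parameters x parameters
    // matrix, on top of the Jacobian itself.
    const double jtj_megabytes =
        static_cast<double>(parameters) * parameters * sizeof(double) / (1024.0 * 1024.0);

    std::cout << "parameters_number = " << parameters << "\n"
              << "J^T * J           ~ " << jtj_megabytes << " MB\n";
    return 0;
}
```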

@FernandoGomezP
Contributor

The problem is the amount of memory on your computer. You can try the quasi-Newton method; it needs less memory and is faster for a problem like yours.
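
A rough sketch of how the training method could be switched, assuming the OpenNN 2.x API of this period (class and method names such as PerformanceFunctional and set_main_type are recalled from that version and may differ in other releases; the data file name is just a placeholder):

```cpp
#include "opennn.h"

using namespace OpenNN;

int main()
{
    // Load the data set (file name is a placeholder).
    DataSet data_set;
    data_set.set_data_file_name("data.dat");
    data_set.load_data();

    // 256 inputs, 25 hidden neurons, 2 outputs.
    NeuralNetwork neural_network(256, 25, 2);

    PerformanceFunctional performance_functional(&neural_network, &data_set);

    TrainingStrategy training_strategy(&performance_functional);

    // Use quasi-Newton instead of Levenberg-Marquardt, which allocates the
    // parameters x parameters JacobianT_dot_Jacobian matrix.
    training_strategy.set_main_type(TrainingStrategy::QUASI_NEWTON_METHOD);

    training_strategy.perform_training();

    return 0;
}
```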

@flefeb
Author

flefeb commented Jun 9, 2016

Thank you for your answer.

My computer has 16 GB of RAM, which I suppose should be enough. I tried on another computer but got the same problem, even using the quasi-Newton method.


@FernandoGomezP
Contributor

It is enough; we have loaded data sets with a greater number of variables, so perhaps the problem is the number of instances. OpenNN loads the entire data set into memory, which may not leave enough memory for the elements used during training.

What is the size of the data file? OpenNN has been tested on a computer with the same amount of RAM, and we were able to load a data file of 3 GB.

@flefeb
Author

flefeb commented Jun 9, 2016

The output is a 2D position. For the tests, I used only 4 different positions, with 40 samples of 256 inputs each per position.
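
For reference, a data set of that size is tiny in memory terms; a quick back-of-the-envelope check (plain C++, assuming one double per value and ignoring any per-element overhead):

```cpp
#include <cstddef>
#include <iostream>

int main()
{
    // Data described in this comment: 4 positions x 40 samples per position,
    // each sample having 256 inputs and a 2D target position.
    const std::size_t rows = 4 * 40;        // 160 instances
    const std::size_t columns = 256 + 2;    // 258 variables per instance
    const double kilobytes =
        static_cast<double>(rows) * columns * sizeof(double) / 1024.0;

    std::cout << rows * columns << " values, roughly "
              << kilobytes << " KB as doubles\n";
    return 0;
}
```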


@FernandoGomezP
Contributor

FernandoGomezP commented Jun 15, 2016

It is a simple data set and OpenNN should be able to load it. If you want, send me that data set and we will test it on our computer. My email is fernandogomez@artelnics.com.

@flefeb
Author

flefeb commented Jun 21, 2016

I tried on another computer and the training step works, despite some error messages in debug mode. I will keep using that computer and will ask you again if I run into other problems.

Thank you

