
Endless repeat 'no more nodes left to remove!' #32

Closed
momire opened this issue Jun 24, 2017 · 4 comments
Comments

@momire commented Jun 24, 2017

Hello. I used the evolve method to test the MNIST task. I followed the tutorials in the evolution section and this article: https://blog.webkid.io/neural-networks-in-javascript/.

var mnist = require('mnist');
var neataptic = require('neataptic');

var set = mnist.set(700, 20); // 700 training samples, 20 test samples
var trainset = set.training;
var testset = set.test;

var Network = neataptic.Network;
var evolve = neataptic.evolve;

var mynetwork = new Network(784, 10); // 784 inputs (28x28 pixels), 10 outputs
mynetwork.evolve(trainset, {});

The results were not good. I waited 20 hours for the training to complete, but the only output was the message 'No more nodes left to remove!', repeated endlessly.

No more nodes left to remove!
No more connections to be made!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
No more nodes left to remove!
...

So I then tried the train method instead of the evolve method. Training completed in just one minute, and the test results were also good.

I can't figure out what's wrong. The same code worked well for the XOR example in the evolution tutorial. The only differences are a slightly larger dataset and a few more nodes, and even then the train method handled it fine.

(Oh, while I was writing this question, the result of my second attempt came in:

"CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory."

So that one crashed. lol)

There must be something I'm missing about the evolve method, but I don't know what.

@alancnet (Contributor)

I agree warnings should be disabled by default, but there is a simple way to disable them:

require('neataptic').Config.warnings = false
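
For example, here is where that line would go in the script above (a minimal sketch; Config is the object shown in the one-liner above, the rest just mirrors the original script):

var neataptic = require('neataptic');

// silence Neataptic's mutation warnings ('No more nodes left to remove!', etc.)
neataptic.Config.warnings = false;

var network = new neataptic.Network(784, 10);
// ... evolve as before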

@wagenaartje (Owner) commented Jun 26, 2017

@momire the MNIST dataset is quite big (and a little more complex). Calling the evolve() function won't necessarily converge, and it's very time-expensive because the networks are very big (~800 nodes). For me, one generation with 60 genomes takes about 20 seconds. (PS: I'm working on speeding up evolution drastically using web workers.)

However, it is definitely possible to evolve a network on the MNIST dataset; it just requires various options. I'll be posting an example soon, but in the meantime, use the option { log: 1 } and you will see the network improving.

@alancnet I think I might add an extra option that disables warnings for functions that weren't called directly by the user. So warnings stay enabled when the user calls mutate(), but not when it happens from inside the evolve() function.

@wagenaartje (Owner) commented Jun 26, 2017

So here's a small example of training options that seems to work quite well for me (but still takes very long):

var mnist = require('mnist');
var neataptic = require('neataptic');

var set = mnist.set(700,20);
var trainingSet = set.training;
var testset = set.test;

var network = new neataptic.Network(784,10);

// start with very small weights so 784 inputs don't immediately saturate the sigmoids
for(var i = 0; i < network.connections.length; i++){
  network.connections[i].weight = Math.random() * .02 - .01;
}

var results = network.evolve(trainingSet, {
  iterations: 150, // just so it stops after a while
  elitism: 5, // 10%
  mutationRate: 0.7, // fairly high, yes
  equal: true,
  growth: 0, // this can create very large networks, you can also remove this 
  log: 1 // see how well it's doing
});

As you might have noticed, I use a custom weight initialisation. Neataptic doesn't take into account that 784 incoming connections cause any connected (sigmoid) neuron to activate at ~1 with the default weight initialisation. I already have a cleaner weight initialisation process in my head, which I'm coding now.
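
For illustration, one common way to keep the initial pre-activations small regardless of how many inputs a neuron has is to scale the random weights by the fan-in. This is only a sketch of that idea, not Neataptic's built-in initialisation; the initWeights helper below is made up for the example:

// Hypothetical helper: scale initial weights by the square root of the fan-in,
// so a neuron with 784 incoming connections still starts near the linear
// region of the sigmoid instead of saturating at 1.
function initWeights(network, fanIn) {
  for (var i = 0; i < network.connections.length; i++) {
    network.connections[i].weight = (Math.random() * 2 - 1) / Math.sqrt(fanIn);
  }
}

initWeights(network, 784); // same spirit as the loop above, but scaled to the fan-in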

If you need more help with evolving the MNIST set let me know.

PS: I think the MNIST dataset works better with a combined backprop + evolve technique, which I might incorporate in the future.

@momire (Author) commented Jun 27, 2017

I tried your advice and it worked!
Thanks for your help.
(Unfortunately, web workers are beyond my skill level for now.)
