Endless repeat 'no more nodes left to remove!' #32
I agree warnings should be disabled by default, but there is a simple way to disable them: require('neataptic').Config.warnings = false
@momire the MNIST dataset is quite a big (and a little more complex) dataset. However, it is definitely possible to evolve a network to work on the MNIST dataset; it just requires the use of various options. I'll be posting an example soon.

@alancnet I think I might create an extra option that disables warnings in functions that weren't called by the user, so warnings stay enabled when the user calls those functions directly.
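The extra option proposed above doesn't exist yet, so the shape below is purely an assumption, not Neataptic's actual API. A minimal plain-JavaScript sketch of the idea (warnings suppressed for internal calls, kept for direct user calls; the `Config` object, `warn` helper, and `internal` flag are all hypothetical):

```javascript
// Hypothetical sketch: warnings fire only for user-initiated calls,
// not for calls made internally (e.g. during evolution).
var Config = { warnings: true };

var warnings = [];
function warn(message, internal) {
  // Suppress warnings triggered internally, or when disabled globally.
  if (Config.warnings && !internal) {
    warnings.push(message);
  }
}

function removeNode(internal) {
  // Pretend there is nothing left to remove.
  warn('no more nodes left to remove!', internal);
}

removeNode(true);  // internal call during evolve: silent
removeNode(false); // direct user call: warning recorded
```

With this split, `evolve` could pass `internal = true` everywhere and the console would stay quiet unless the user triggers the condition themselves.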
So here is a small example of training options which seems to work quite well for me (but still takes super long):

```js
var mnist = require('mnist');
var neataptic = require('neataptic');

var set = mnist.set(700, 20);
var trainingSet = set.training;
var testSet = set.test;

var network = new neataptic.Network(784, 10);

// custom weight initialisation: small weights in [-0.01, 0.01]
for (var i = 0; i < network.connections.length; i++) {
  network.connections[i].weight = Math.random() * .02 - .01;
}

var results = network.evolve(trainingSet, {
  iterations: 150,   // just so it stops after a while
  elitism: 5,        // 10%
  mutationRate: 0.7, // fairly high, yes
  equal: true,
  growth: 0,         // this can create very large networks; you can also remove this
  log: 1             // see how well it's doing
});
```

As you might have noticed, I use a custom weight initialisation algorithm. Neataptic doesn't take into account that 784 connections cause any (sigmoid-)connected neuron to activate at 1 with the default weight initialisation. I already have a clean weight initialisation process in my head, which I'm coding now. If you need more help with evolving on the MNIST set, let me know.

PS: I think the MNIST dataset works better with a backprop + evolve technique, which I might incorporate in the future.
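The "clean weight initialisation" mentioned above is not shown in the thread, so the following is an assumption about what such a fix commonly looks like: scale the uniform range by the fan-in, so that a sigmoid neuron with 784 inputs doesn't start out saturated at 1. The `initWeight` helper is hypothetical, not Neataptic code:

```javascript
// Sketch of fan-in-scaled initialisation (an assumption, not Neataptic's
// actual algorithm): draw weights uniformly from [-1/sqrt(fanIn), 1/sqrt(fanIn)].
function initWeight(fanIn) {
  var limit = 1 / Math.sqrt(fanIn);
  return Math.random() * 2 * limit - limit;
}

// With 784 inputs the limit is 1/28 ≈ 0.036, so the weighted sum feeding a
// sigmoid neuron stays in a range where the activation is not stuck at 1.
var fanIn = 784;
var weights = [];
for (var i = 0; i < fanIn; i++) {
  weights.push(initWeight(fanIn));
}
```

The hand-rolled `Math.random() * .02 - .01` line in the example above is doing the same job with a fixed range; scaling by fan-in just makes the range adapt to the layer size.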
I tried your advice and it worked!
Hello. I used the evolve method to test the MNIST task. I referenced the tutorials in the evolution section and the article at https://blog.webkid.io/neural-networks-in-javascript/.
The results were not good. I waited 20 hours for the training of the network to complete, but the message 'no more nodes left to remove!' just repeated endlessly.
So this time I tried the train method instead of the evolve method. The training then completed in just one minute, and the test results were also good.
I can't tell what's wrong. The same code worked well in the XOR example from the evolution tutorial. The only difference is that the amount of data and the number of nodes are slightly larger. Even that worked fine with the train method.
(Oh, while I was writing this question, the results of the second attempt came out. It is dead. lol)
Obviously there is something I missed about the evolve method, but I don't know what I don't know.
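The speed gap described above (20 hours of evolution vs. one minute of train) comes down to backprop following the error gradient instead of searching by random mutation. As a self-contained illustration of the gradient step that backprop-style training performs, here is a minimal plain-JavaScript sketch, not Neataptic's actual train implementation; the single-weight neuron and all names are illustrative only:

```javascript
// One sigmoid neuron with a single weight, trained by gradient descent
// to map input 1 -> target 1 (bias omitted for brevity).
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

var weight = -2; // deliberately bad starting point
var rate = 1;
var data = [{ input: 1, target: 1 }];

for (var epoch = 0; epoch < 1000; epoch++) {
  for (var i = 0; i < data.length; i++) {
    var x = data[i].input;
    var out = sigmoid(weight * x);
    var err = data[i].target - out;
    // Gradient of the squared error w.r.t. the weight for a sigmoid unit.
    weight += rate * err * out * (1 - out) * x;
  }
}
// After training, sigmoid(weight) has moved close to the target 1.
```

Each update moves the weight directly downhill on the error surface, whereas evolution has to stumble onto an improving mutation, which is why the gap widens so dramatically as the network grows to MNIST size.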