
Backpropagation #1

Open · t3db0t opened this issue Jul 1, 2010 · 5 comments

t3db0t (Owner) commented Jul 1, 2010

Support for multi-layer perceptrons via backpropagation.

Pranavgulati commented

Please review github.com/pranavgulati/neuralDuino. I have implemented backpropagation, and the library itself supports 'n' layers with 'm' nodes per layer (until a RAM overflow ;-P). But that didn't help solve the XOR problem. Care to contribute?
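For scale, here is a rough SRAM estimate for a fully connected 'n'-layer, 'm'-node network on an ATmega328-class board (2 KB of SRAM, 4-byte floats on AVR). This is a back-of-the-envelope sketch that counts only weights and biases; any real library adds per-node bookkeeping on top:

```cpp
// Back-of-the-envelope SRAM cost of a fully connected net with n layers
// of m nodes each: (n-1)*m*m weights plus n*m biases, 4 bytes per float.
// Ignores per-object overhead, activations, and the call stack.
unsigned int bytesNeeded(unsigned int n, unsigned int m) {
  unsigned int weights = (n - 1) * m * m;
  unsigned int biases  = n * m;
  return 4 * (weights + biases);
}
// bytesNeeded(4, 10) = 4 * (300 + 40) = 1360 bytes, already about two
// thirds of an Uno's 2048-byte SRAM.
```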

t3db0t (Owner, Author) commented Jan 9, 2017

You have to implement a specific architecture to train XOR; it's not just fully connected layer-by-layer: http://www.mind.ilstu.edu/curriculum/artificial_neural_net/xor_problem_and_solution.php

Is there any reason you felt you needed to create a new library rather than extend this one?
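One classic hand-wired solution decomposes XOR as (x1 OR x2) AND NOT (x1 AND x2) using fixed-weight threshold units. A minimal sketch of that construction (hand-set weights, nothing learned; not necessarily the exact network from the linked page):

```cpp
#include <cstdio>

// Step-function neuron: fires when the weighted sum reaches the threshold.
int unit(int x1, int x2, int w1, int w2, int threshold) {
  return (x1 * w1 + x2 * w2) >= threshold;
}

// XOR = (x1 OR x2) AND NOT (x1 AND x2), wired as 2 hidden units + 1 output.
int xorNet(int x1, int x2) {
  int h1 = unit(x1, x2, 1, 1, 1);  // OR:  fires if either input is 1
  int h2 = unit(x1, x2, 1, 1, 2);  // AND: fires only if both inputs are 1
  return unit(h1, h2, 1, -1, 1);   // h1 AND NOT h2
}

int main() {
  for (int a = 0; a <= 1; a++)
    for (int b = 0; b <= 1; b++)
      std::printf("%d XOR %d = %d\n", a, b, xorNet(a, b));
  return 0;
}
```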

Pranavgulati commented

I'll try the architecture given in the link. Until now I was trying 2 input nodes fully connected to 2 nodes in the hidden layer, which were in turn connected to one output node. This did not work, as you point out, but I have gone through so many videos and links that manage to train XOR with this exact architecture (http://playground.tensorflow.org/).

The link you posted has a different architecture, and I hope that it works. Thanks :-)
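For what it's worth, a plain 2-2-1 sigmoid network with biases can learn XOR with vanilla backpropagation, though it can stall in a local minimum depending on the random initialization. A minimal self-contained sketch in standard C++ (not tied to either library; the learning rate and epoch count are arbitrary choices):

```cpp
#include <cmath>
#include <cstdio>
#include <cstdlib>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }
double randWeight() { return (std::rand() / (double)RAND_MAX) * 2.0 - 1.0; }

int main() {
  const double X[4][2] = {{0,0},{0,1},{1,0},{1,1}};
  const double T[4]    = { 0,    1,    1,    0  };
  const double lr = 0.5;

  // Parameters: input->hidden weights, hidden biases,
  // hidden->output weights, output bias.
  double wih[2][2], bh[2], who[2], bo;
  std::srand(7);
  for (int i = 0; i < 2; i++) {
    bh[i]  = randWeight();
    who[i] = randWeight();
    for (int j = 0; j < 2; j++) wih[j][i] = randWeight();
  }
  bo = randWeight();

  for (int epoch = 0; epoch < 20000; epoch++) {
    for (int s = 0; s < 4; s++) {
      // Forward pass.
      double h[2];
      for (int i = 0; i < 2; i++)
        h[i] = sigmoid(X[s][0] * wih[0][i] + X[s][1] * wih[1][i] + bh[i]);
      double o = sigmoid(h[0] * who[0] + h[1] * who[1] + bo);

      // Backward pass: squared-error delta rule, sigmoid derivative y*(1-y).
      double dO = (T[s] - o) * o * (1.0 - o);
      double dH[2];
      for (int i = 0; i < 2; i++)
        dH[i] = dO * who[i] * h[i] * (1.0 - h[i]);

      // Gradient-descent weight updates.
      for (int i = 0; i < 2; i++) {
        who[i] += lr * dO * h[i];
        bh[i]  += lr * dH[i];
        for (int j = 0; j < 2; j++) wih[j][i] += lr * dH[i] * X[s][j];
      }
      bo += lr * dO;
    }
  }

  for (int s = 0; s < 4; s++) {
    double h[2];
    for (int i = 0; i < 2; i++)
      h[i] = sigmoid(X[s][0] * wih[0][i] + X[s][1] * wih[1][i] + bh[i]);
    double o = sigmoid(h[0] * who[0] + h[1] * who[1] + bo);
    std::printf("%.0f XOR %.0f -> %.3f\n", X[s][0], X[s][1], o);
  }
  return 0;
}
```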

Pranavgulati commented

The reason I had to make a new library instead of extending this one is that this library manages nodes as layers and is limited to a two-layer perceptron network, whereas the library I created leaves layer management to the user. In effect, the user can design any kind of neural network with any number of nodes and layers, and can connect any node to any other node intuitively; there is no concept of a layer unless the user wants one. I also implemented backpropagation, which was a reported issue for this library.
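To illustrate the design difference, here is a minimal hypothetical sketch of a node-centric API (an illustration of the idea, not neuralDuino's actual interface): each node owns its own input connections, so any topology can be wired up without a layer object anywhere.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical node-centric design: a node knows its inputs, not its layer.
struct Node {
  std::vector<Node*>  inputs;   // arbitrary fan-in from any other nodes
  std::vector<double> weights;  // one weight per input connection
  double bias = 0.0;
  double output = 0.0;

  void connect(Node* from, double w) {
    inputs.push_back(from);
    weights.push_back(w);
  }

  void activate() {
    double sum = bias;
    for (std::size_t i = 0; i < inputs.size(); i++)
      sum += inputs[i]->output * weights[i];
    output = 1.0 / (1.0 + std::exp(-sum));  // sigmoid
  }
};

// Usage: wire any topology, e.g. inputs feeding both a hidden node and
// the output directly (a skip connection), with no layer object anywhere:
//   Node in1, in2, hidden, out;
//   hidden.connect(&in1, 0.5); hidden.connect(&in2, 0.5);
//   out.connect(&hidden, 1.0); out.connect(&in1, -0.3);  // skip connection
```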

t3db0t (Owner, Author) commented Jan 9, 2017

Sure, makes sense :)
