
What is NN structure? #1

Closed
JBMing opened this issue Apr 3, 2016 · 5 comments

Comments

JBMing commented Apr 3, 2016

I didn't find any information about the neural network, such as the number of layers and the number of neurons in each layer. Could you add that information to the README?

tambetm (Owner) commented Apr 3, 2016

Currently the NN has only fully connected layers. The sizes of the hidden layers are passed to the NNAgent constructor as the opts.layers parameter. For example, opts.layers = [1000]; creates one hidden layer with 1000 nodes, while opts.layers = [256 256]; creates two hidden layers with 256 nodes each. The README seems outdated, but example.m should be OK. I will fix it ASAP.
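The configuration described above might look like this in practice (a minimal sketch: only the opts.layers field is confirmed by this thread, and the exact NNAgent constructor signature is assumed — see example.m for the authoritative usage):

```matlab
% Sketch of configuring the hidden layers, per the comment above.
opts = struct();
opts.layers = [256 256];  % two fully connected hidden layers, 256 nodes each
agent = NNAgent(opts);    % constructor call assumed; check example.m
```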

JBMing (Author) commented Apr 20, 2016

Do you mean there are only fully connected layers and no convolutional layers?
Besides, I am confused by the README and don't know how to execute the code.
Do I need to install any other environment?

tambetm (Owner) commented Apr 20, 2016

You can have several fully connected layers by including several numbers in the list, e.g. opts.layers = [256 256]. But there are no convolutional layers.

I would suggest checking out the example code in https://github.com/tambetm/matlab2048/blob/master/example.m, which should work.

You need to check out DeepLearnToolbox from GitHub as suggested in the README. No other packages are needed.
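The setup described above might look roughly like this (a sketch; the directory layout is illustrative, and the repository URLs come from this thread and the project's GitHub page):

```matlab
% Sketch of the setup implied above. First clone both repositories
% (from a shell):
%   git clone https://github.com/tambetm/matlab2048.git
%   git clone https://github.com/tambetm/DeepLearnToolbox.git
% Then, in MATLAB, put DeepLearnToolbox on the path and run the example:
addpath(genpath('DeepLearnToolbox'));  % recursively add all toolbox subfolders
cd matlab2048
example                                % runs example.m
```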

JBMing (Author) commented Apr 22, 2016

I am trying to modify the weight-update algorithm, but I have a question about the code.
Where is the code that implements backpropagation and gradient descent?
Could you tell me?

tambetm (Owner) commented Apr 24, 2016

Backpropagation and gradient descent are implemented in DeepLearnToolbox. I'm using my own fork, which has some minor modifications that are not relevant to this project, AFAIK. Backpropagation is implemented across NN/nnff.m, NN/nnbp.m, and NN/nnapplygrads.m.
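One training step chains those three files in sequence (a sketch based on DeepLearnToolbox's nntrain loop; verify the exact signatures against the fork used here):

```matlab
% Sketch of one gradient-descent step across the three files named above:
nn = nnff(nn, x, y);    % forward pass: compute activations and loss for batch (x, y)
nn = nnbp(nn);          % backpropagation: gradients of the loss w.r.t. the weights
nn = nnapplygrads(nn);  % apply the weight updates (learning rate, momentum, decay)
```

To change the weight-update rule, nnapplygrads.m is therefore the place to start; the gradients it consumes are produced by nnbp.m.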

@tambetm tambetm closed this as completed Apr 24, 2016