iterative gradient descent using Theano #9
Conversation
PEP8, please use the Python 3 print function, and it seems not to work:
Oops, I accidentally commented out line 21.
Did I violate PEP8 again? Dammit! This time I was using Sublime Text 2, and I will read the whole thing again!
Copy your code to http://pep8online.com/ - then you will see that there are plenty of PEP8 violations.
Could you please explain your code? It does not seem to implement gradient descent. For example, where do you calculate the gradient? The code seems to perform only a few matrix operations (additions and dot products). What are your neurons? What represents the hidden layers? By the way, have you seen https://github.com/MartinThoma/nntoolkit/blob/master/nntoolkit/train.py ?
The image and the code look rather as if you implemented linear regression.
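For reference, the quantity the question above is about: for the mean squared error cost of a linear model, J(w) = mean((Xw - y)²), the gradient is (2/N) Xᵀ(Xw - y), and an iterative gradient descent has to evaluate it at every step. A minimal NumPy sketch with made-up toy data (not the code under review):

```python
import numpy as np

# Hypothetical toy data: 3 samples, 2 columns (bias column + x value).
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
w = np.zeros(2)  # initial weights

# For J(w) = mean((Xw - y)^2), the gradient is dJ/dw = (2/N) * X^T (Xw - y).
# This is what a gradient descent loop must compute each iteration; a
# sequence of additions and dot products alone does not make it GD.
grad = (2.0 / len(y)) * X.T.dot(X.dot(w) - y)
```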
Oops, in issue 4 (#4) you were talking about "batch gradient descent", so I thought you were talking about http://en.wikipedia.org/wiki/Stochastic_gradient_descent. Probably I have mistaken something then :/
There are three variants of gradient descent: batch, stochastic, and mini-batch.
I want to implement mini-batch gradient descent, as you can change a single parameter (the batch size) and get both other variants. Your implementation does not seem to be generalizable to fit higher-order functions (higher than linear) to data. Am I wrong?
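A minimal NumPy sketch of that idea (hypothetical function and parameter names): a single `batch_size` parameter selects between the three variants — 1 gives stochastic, the full data set gives batch, and anything in between gives mini-batch gradient descent:

```python
import numpy as np

def minibatch_gd(X, y, batch_size, lr=0.1, epochs=200, seed=0):
    """Mini-batch gradient descent for linear least squares.

    batch_size=1        -> stochastic gradient descent
    batch_size=len(y)   -> (full) batch gradient descent
    anything in between -> mini-batch gradient descent
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        order = rng.permutation(n)  # reshuffle the samples every epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            # Gradient of mean((Xb w - yb)^2) on the current batch only.
            grad = (2.0 / len(yb)) * Xb.T.dot(Xb.dot(w) - yb)
            w -= lr * grad
    return w
```

On a noiseless linear data set all three settings share the same fixed point; on noisy data, smaller batches trade accuracy per step for cheaper steps.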
Alright, I'll leave it here. I think I messed up :P
I mean, it is regression via gradient descent, while you were looking for classification [for NNs]. Alright, I also need to read about neural networks; I will get back after that.
Implemented Iterative Gradient Descent [2D] using Theano, tested against sklearn; the plots look good.
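One way such a test can be sketched (a NumPy stand-in, since the Theano code itself is not shown here: the closed-form least-squares solution plays the role of sklearn's `LinearRegression`, and the data, learning rate, and iteration count are made up):

```python
import numpy as np

# Synthetic 2D data: y = 3x + 2 plus a little Gaussian noise.
rng = np.random.default_rng(42)
x = rng.uniform(0.0, 10.0, 50)
y = 3.0 * x + 2.0 + rng.normal(0.0, 0.5, 50)
X = np.column_stack([np.ones_like(x), x])  # bias column + feature

# Iterative (full-batch) gradient descent on the mean squared error.
w = np.zeros(2)
for _ in range(5000):
    w -= 0.01 * (2.0 / len(y)) * X.T.dot(X.dot(w) - y)

# Reference: closed-form least squares, the same fit sklearn's
# LinearRegression would produce on this data.
w_closed, *_ = np.linalg.lstsq(X, y, rcond=None)
```

If the iterative weights agree with the closed-form ones (and the recovered slope sits near 3), the fitted line plotted over the scatter should indeed look good.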