
Add gradient output functionality in Neural Nets #2840

Merged
merged 1 commit into from May 10, 2015

Conversation

@sanuj (Contributor) commented May 6, 2015

@lisitsyn Have a look. Currently I'm printing the max and mean gradients after each iteration. Something like: https://gist.github.com/sanuj/4994d1d8feccdf069400

@@ -309,6 +309,18 @@ bool CNeuralNetwork::train_gradient_descent(SGMatrix<float64_t> inputs,

float64_t e = compute_gradients(inputs_batch, targets_batch, gradients);


for (int32_t k=0; k<m_num_layers; k++)
@sanuj (Contributor, Author) commented:
@lisitsyn This for loop could be factored into a private member function if you want, since the same code is repeated in the L-BFGS path as well.

@sanuj sanuj mentioned this pull request May 6, 2015
@coveralls

Coverage Status

Changes unknown when pulling 9f55012 on sanuj:feature/grad_output into shogun-toolbox:develop.

SGVector<float64_t> layer_gradients = network->get_section(gradients, i);
if (layer_gradients.vlen > 0)
{
SG_SINFO("Layer %i, Max Gradient: %g, Mean Gradient: %g.\n", i,
@lisitsyn (Member) commented:

For convenience, can we output the layer name as well?

@sanuj (Contributor, Author) commented:

Alright, I'll do it.
On 10 May 2015 23:10, "Sergey Lisitsyn" notifications@github.com wrote:

In src/shogun/neuralnets/NeuralNetwork.cpp
#2840 (comment):

@@ -408,6 +420,19 @@ int CNeuralNetwork::lbfgs_progress(void* instance,
    int n, int k, int ls)
 {
    SG_SINFO("Epoch %i: Error = %f\n", k, fx);
+
+   CNeuralNetwork* network = static_cast<CNeuralNetwork*>(instance);
+   SGVector<float64_t> gradients((float64_t*)g, network->get_num_parameters(), false);
+   for (int32_t i=0; i<m_num_layers; i++)
+   {
+       SGVector<float64_t> layer_gradients = network->get_section(gradients, i);
+       if (layer_gradients.vlen > 0)
+       {
+           SG_SINFO("Layer %i, Max Gradient: %g, Mean Gradient: %g.\n", i,

For convenience, can we output the layer name as well?


Reply to this email directly or view it on GitHub
https://github.com/shogun-toolbox/shogun/pull/2840/files#r30004276.

@sanuj (Contributor, Author) commented May 10, 2015

@lisitsyn Done. Have a look.

@lisitsyn (Member) commented:
Looks good to merge. Thanks!

lisitsyn added a commit that referenced this pull request May 10, 2015
Add gradient output functionality in Neural Nets
@lisitsyn lisitsyn merged commit 99d40ce into shogun-toolbox:develop May 10, 2015
3 participants