
add learning rate multiplier to network's graph #6244

Closed
nic25 opened this issue Feb 19, 2018 · 5 comments

nic25 (Contributor) commented Feb 19, 2018

node_label = '"%s%s(%s)%skernel size: %d%sstride: %d%spad: %d"' %\

IMHO including information about learning rate multipliers could be helpful when finetuning a network. We could:

  • add text information
  • change the color saturation to reflect the learning rate multiplier, either "washing out" layers where the LR is set to 0 or scaling each node's color saturation depending on its learning rate
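The text option could extend the label-building line quoted above. A minimal sketch, using a hypothetical helper with plain values standing in for the protobuf fields Caffe's draw.py actually reads:

```python
# Hypothetical helper sketching the "add text information" option.
# In Caffe's draw.py the separator is '\\n' (a literal backslash-n,
# which Graphviz renders as a line break inside a node label).
def node_label(name, layer_type, kernel_size, stride, pad, lr_mults=None):
    """Build a Graphviz node label, optionally appending lr_mult info."""
    sep = '\\n'
    label = '"%s%s(%s)%skernel size: %d%sstride: %d%spad: %d' % \
        (name, sep, layer_type, sep, kernel_size, sep, stride, sep, pad)
    if lr_mults:
        # e.g. a conv layer has one multiplier for weights, one for bias
        label += '%slr_mult: %s' % (sep, '/'.join(str(m) for m in lr_mults))
    return label + '"'
```

Passing `lr_mults=None` reproduces the current label unchanged, so existing callers would be unaffected.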
Noiredd (Member) commented Feb 20, 2018

It might be useful, but in order to reduce clutter in the graph, I would recommend the lr_mult not be displayed by default - maybe it should be controlled via an optional argument?
As for the presentation, I don't think scaling the saturation is going to be a good idea from the informational point of view. Text will work much better; hiding the label for layers with zero learning rate could also be considered.

I'd be glad to review a PR with this feature, if you wanted to implement it.

nic25 (Contributor) commented Feb 22, 2018

Sure!

Noiredd (Member) commented Feb 26, 2018

Excellent! I will close this issue for now - we shall have further conversation under the PR when you submit it.

nic25 (Contributor) commented Feb 28, 2018

[screenshot: displaying LRM]
[screenshot: displaying LRM, removing labels from layers with lr_mult 0]

Both work fine - I'll submit a PR once either can be optionally activated.

Noiredd (Member) commented Feb 28, 2018

@nic25 Could we retain the kernel_size/stride/pad/etc. info even when lr_mult is zero? Currently you're removing all information, while I think having the basic layer info regardless of the lr_mult would be useful. Other than that it looks good.
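The requested behavior would drop only the lr_mult line from the label, not the whole label. A sketch with a hypothetical helper (field names mirror Caffe's ConvolutionParameter, but this is not the actual draw.py code):

```python
# Sketch of keeping kernel_size/stride/pad in the label even when
# lr_mult is zero: only the lr_mult line itself is omitted.
def conv_label(name, kernel_size, stride, pad, lr_mult):
    lines = [name,
             'kernel size: %d' % kernel_size,
             'stride: %d' % stride,
             'pad: %d' % pad]
    if lr_mult:  # skip only this line when the multiplier is 0
        lines.append('lr_mult: %g' % lr_mult)
    return '"%s"' % '\\n'.join(lines)
```

A frozen layer thus keeps its basic geometry info and loses only the lr_mult annotation.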
