add learning rate multiplier to network's graph #6244
It might be useful, but in order to reduce clutter in the graph, I would recommend the […] I'd be glad to review a PR with this feature, if you wanted to implement it.
Sure!
Excellent! I will close this issue for now - we shall have further conversation under the PR when you submit it.
@nic25 Could we retain the kernel_size/stride/pad/etc. info even when lr_mult is zero? Currently you're removing all information, while I think having the basic layer info regardless of the lr_mult would be useful. Other than that it looks good.
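To illustrate the reviewer's point, here is a minimal sketch of node-label generation that always keeps the basic layer geometry and only varies the lr_mult annotation. The `node_label` function and the plain-dict stand-in for a Caffe `LayerParameter` are assumptions for illustration, not the actual `draw.py` code:

```python
# Sketch: build a Graphviz node label that keeps kernel_size/stride/pad
# info regardless of lr_mult, and annotates the learning-rate multipliers
# only when they are nonzero. The dict below is a stand-in for a Caffe
# LayerParameter; field names here are illustrative assumptions.

def node_label(layer):
    parts = ['%s (%s)' % (layer['name'], layer['type'])]
    # Always include the basic geometry info, even for frozen layers.
    for field in ('kernel_size', 'stride', 'pad'):
        if field in layer:
            parts.append('%s: %d' % (field, layer[field]))
    # Annotate lr_mult only when at least one multiplier is nonzero;
    # otherwise mark the layer as frozen.
    lr_mults = layer.get('lr_mult', [])
    if any(m != 0 for m in lr_mults):
        parts.append('lr_mult: %s' % ', '.join(str(m) for m in lr_mults))
    elif lr_mults:
        parts.append('frozen')  # all-zero lr_mult: weights are not updated
    # Graphviz labels use literal "\n" as a line separator.
    return '\\n'.join(parts)

conv = {'name': 'conv1', 'type': 'Convolution',
        'kernel_size': 3, 'stride': 1, 'pad': 1, 'lr_mult': [0, 0]}
print(node_label(conv))
```

With this shape, a frozen layer still shows its kernel size, stride, and pad, so no structural information is lost from the graph.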
caffe/python/caffe/draw.py, line 89 (commit daf0139)
IMHO including information about learning rate multipliers could be helpful when finetuning a network. We could: […]
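For context on why this matters when finetuning, here is an illustrative prototxt fragment (an assumed example, not taken from this thread) showing how `lr_mult` is set per parameter blob; an all-zero `lr_mult` freezes a layer, which is exactly the kind of information the graph could surface:

```
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  # lr_mult scales the solver's base_lr for each blob;
  # 0 on both blobs freezes this layer during finetuning.
  param { lr_mult: 0 }  # weights
  param { lr_mult: 0 }  # bias
  convolution_param {
    num_output: 64
    kernel_size: 3
    stride: 1
    pad: 1
  }
}
```

Seeing at a glance which layers are frozen and which are being updated would make it easier to verify a finetuning setup from the drawn graph alone.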