Steepest gradient value #70
Comments
Hi @snoozeyouloose.
lr_finder.plot() will return a matplotlib.axes.Axes object and a suggested learning rate (currently determined by the point with negative and minimal gradient) when the argument suggest_lr is True.
Although you can't get the gradient of that point directly, you can still compute it by performing the operation written in line 505, since the history of learning rates and losses is recorded in the lr_finder.history dictionary (see pytorch-lr-finder/torch_lr_finder/lr_finder.py, lines 500 to 509 at acc5e7e).
And I'm a bit curious about why you are trying to get the gradient of that point and use it as an integer. It's usually not necessary to know the gradient of that point, and it's also a bit odd to cast it (a float number) to an integer. If you can provide more context for this problem, I am willing to discuss it further with you.
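The operation referenced above can be sketched as follows. This is a minimal, self-contained reconstruction: the history dict here is synthetic stand-in data, and the "lr"/"loss" keys reflect how the thread describes lr_finder.history being recorded:

```python
import numpy as np

# Synthetic stand-in for lr_finder.history, assumed (per the thread) to be
# a dict holding the recorded learning rates and losses.
history = {
    "lr": [1e-5, 1e-4, 1e-3, 1e-2, 1e-1],
    "loss": [1.00, 0.95, 0.60, 0.80, 2.50],
}

lrs = np.array(history["lr"])
losses = np.array(history["loss"])

# Same idea as lr_finder.py lines 500-509: the suggested learning rate is
# the point where the loss curve has the steepest (most negative) gradient.
grads = np.gradient(losses)
min_grad_idx = grads.argmin()

steepest_grad = grads[min_grad_idx]   # a float, not an integer
suggested_lr = lrs[min_grad_idx]
print(suggested_lr, steepest_grad)
```

Note that the steepest gradient itself is a float; it only indexes which learning rate to pick, which is why casting it to an integer is rarely meaningful.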
I don't necessarily want to cast the steepest gradient.
I want to find the best possible learning rate and use it in my code. I'm trying to write a program that iteratively finds the best loss over the best learning rate, with the number of hidden layers varying from 1 to 7 and 10 different seeds.
I'm stuck at two places: first is the LRFinder best learning rate, and second is the R2 score for each model.
If you have any suggestions please let me know.
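The search described above could be structured roughly like this. Everything here is a hypothetical skeleton: find_lr and train_and_evaluate are placeholder stubs (seeded randomness standing in for a real LRFinder run and a real training loop), not part of pytorch-lr-finder:

```python
import random

def find_lr(n_hidden, seed):
    """Hypothetical stand-in for an LRFinder run; in real code you would
    build the model, run the range test, and read the suggested LR."""
    random.seed(seed)
    return 10 ** random.uniform(-5, -1)

def train_and_evaluate(n_hidden, seed, lr):
    """Hypothetical stand-in returning (loss, r2) for one trained model."""
    random.seed(seed + n_hidden)
    return random.random(), random.uniform(-1, 1)

best = None
for n_hidden in range(1, 8):      # 1 to 7 hidden layers
    for seed in range(10):        # 10 seeds per configuration
        lr = find_lr(n_hidden, seed)
        loss, r2 = train_and_evaluate(n_hidden, seed, lr)
        if best is None or loss < best[0]:
            best = (loss, r2, n_hidden, seed, lr)

print("best loss %.4f (R2 %.3f) with %d hidden layers, seed %d, lr %.2e"
      % best)
```

The key design point is recording the full tuple for the winning configuration so the learning rate and R2 score can be reported together at the end.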
With regard to the problem of finding the best learning rate, you can just set the argument suggest_lr to True when calling lr_finder.plot(). And if you want to prevent the program from getting stuck by the popup window that results from plotting, you can pass your own matplotlib axes via the ax argument so nothing is shown interactively. However, there are some reasons why we don't call the suggested learning rate the best learning rate; you may want to check out this thread for more details. As for the R2 score for each model, I think it can be easily calculated with an existing implementation such as scikit-learn's r2_score. If you have any further questions, please feel free to let me know!
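For the R2 part specifically, the score is simple enough to compute directly. A minimal sketch of the standard coefficient-of-determination formula (the same quantity scikit-learn's r2_score returns):

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

print(r2_score([1, 2, 3, 4], [1, 2, 3, 4]))  # perfect fit → 1.0
```

A model that always predicts the mean of y_true scores exactly 0, and worse-than-mean predictions go negative, which is useful to keep in mind when comparing the 1-to-7-hidden-layer models.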
Closing due to inactivity.
I want to be able to pull the value of the steepest gradient and use it in my code as an integer value.
I see in the code it is under lr_finder.plot(), but I am not able to just assign min_grad to anything.
Can you please help me?