
Steepest gradient value #70

Closed
msverma101 opened this issue Nov 25, 2020 · 4 comments

Comments

@msverma101

I want to be able to pull the value of the steepest gradient and use it in my code as an integer value.
I see in the code it is under lr_finder.plot(), but I am not able to just assign min_grad to anything.
Can you please help me?

@NaleRaphael
Contributor

Hi @snoozeyouloose.

lr_finder.plot() will return a matplotlib.axes.Axes object and a suggested learning rate (which is currently determined as the point with the most negative, i.e. minimal, gradient) when the argument suggest_lr is True.

Although you can't get the gradient of that point directly, you can still compute it by performing the operation written at line 505, since the history of learning rates and losses is recorded in the lr_finder.history dictionary:

if suggest_lr:
    # 'steepest': the point with steepest gradient (minimal gradient)
    print("LR suggestion: steepest gradient")
    min_grad_idx = None
    try:
        min_grad_idx = (np.gradient(np.array(losses))).argmin()
    except ValueError:
        print(
            "Failed to compute the gradients, there might not be enough points."
        )

And I'm a bit curious why you are trying to get the gradient of that point and use it as an integer. It's usually not necessary to know the gradient of that point, and it's also a bit odd to cast it (a float) to an integer.

If you can provide more context for this problem, I am willing to discuss it further with you.

@msverma101
Author

msverma101 commented Nov 25, 2020 via email

@NaleRaphael
Contributor

With regard to the problem of finding the best learning rate, you can just set the argument suggest_lr of lr_finder.plot() to True; then you get the suggested value without recalculating it yourself. As stated in my previous comment, that suggested learning rate is currently determined by the steepest gradient, so it should meet your needs.

And if you want to prevent the program from getting stuck on the popup window produced by plt.show() in lr_finder.plot(), you can pre-create a matplotlib.axes.Axes object and assign it to the argument ax of lr_finder.plot(). See also this example written by David: #23 (comment)

However, there are reasons we don't call the suggested learning rate the best learning rate; you may want to check out this thread for more details.

As for the R2 score for each model, I think it can be easily calculated by using sklearn.metrics.r2_score() after those models are well trained.
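For reference, R² can also be computed directly with NumPy if you'd rather not add a scikit-learn dependency; the function below computes the same quantity as sklearn.metrics.r2_score() (the arrays are purely illustrative):

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)           # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)    # total sum of squares
    return 1.0 - ss_res / ss_tot

# Example: predictions from a trained model vs. ground truth
y_true = [3.0, -0.5, 2.0, 7.0]
y_pred = [2.5, 0.0, 2.0, 8.0]
print(r2_score(y_true, y_pred))  # ~0.9486
```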

If you have any further questions, please just feel free to let me know!

@davidtvs
Owner

davidtvs commented Feb 3, 2024

Closing due to inactivity

@davidtvs davidtvs closed this as completed Feb 3, 2024