
Fixed floor_divide deprecation warnings seen in pytest output #1455

Merged: 2 commits merged into pytorch:master on Apr 15, 2021

Conversation

@prabhat00155 (Contributor) opened this pull request.

No description provided.
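Since the PR body is empty, here is a minimal sketch of the usual migration for this deprecation, assuming the warnings come from `torch.floor_divide` / tensor `//` calls (the actual diff is not reproduced on this page). Starting with PyTorch 1.8, `torch.floor_divide` warns because it historically truncated toward zero rather than flooring; `torch.div` with an explicit `rounding_mode` replaces it:

```python
import torch

a = torch.tensor([5, -5])
b = torch.tensor([2, 2])

# Deprecated: emits a UserWarning on every call, which clutters pytest output.
# old = torch.floor_divide(a, b)

# Replacement with explicit rounding semantics:
floored = torch.div(a, b, rounding_mode="floor")    # tensor([ 2, -3])
truncated = torch.div(a, b, rounding_mode="trunc")  # tensor([ 2, -2]), the old behavior
```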

@vincentqb (Contributor) left a comment:


LGTM, thanks!

@prabhat00155 (Contributor, Author) replied:

> LGTM, thanks!

I can't merge the PR, I don't have write access to this repository.

@vincentqb merged commit 4863030 into pytorch:master on Apr 15, 2021
@vincentqb (Contributor) replied:

> I can't merge the PR, I don't have write access to this repository.

I know :) Thanks again for fixing this!

@prabhat00155 deleted the prabhat00155/fix_warnings branch on April 15, 2021 at 22:43
carolineechen pushed a commit to carolineechen/audio that referenced this pull request on Apr 30, 2021:

Fixed floor_divide deprecation warnings seen in pytest output (pytorch#1455)

* Fixed floor_divide deprecation warnings seen in pytest output
* Fixed warning in test_flanger_triangle_linear
mthrok pushed a commit to mthrok/audio that referenced this pull request on Dec 13, 2022:

The code at the end registers only the parameters from `model.fc` in the optimizer, although the text underneath says: "Notice although we register all the parameters in the optimizer, the only parameters that are computing gradients (and hence updated in gradient descent) are the weights and bias of the classifier."

To be consistent with this explanation, we should register all of the model's parameters, as sketched below.
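A minimal sketch of the discrepancy being described, assuming the standard frozen-backbone finetuning setup that the quoted tutorial text refers to (the tutorial code itself is not reproduced here):

```python
import torch
from torch import nn, optim
from torchvision import models

model = models.resnet18()                        # stand-in for the tutorial's pretrained model
for param in model.parameters():
    param.requires_grad = False                  # freeze the backbone
model.fc = nn.Linear(model.fc.in_features, 10)   # new classifier; requires_grad=True by default

# What the code at the end of the tutorial does: register only the classifier.
optimizer = optim.SGD(model.fc.parameters(), lr=1e-2, momentum=0.9)

# What the quoted explanation describes: register all parameters. The frozen
# backbone parameters never receive gradients, so only model.fc is updated
# either way; registering everything just matches the prose.
optimizer = optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
```

Both choices train identically here, because parameters with `requires_grad=False` never receive gradients and are skipped by the optimizer's update step; the change is about making the code match the explanation.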