
Add note about in-place weight modification for nn.Embedding #45595

Closed


@kurtamohler (Collaborator):

Fixes #26596

@kurtamohler (Collaborator, Author):

CI failure doesn't seem to be my fault, so I think this is ready for review.

@mrshenli added the labels module: nn (Related to torch.nn) and triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) on Oct 2, 2020.
@mruberry removed the request for review from apaszke on October 6, 2020 03:27.
Review thread on torch/nn/modules/sparse.py (outdated; resolved):
.. note::
    When :attr:`max_norm` is not ``None``, :class:`Embedding`'s forward method will modify the
    :attr:`weight` Tensor in-place, rather than setting ``Embedding.weight`` to a new Tensor.
    Because of this, if you want to perform some calculation with ``Embedding.weight`` before
    […]
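The in-place behavior the note describes is easy to observe directly; a minimal sketch (the sizes and the max_norm value here are illustrative, not from the PR):

    import torch
    import torch.nn as nn

    embedding = nn.Embedding(10, 4, max_norm=1.0)  # arbitrary sizes
    w = embedding.weight                     # keep a reference to the parameter
    before = w.detach().clone()              # snapshot of the current values
    _ = embedding(torch.tensor([0, 1, 2]))   # forward renorms the selected rows in place
    print(w is embedding.weight)             # True: same tensor object, not replaced
    print(torch.equal(before, w.detach()))   # typically False: rows 0-2 were renormed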
@mruberry (Collaborator) commented:
Can we be clearer about the reason for this? Maybe something like,

"Since tensors needed for gradient computations cannot be modified inplace, using the Embedding.weight tensor before calling Embedding's forward method requires cloning it when max_norm is not None. For example::"

@kurtamohler (Collaborator, Author) replied:

I updated this to be similar to your suggestion, just with a couple of extra details.
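For reference, the cloning pattern under discussion looks roughly like the following sketch (shapes are illustrative, and this is not necessarily the exact example that landed in the docs):

    import torch
    import torch.nn as nn

    n, d, m = 3, 5, 7
    embedding = nn.Embedding(n, d, max_norm=1.0)
    W = torch.randn((m, d), requires_grad=True)
    idx = torch.tensor([1, 2])
    # Clone the weight before the forward call; using embedding.weight
    # directly here would fail at backward time, because forward modifies
    # the weight in place when max_norm is set.
    a = embedding.weight.clone() @ W.t()
    b = embedding(idx) @ W.t()     # forward renorms rows of weight in place
    out = a.unsqueeze(0) + b.unsqueeze(1)
    loss = out.sigmoid().prod()
    loss.backward()                # succeeds thanks to the clone

Without the clone, autograd raises a runtime error at backward() because a tensor needed for the gradient of ``a`` was modified by an in-place operation, which is the reason the suggested wording spells out.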

A second review thread on torch/nn/modules/sparse.py (outdated; resolved).
@mruberry (Collaborator) commented on Oct 6, 2020:

Hey @kurtamohler! This is a great, well-written fix for a confusing issue. I made a couple suggestions and I look forward to hearing your thoughts on them.

@kurtamohler force-pushed the embedding-inplace-mod-doc-26596 branch from 393dc26 to e483241 on October 6, 2020 16:02.
@kurtamohler (Collaborator, Author):

@mruberry, thanks for the comments. I've made the changes you asked for, so I think it's good to go.

@mruberry self-requested a review on October 6, 2020 19:02.
@mruberry (Collaborator) left a review:

Thanks @kurtamohler! Nice fix.

@facebook-github-bot (Contributor) left a comment:

@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@codecov (bot) commented on Oct 6, 2020:

Codecov Report

❗ No coverage uploaded for pull request base (master@b1373a7).
The diff coverage is n/a.


@@            Coverage Diff            @@
##             master   #45595   +/-   ##
=========================================
  Coverage          ?   68.19%           
=========================================
  Files             ?      410           
  Lines             ?    53232           
  Branches          ?        0           
=========================================
  Hits              ?    36303           
  Misses            ?    16929           
  Partials          ?        0           
Impacted Files               Coverage Δ
torch/nn/modules/sparse.py   93.69% <ø> (ø)


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b1373a7...19efeb5.

@facebook-github-bot (Contributor):

@mruberry merged this pull request in ed1552a.

Labels
Merged, module: nn (Related to torch.nn), open source, triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Development

Successfully merging this pull request may close these issues:

nn.Embedding with max_norm shows unstable behavior and causes sometimes runtime error.