torch.unique leads to "RuntimeError: isDifferentiableType" when used in loss function #51271

Closed
sambaPython24 opened this issue Jan 28, 2021 · 4 comments


sambaPython24 commented Jan 28, 2021

Hello, when I use the .unique function in PyTorch, as suggested in this post:

combined = torch.cat((t1, t2))
uniques, counts = combined.unique(return_counts=True)  # assumes t1 and t2 have no internal duplicates
difference = uniques[counts == 1]    # values that appear in only one of t1, t2
intersection = uniques[counts > 1]   # values that appear in both t1 and t2

and try to use it in a custom loss function, I get the following error:

RuntimeError: isDifferentiableType(variable.scalar_type()) INTERNAL ASSERT FAILED at "..\\torch/csrc/autograd/functions/utils.h":64, please report a bug to PyTorch. 
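For reference, a minimal standalone reproduction along these lines (t1 and t2 here are made-up float tensors with requires_grad=True; the original snippet does not define them):

import torch

t1 = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
t2 = torch.tensor([2.0, 3.0, 4.0], requires_grad=True)

combined = torch.cat((t1, t2))
# On 1.7.1 this call raises the internal assert above, presumably
# because autograd tries to set up gradient tracking for the
# integer-valued counts output:
uniques, counts = combined.unique(return_counts=True)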

cc @ezyang @gchanan @zou3519 @bdhirsh @jbschlosser

@ailzhang ailzhang added the high priority and module: assert failure labels Jan 29, 2021
@ailzhang
Contributor

Marking as high priority since it hits an internal assert, but the use case looks valid.

@ezyang
Contributor

ezyang commented Jan 29, 2021

What version of PyTorch are you using? I'm pretty sure @albanD fixed this one on master.

@sambaPython24
Author

Hey, it is

print(torch.__version__)
1.7.1

and it is running on CPU. Do you need any other information?

@albanD
Collaborator

albanD commented Feb 1, 2021

This has been fixed on master in #47930.
You can use a nightly build to get the fix right now, or wait for the upcoming 1.8 release.
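In the meantime, since unique() is not differentiable with respect to its outputs anyway, one possible workaround on 1.7.1 is to run the set operations under torch.no_grad() so autograd never has to track them, and build the differentiable part of the loss from the original tensors. A sketch (untested against this exact assert; overlap_penalty is a hypothetical helper, not from the original report):

import torch

def overlap_penalty(t1, t2):
    # Compute the set membership without gradient tracking; unique()
    # has no gradient to propagate, so nothing is lost by doing this
    # outside the autograd graph.
    with torch.no_grad():
        combined = torch.cat((t1, t2))
        uniques, counts = combined.unique(return_counts=True)
        shared = uniques[counts > 1]
    # Differentiable part: penalize t1's entries that also appear in t2.
    mask = (t1.unsqueeze(1) == shared).any(dim=1)
    return (t1[mask] ** 2).sum()

t1 = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
t2 = torch.tensor([2.0, 3.0, 4.0], requires_grad=True)
overlap_penalty(t1, t2).backward()
print(t1.grad)  # nonzero only at the entries shared with t2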

@albanD albanD closed this as completed Feb 1, 2021