
Fix infer np scalar dtype mem leak #24267

Closed
wants to merge 1 commit

Conversation

@ssnl (Collaborator) commented Aug 13, 2019

Fixes #24200. I'm a bit worried that the test might be flaky...
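For context, the leak can be reproduced from Python whenever torch.tensor infers a dtype from a NumPy scalar. The snippet below is an illustrative sketch rather than the exact reproducer from the issue; the use of np.float64 and the loop count are assumptions.

# Before this fix, each call below leaked the temporary array created while
# inferring the scalar's dtype, so process memory grew steadily with the loop.
import numpy as np
import torch

for _ in range(100_000):
    torch.tensor(np.float64(1.0))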

@ssnl requested a review from gchanan August 13, 2019 19:55
@pytorchbot added the module: dataloader, module: internals, module: operators, and module: tests labels Aug 13, 2019
@gchanan (Contributor) commented Aug 13, 2019

If we aren't confident in the test, we can just not add the test.

@@ -166,7 +166,10 @@ ScalarType infer_scalar_type(PyObject *obj) {
     return numpy_dtype_to_aten(PyArray_TYPE((PyArrayObject*)obj));
   }
   if (PyArray_CheckScalar(obj)) {
-    return numpy_dtype_to_aten(PyArray_TYPE((PyArrayObject*)(PyArray_FromScalar(obj, nullptr))));
+    auto arr = PyArray_FromScalar(obj, nullptr);
@gchanan (Contributor) commented on this diff:

can we just use THPObjectPtr or whatever here so we don't have to do manual memory management?

@ssnl (Collaborator, Author) commented Aug 14, 2019

@gchanan This is now updated to use THPObjectPtr. As for the tests, I was not sure whether they would be flaky because of other potential (de)allocations going on. But from the previous commit they seem stable, at least for that set of CI runs, so I am keeping them for now.

@ssnl (Collaborator, Author) commented Aug 23, 2019

@gchanan pinging for review :)

@ssnl (Collaborator, Author) commented Aug 27, 2019

I removed the test.

@facebook-github-bot (Contributor) left a comment

@gchanan has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@gchanan (Contributor) commented Aug 27, 2019

I'll land it -- are you planning to add the test in a follow-up?

@ssnl (Collaborator, Author) commented Aug 27, 2019

@gchanan It'll be difficult to add a test without examining CPU memory usage, which could be flaky in Python. I had one such test in an earlier version of this PR, and CI passed fine, but I was worried about the potential flakiness and removed it. In any case, it is unlikely that a future patch will break this.
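For reference, the kind of CPU-memory check being discussed might look like the sketch below. This is not the test that was removed; psutil, the loop count, and the 10 MB threshold are assumptions, and unrelated allocations elsewhere in the process are exactly what could make such an assertion flaky.

# Sketch of a memory-usage regression check (illustrative, not the removed test).
import numpy as np
import psutil
import torch

def rss_mb():
    # Resident set size of the current process, in MiB.
    return psutil.Process().memory_info().rss / 2**20

torch.tensor(np.float64(1.0))          # warm up caches/allocators once
baseline = rss_mb()
for _ in range(500_000):
    torch.tensor(np.float64(1.0))
growth = rss_mb() - baseline
assert growth < 10, f"possible leak: RSS grew by {growth:.1f} MiB"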

@ssnl deleted the infer_np_scalar_mem_leak branch August 28, 2019 15:12
@facebook-github-bot (Contributor) commented

@gchanan merged this pull request in 6c9410f.

Labels: Merged, module: dataloader, module: internals, module: tests
Development

Successfully merging this pull request may close these issues:
Memory leak in torch.tensor (#24200)
5 participants