Spurious numpy writable warning #47271
Conversation
Hi @ArtistBanda! Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file. In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!
Codecov Report
@@            Coverage Diff             @@
##           master   #47271       +/-   ##
===========================================
- Coverage   81.44%   60.81%   -20.63%
===========================================
  Files        1798     2748      +950
  Lines      188223   254027    +65804
===========================================
+ Hits       153306   154499     +1193
- Misses      34917    99528    +64611
torch/csrc/utils/tensor_numpy.cpp
Outdated
@@ -125,13 +125,13 @@ PyObject* tensor_to_numpy(const at::Tensor& tensor) {
   return array.release();
 }

-at::Tensor tensor_from_numpy(PyObject* obj) {
+at::Tensor tensor_from_numpy(PyObject* obj, bool copy_numpy) {
Suggested change:
-at::Tensor tensor_from_numpy(PyObject* obj, bool copy_numpy) {
+at::Tensor tensor_from_numpy(PyObject* obj, bool copy_numpy=false) {
Repeating the default value in the function definition throws a compile error, so `false` is given as the default in the declaration only.
37518cd added the comment for reference in the implementation.
oops. sorry I meant to comment on the other macro path.
torch/csrc/utils/tensor_numpy.cpp
Outdated
@@ -7,7 +7,7 @@ namespace torch { namespace utils {
 PyObject* tensor_to_numpy(const at::Tensor& tensor) {
   throw std::runtime_error("PyTorch was compiled without NumPy support");
 }
-at::Tensor tensor_from_numpy(PyObject* obj) {
+at::Tensor tensor_from_numpy(PyObject* obj, bool copy_numpy=false) {
should we also remove defaults here?
But if the function is called without passing this argument, the call would fail. With the default, we can assume that no copy is being created and raise the warning accordingly.
Force-pushed from 248b9f6 to 37518cd (compare)
torch/csrc/utils/tensor_numpy.cpp
Outdated
 if (!PyArray_Check(obj)) {
   throw TypeError("expected np.ndarray (got %s)", Py_TYPE(obj)->tp_name);
 }
 auto array = (PyArrayObject*)obj;

-if (!PyArray_ISWRITEABLE(array)) {
+if (!PyArray_ISWRITEABLE(array) && !copy_numpy) {
Would you add a comment here explaining why this warning isn't needed when copy_numpy is true?
torch/csrc/utils/tensor_numpy.cpp
Outdated
@@ -125,13 +125,13 @@ PyObject* tensor_to_numpy(const at::Tensor& tensor) {
   return array.release();
 }

-at::Tensor tensor_from_numpy(PyObject* obj) {
+at::Tensor tensor_from_numpy(PyObject* obj, bool copy_numpy/*=false*/) {
This is a historic issue, but since this PR updates this function anyway, would you add a brief comment explaining what this function does (and what the copy_numpy bool does)?
Also: what does `/*=false*/` do in the signature?
Final nit on this signature: is `copy_numpy` the correct name for the argument? That would imply that it controls the behavior of the function, but actually it is only used to decide whether or not to throw a warning, right? Would a name like `warn_if_not_writeable` be more descriptive?
> Also: what does `/*=false*/` do in the signature?

It's just a comment noting that the default value of the parameter is `false`.
a410107 changes the parameter name and adds the comment for reference.
@@ -1145,6 +1145,12 @@ def test_full_out(self, device):
     self.assertEqual(torch.full(o.shape, 1., out=o).dtype, o.dtype)
     self.assertEqual(torch.full(size, 1, out=o).dtype, o.dtype)

+def test_tensor_from_non_writable_numpy(self, device):
Add a comment with the issue this test addresses so people can reference it.
a410107 adds the comment.
Did the force push remove the comment including the issue?
No
Sorry I'm confused. Where is the comment with the issue?
44ff10e contains the comment.
@ArtistBanda I think @mruberry wants you to add a reference to the issue in this comment. Either the issue ID or the URL to the issue (#47160).
That's right, thank you @thomasjo. I was expecting a comment like
# Tests for X, see (link to issue)
Oh okay, sorry I didn't get that before. I'll add it now.
187a07e resolves this
Would you rebase this, please?
Sure, it seems correct.

Co-authored-by: Rong Rong <walterddr.walterddr@gmail.com>
Force-pushed from a410107 to 44ff10e (compare)
This is my first time contributing to a huge open source project and I am still learning every nook and corner of it. I know force pushing is not the right way to go, but I would love some feedback and would like to improve and contribute more.
torch/csrc/utils/tensor_numpy.cpp
Outdated
-if (!PyArray_ISWRITEABLE(array)) {
+// warn_if_not_writable is true when a copy of numpy variable is created.
+// the warning is suppressed when a copy is being created.
+if (!PyArray_ISWRITEABLE(array) && !warn_if_not_writeable) {
`!warn_if_not_writeable` -> `warn_if_not_writeable`?
Updated in c9a7efa
torch/csrc/utils/tensor_new.cpp
Outdated
@@ -270,7 +270,7 @@ Tensor internal_new_from_data(

 if (PyArray_Check(data)) {
   TORCH_CHECK(!pin_memory, "Can't pin tensor constructed from numpy");
-  auto tensor = tensor_from_numpy(data);
+  auto tensor = tensor_from_numpy(data, copy_numpy);
`copy_numpy` -> `!copy_numpy`?
That is, you want `warn_if_not_writeable` when this does not copy, right?
Updated in c9a7efa
No worries. This is just a git issue. You can typically just fetch and pull the main branch into your branch.
@mruberry should I rebase again?
No need.
Looks good! Thanks @ArtistBanda!
@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Fixes #47160