treat Parameter the same way as Tensor #48963
```diff
@@ -35,6 +35,14 @@ PyObject* THPAutograd_initExtension(PyObject* _unused, PyObject *unused) {
   auto _C_m = py::handle(torch_C_module).cast<py::module>();
   auto m = _C_m.def_submodule("_autograd", "autograd bindings");
 
+  auto parameter_module = THPObjectPtr(PyImport_ImportModule("torch.nn.parameter"));
+  if (!parameter_module)
+    return nullptr;
+
+  // NOTE: "leaks" ParameterClass
+  ParameterClass = PyObject_GetAttrString(parameter_module, "Parameter");
+  if (!ParameterClass)
+    return nullptr;
+
   py::enum_<ProfilerState>(m, "ProfilerState")
       .value("Disabled", ProfilerState::Disabled)
```

Review comment (on the `parameter_module` line):

> Doesn't really matter much, but out of curiosity, is this a …

Reply:

> It looks like there's an analog from TH. (Based on some grep-ing, it seems to just be handling the incref and decref.) I just cargo-culted the other ones.
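From the reply above, `THPObjectPtr` seems to be an RAII owner that handles the refcount bookkeeping when a `PyObject*` goes out of scope. A minimal sketch of that pattern, assuming only the CPython C API (the `ObjectPtr` class below is illustrative, not PyTorch's actual `THPObjectPtr`):

```cpp
#include <Python.h>

// Illustrative RAII owner for a PyObject*, sketching the incref/decref
// handling attributed to THPObjectPtr in the reply above.
class ObjectPtr {
 public:
  ObjectPtr() = default;
  // Takes ownership of an already-incref'd ("new") reference.
  explicit ObjectPtr(PyObject* ptr) : ptr_(ptr) {}
  // Drops the reference when the owner goes out of scope.
  ~ObjectPtr() { Py_XDECREF(ptr_); }
  // Single owner: copying is disallowed, moving transfers ownership.
  ObjectPtr(const ObjectPtr&) = delete;
  ObjectPtr& operator=(const ObjectPtr&) = delete;
  ObjectPtr(ObjectPtr&& other) noexcept : ptr_(other.ptr_) { other.ptr_ = nullptr; }
  // Implicit conversion lets the owner be passed straight to C API calls,
  // the way parameter_module is passed to PyObject_GetAttrString in the diff.
  operator PyObject*() const { return ptr_; }
  // Gives up ownership without decref'ing.
  PyObject* release() { PyObject* p = ptr_; ptr_ = nullptr; return p; }

 private:
  PyObject* ptr_{nullptr};
};
```

Note that `ParameterClass` itself is deliberately left unwrapped: it is a raw `PyObject*` holding the new reference returned by `PyObject_GetAttrString` and is never decref'd, which is what the `// NOTE: "leaks" ParameterClass` comment is flagging. Since the cached class object must stay alive for the whole process, the leak is intentional and harmless.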
Review comment:

> Why do you need this extra check here?

Reply:

> This is a fast path. I've come to the conclusion that, because I've written something very similar to `check_has_torch_function` in #48965, the logic really should be merged. That probably means the check will be removed from this PR, and `check_torch_function` will be replaced later in the PR stack.
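For context on the fast path being discussed, a common shape for such a check is to bail out early when an argument's type is exactly one of the known built-in classes, and only fall back to a `__torch_function__` lookup otherwise. The sketch below is purely hypothetical: `TensorClass` is an assumed cached class object analogous to the `ParameterClass` set up in this diff, and this is not the actual `check_has_torch_function` from #48965.

```cpp
#include <Python.h>

// Assumed to be cached at module init, in the same way ParameterClass is
// cached in the diff above. Hypothetical names, not PyTorch's real globals.
static PyObject* TensorClass = nullptr;
static PyObject* ParameterClass = nullptr;

// Hypothetical fast-path check: exact Tensor/Parameter instances use the
// default __torch_function__, so override dispatch can be skipped for them.
static bool maybe_has_torch_function(PyObject* obj) {
  PyObject* type = reinterpret_cast<PyObject*>(Py_TYPE(obj));
  if (type == TensorClass || type == ParameterClass) {
    return false;  // fast path: exact match, no possible override
  }
  // Slow path: a subclass (or unrelated type) may define an override.
  // (A real implementation would compare the attribute against the default
  // __torch_function__ rather than merely test for its presence.)
  return PyObject_HasAttrString(obj, "__torch_function__") != 0;
}
```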