This repository was archived by the owner on Jan 22, 2025. It is now read-only.

Conversation

@fbgheith
Contributor

Summary:
Lazy imports change Python import semantics, specifically when it comes to the initialization of packages/modules: https://www.internalfb.com/intern/wiki/Python/Cinder/Onboarding/Tutorial/Lazy_Imports/Troubleshooting/

For example, this pattern is not guaranteed to work:

```
import torch.optim
...
torch.optim._multi_tensor.Adam   # may fail to resolve _multi_tensor
```

And this is guaranteed to work:

```
import torch.optim._multi_tensor
...
torch.optim._multi_tensor.Adam   # will always work
```

A recent change to PyTorch changed module initialization logic in a way that exposed this issue.

But the code has been working for years? This is the nature of undefined behavior: any change in the environment (in this case, the PyTorch code base) can make it fail.
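For anyone hitting the same failure elsewhere, here is a minimal sketch of an equivalent defensive pattern (assuming `torch` is installed; the `multi_tensor` and `Adam` bindings are illustrative, not part of this PR): resolve the submodule explicitly with `importlib` instead of relying on attribute access on the parent package.

```
import importlib

# Importing the submodule explicitly forces its initialization, so the
# attribute lookup below succeeds even under lazy-import semantics.
multi_tensor = importlib.import_module("torch.optim._multi_tensor")
Adam = multi_tensor.Adam  # equivalent to torch.optim._multi_tensor.Adam
```

Both this and the explicit `import torch.optim._multi_tensor` used in the fix force the submodule to load and be bound as an attribute of `torch.optim`.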

Differential Revision: D58876582

@facebook-github-bot added the CLA Signed label on Jun 21, 2024
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D58876582

fbgheith added a commit to fbgheith/d2go that referenced this pull request Jun 21, 2024
Summary:
Pull Request resolved: facebookresearch#668

Lazy imports change `Python` import semantics, specifically when it comes to the initialization of packages/modules: https://www.internalfb.com/intern/wiki/Python/Cinder/Onboarding/Tutorial/Lazy_Imports/Troubleshooting/

For example, this pattern is not guaranteed to work:

```
import torch.optim
...
torch.optim._multi_tensor.Adam   # may fail to resolve _multi_tensor
```

And this is guaranteed to work:

```
import torch.optim._multi_tensor
...
torch.optim._multi_tensor.Adam   # will always work
```

A recent change to `PyTorch` changed module initialization logic in a way that exposed this issue.

But the code has been working for years? This is the nature of undefined behavior: any change in the environment (in this case, the `PyTorch` code base) can make it fail.

Reviewed By: wat3rBro

Differential Revision: D58876582
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D58876582

@facebook-github-bot
Contributor

This pull request has been merged in 040a716.
