Avoid unintended eager cuda initialization #199

Open
dkasapp wants to merge 21 commits into master
Conversation

@dkasapp (Collaborator) commented Dec 24, 2021

We noticed that importing the sru package eagerly triggers CUDA initialization because of the following chain of module imports: sru.modules -> sru.ops -> cuda_functional. The last module in the chain calls the load function of torch.utils.cpp_extension at import time.
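
As a quick way to see the side effect, the check below is what we used conceptually; the expected values reflect our understanding of the behavior before and after this change on a machine with a GPU, not a guaranteed result:

import torch
import sru  # before this change, this import alone built/loaded the CUDA extension

# torch.cuda.is_initialized() reports whether a CUDA context exists in this process.
# We expect True here before this PR (on a GPU machine) and False after it.
print(torch.cuda.is_initialized())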

We detected this while running our server framework in SUBPROCESS_MODE, which forks a new process to run the model. The forked process failed with an error complaining that CUDA had already been initialized in the parent process, even though the parent never runs inference and therefore has no need for CUDA.
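
For anyone who wants to reproduce the failure mode independently of our server framework, here is a minimal, illustrative sketch; torch.cuda.init() stands in for the unintended side effect of import sru, and this is not our actual serving code:

import torch
import torch.multiprocessing as mp

def run_model():
    # Any CUDA call in the forked child fails with
    # "Cannot re-initialize CUDA in forked subprocess ..."
    # once the parent process has already initialized CUDA.
    x = torch.zeros(1, device="cuda")
    print(x)

if __name__ == "__main__":
    torch.cuda.init()  # stand-in for the import-time side effect described above
    ctx = mp.get_context("fork")
    p = ctx.Process(target=run_model)
    p.start()
    p.join()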

This PR makes the loading lazy: sru.modules no longer imports sru.ops eagerly; instead, the import is postponed until the first SRUCell is instantiated.
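
In rough outline, the pattern is sketched below; the __init__ signature is heavily simplified and the exact symbol pulled from sru.ops is an assumption, so please refer to the actual diff further down:

import torch.nn as nn

class SRUCell(nn.Module):
    # Class-level state: nothing from sru.ops is imported until the first
    # SRUCell is constructed, so `import sru` alone no longer reaches
    # cuda_functional / torch.utils.cpp_extension.load.
    initialized = False
    elementwise_recurrence_inference = None

    @classmethod
    def init_elementwise_recurrence_funcs(cls):
        # Deferred import; runs once, on the first instantiation.
        if cls.initialized:
            return
        from sru import ops  # the import that used to happen at package import time
        cls.elementwise_recurrence_inference = ops.elementwise_recurrence_inference  # assumed symbol name
        cls.initialized = True

    def __init__(self, input_size: int, hidden_size: int):  # simplified signature
        super().__init__()
        self.init_elementwise_recurrence_funcs()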

The changes in this PR were tested by checking out this branch on an AWS instance with a GPU and running pytest -sv test, which finished with 141 passed, 161 warnings, and no failures. We therefore understand the package to be working as expected in both CPU and GPU settings.

Review thread on SRUCell.init_elementwise_recurrence_funcs():

@classmethod
def init_elementwise_recurrence_funcs(cls):
Reviewer:

add a docstring to this method please

Collaborator Author (dkasapp):

Added

Review thread on the following diff hunk:

@@ -27,6 +23,11 @@ class SRUCell(nn.Module):
scale_x: Tensor
weight_proj: Optional[Tensor]

initialized = False
elementwise_recurrence_inference = None
Reviewer:

Please add a note about this function initialization to the SRUCell docstring or to the docstring for sru/modules.py (it lacks a module docstring now; it probably should have one).

Collaborator Author (@dkasapp, Dec 27, 2021):

Added
