Description

It is a common use case for CuPy to be used in conjunction with PyTorch. One of the difficulties when using two CUDA-powered libraries (including but not limited to the CuPy/PyTorch combination) is sharing memory pools and streams between them. Currently, utility functions to share CuPy/PyTorch memory pools and streams are provided in pytorch-pfn-extras (pytorch_pfn_extras.cuda.*); however, it would make sense to host them in CuPy if there is demand.

Specific features in mind are:

Share the same CUDA stream between CuPy and PyTorch

This RFC issue intends to gather interest in this feature from the community, and to discuss where to put these methods (cupyx.???) once we decide to do so.
Additional Information
No response