storage resize_ function #26

Closed · mingminzhen opened this issue Feb 10, 2018 · 4 comments

@mingminzhen

I'm trying to use torch.Storage in my network, with PyTorch 0.3.

    if self.storage.size() < size:
        is_cuda = self.storage.is_cuda
        if is_cuda:
            gpu_ID = self.storage.get_device()
            print('gpu_ID1:', gpu_ID)   # device the storage lives on before resizing
        self.storage.resize_(size)

        gpu_ID = self.storage.get_device()
        print('gpu_ID2:', gpu_ID)       # device right after resize_

        if is_cuda:
            self.storage = self.storage.cuda(gpu_ID)
        gpu_ID = self.storage.get_device()
        print('gpu_ID3:', gpu_ID)       # device after calling .cuda(gpu_ID)

The output is:

gpu_ID1: 1
gpu_ID2: 0
gpu_ID3: 0

The self.storage comes from self.storage = torch.Storage(1024).
It seems the resize_ function changes the GPU on which the storage is stored.
I want the storage to stay on GPU 1 rather than GPU 0.
How can I do that?
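
For reference, here is a minimal standalone sketch of what I'm seeing (a sketch only; it assumes PyTorch 0.3 and a machine with at least two GPUs, and the sizes are arbitrary):

    import torch

    # Start with a CUDA storage that lives on GPU 1.
    storage = torch.Storage(1024).cuda(1)
    print('before resize_:', storage.get_device())   # prints 1

    # The problem: resize_ seems to reallocate on the current CUDA device
    # (GPU 0 by default) instead of the storage's own device.
    storage.resize_(4096)
    print('after resize_:', storage.get_device())    # prints 0 on PyTorch 0.3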

@gpleiss (Owner) commented Feb 23, 2018

Right now things are a bit broken on PyTorch 0.3. I'm working on a fix for it.

@mingminzhen (Author)

In PyTorch 0.3, wrapping the resize in a with torch.cuda.device(...) block can solve this problem:

    def resize_(self, size):
        if self.storage.size() < size:
            is_cuda = self.storage.is_cuda
            if is_cuda:
                gpu_ID = self.storage.get_device()
                # resize_ allocates on the current CUDA device, so pin the
                # context to the storage's own device before resizing
                with torch.cuda.device(gpu_ID):
                    self.storage.resize_(size)
            else:
                self.storage.resize_(size)
        return self
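
For context, a usage sketch with a hypothetical minimal holder class (not the actual class in this repo), showing the storage staying on its original GPU; it assumes at least two GPUs:

    import torch

    # Hypothetical minimal holder class, only to exercise the resize_ method
    # above; the real class in this repo is different.
    class StorageHolder(object):
        def __init__(self, storage):
            self.storage = storage

        def resize_(self, size):
            if self.storage.size() < size:
                is_cuda = self.storage.is_cuda
                if is_cuda:
                    gpu_ID = self.storage.get_device()
                    with torch.cuda.device(gpu_ID):
                        self.storage.resize_(size)
                else:
                    self.storage.resize_(size)
            return self

    holder = StorageHolder(torch.Storage(1024).cuda(1))  # start on GPU 1
    holder.resize_(4096)
    print(holder.storage.get_device())  # stays 1 with the device guard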

@gpleiss (Owner) commented Feb 28, 2018

Thanks @mingminzhen! I'm going to push some 0.3 fixes soon, so I'll include that.

@gpleiss (Owner) commented Mar 13, 2018

This should be fixed with #28.

gpleiss closed this as completed Mar 13, 2018