Weight conversion error from GPU/CPU weights #308

Closed
ivan-ea opened this issue Nov 3, 2022 · 2 comments
ivan-ea commented Nov 3, 2022

I got the following error when trying to use the convert-torch-weights-to-torchscript command on the CLI.

\bioimageio\core\weight_converter\torch\utils.py:9 in load_model

   6 # and for each weight format
   7 def load_model(node):
   8     model = PytorchModelAdapter.get_nn_instance(node)
>  9     state = torch.load(node.weights["pytorch_state_dict"].source)
  10     model.load_state_dict(state)
  11     model.eval()
  12     return model

RuntimeError: Attempting to deserialize object on a CUDA device but
torch.cuda.is_available() is False. If you are running on a CPU-only machine,
please use torch.load with map_location=torch.device('cpu') to map your storages
to the CPU.

The problem appeared when using the models:

@constantinpape (Contributor)

Yes, this is a problem for models that were saved with the weights on the GPU. In this case we can just load the model on the CPU by default, which will be done by #309. (It will take a few days until we get out a release with these changes; if you don't want to wait for this, you can just go ahead and make this change locally.)
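For reference, the change the error message suggests can be sketched as follows. This is a minimal, self-contained demonstration of `map_location` (the in-memory buffer is a stand-in for the weight file referenced by `node.weights["pytorch_state_dict"].source`; it is not the actual bioimageio.core code):

```python
import io

import torch

# Simulate a saved weight file by writing a small state dict to an
# in-memory buffer (stands in for the pytorch_state_dict source file).
buffer = io.BytesIO()
torch.save({"weight": torch.zeros(2, 2)}, buffer)
buffer.seek(0)

# map_location=torch.device("cpu") remaps any tensors that were saved on a
# CUDA device onto the CPU, so loading works on CPU-only machines.
state = torch.load(buffer, map_location=torch.device("cpu"))
assert state["weight"].device.type == "cpu"
```

Applied locally, the same keyword argument would go into the `torch.load(...)` call in `load_model` shown in the traceback above.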


ivan-ea (Author) commented Nov 3, 2022

Thank you @constantinpape, I will try changing it locally for the time being.

@FynnBe FynnBe closed this as completed Nov 3, 2022