I got the following error when trying to use the convert-torch-weights-to-torchscript command on the CLI.
\bioimageio\core\weight_converter\torch\utils.py:9 in load_model

   6  # and for each weight format
   7  def load_model(node):
   8      model = PytorchModelAdapter.get_nn_instance(node)
 > 9      state = torch.load(node.weights["pytorch_state_dict"].source)
  10      model.load_state_dict(state)
  11      model.eval()
  12      return model
RuntimeError: Attempting to deserialize object on a CUDA device but
torch.cuda.is_available() is False. If you are running on a CPU-only machine,
please use torch.load with map_location=torch.device('cpu') to map your storages
to the CPU.
Yes, this is a problem for models that were saved with the weights on the GPU. In this case we can just load the model on the CPU by default, which will be done in #309. (It will take a few days until we get out a release with these changes; if you don't want to wait, you can go ahead and make this change locally.)
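For reference, a minimal sketch of that local change: pass `map_location=torch.device('cpu')` to `torch.load`, as the error message suggests, so every stored tensor is remapped to the CPU. (This example uses an in-memory buffer and a toy `torch.nn.Linear`; the actual code loads from `node.weights["pytorch_state_dict"].source`.)

```python
import io
import torch

net = torch.nn.Linear(4, 2)

# Save a state dict to an in-memory buffer (stand-in for the checkpoint file).
buf = io.BytesIO()
torch.save(net.state_dict(), buf)
buf.seek(0)

# map_location remaps every stored tensor onto the CPU, avoiding the
# "Attempting to deserialize object on a CUDA device" error on CPU-only machines.
state = torch.load(buf, map_location=torch.device("cpu"))
net.load_state_dict(state)
net.eval()

assert all(t.device.type == "cpu" for t in state.values())
```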
The problem appeared when using the models: