Description
[2025-09-16T21:25:26.408Z] Running ./generation/anomaly_detection/anomalydetection_tutorial_classifier_guidance.ipynb
[2025-09-16T21:25:26.408Z] Checking PEP8 compliance...
[2025-09-16T21:25:27.035Z] Running notebook...
[2025-09-16T21:25:27.035Z] Before:
[2025-09-16T21:25:27.035Z] "max_epochs = 2000\n",
[2025-09-16T21:25:27.035Z] "max_epochs = 1000\n",
[2025-09-16T21:25:27.035Z] After:
[2025-09-16T21:25:27.035Z] "max_epochs = 1\n",
[2025-09-16T21:25:27.035Z] "max_epochs = 1\n",
[2025-09-16T21:25:27.035Z] Before:
[2025-09-16T21:25:27.035Z] "val_interval = 20\n",
[2025-09-16T21:25:27.035Z] "val_interval = 10\n",
[2025-09-16T21:25:27.319Z] After:
[2025-09-16T21:25:27.319Z] "val_interval = 1\n",
[2025-09-16T21:25:27.319Z] "val_interval = 1\n",
[2025-09-16T21:25:39.867Z] MONAI version: 1.5.0+20.g401ea4a0
[2025-09-16T21:25:39.867Z] Numpy version: 1.24.4
[2025-09-16T21:25:39.867Z] Pytorch version: 2.5.0a0+e000cf0ad9.nv24.10
[2025-09-16T21:25:39.867Z] MONAI flags: HAS_EXT = False, USE_COMPILED = False, USE_META_DICT = False
[2025-09-16T21:25:39.867Z] MONAI rev id: 401ea4a0af2b9d475989a78e1bae387875fa2384
[2025-09-16T21:25:39.867Z] MONAI __file__: /home/jenkins/agent/workspace/Monai-notebooks/MONAI/monai/__init__.py
[2025-09-16T21:25:39.867Z]
[2025-09-16T21:25:39.867Z] Optional dependencies:
[2025-09-16T21:25:39.867Z] Pytorch Ignite version: 0.4.11
[2025-09-16T21:25:39.867Z] ITK version: 5.4.4
[2025-09-16T21:25:39.867Z] Nibabel version: 5.3.2
[2025-09-16T21:25:39.867Z] scikit-image version: 0.25.2
[2025-09-16T21:25:39.867Z] scipy version: 1.14.0
[2025-09-16T21:25:39.867Z] Pillow version: 10.4.0
[2025-09-16T21:25:39.867Z] Tensorboard version: 2.16.2
[2025-09-16T21:25:39.867Z] gdown version: 5.2.0
[2025-09-16T21:25:39.867Z] TorchVision version: 0.20.0a0
[2025-09-16T21:25:39.867Z] tqdm version: 4.66.5
[2025-09-16T21:25:39.867Z] lmdb version: 1.7.3
[2025-09-16T21:25:39.867Z] psutil version: 6.0.0
[2025-09-16T21:25:39.867Z] pandas version: 2.2.2
[2025-09-16T21:25:39.867Z] einops version: 0.8.0
[2025-09-16T21:25:39.867Z] transformers version: 4.40.2
[2025-09-16T21:25:39.867Z] mlflow version: 3.3.2
[2025-09-16T21:25:39.867Z] pynrrd version: 1.1.3
[2025-09-16T21:25:39.867Z] clearml version: 2.0.3rc0
[2025-09-16T21:25:39.867Z]
[2025-09-16T21:25:39.867Z] For details about installing the optional dependencies, please visit:
[2025-09-16T21:25:39.867Z] https://docs.monai.io/en/latest/installation.html#installing-the-recommended-dependencies
[2025-09-16T21:25:39.867Z]
[2025-09-16T21:25:41.356Z] papermill --progress-bar --log-output -k python3
[2025-09-16T21:25:41.970Z] /usr/local/lib/python3.10/dist-packages/papermill/iorw.py:149: UserWarning: the file is not specified with any extension : -
[2025-09-16T21:25:41.970Z] warnings.warn(f"the file is not specified with any extension : {os.path.basename(path)}")
[2025-09-16T21:25:41.970Z]
[2025-09-16T21:25:42.980Z] Executing: 0%| | 0/32 [00:00<?, ?cell/s]
[2025-09-16T21:26:01.477Z] Executing: 3%|▎ | 1/32 [00:00<00:27, 1.11cell/s]
[2025-09-16T21:26:14.010Z] Executing: 9%|▉ | 3/32 [00:19<03:21, 6.95s/cell]
[2025-09-16T21:39:22.209Z] Executing: 16%|█▌ | 5/32 [00:30<02:51, 6.37s/cell]
[2025-09-16T21:40:09.192Z] Executing: 38%|███▊ | 12/32 [13:30<27:19, 81.95s/cell]
[2025-09-16T21:40:09.464Z] Executing: 44%|████▍ | 14/32 [14:27<20:56, 69.83s/cell]
[2025-09-16T21:40:12.780Z] Executing: 50%|█████ | 16/32 [14:27<14:03, 52.74s/cell]
[2025-09-16T21:40:27.806Z] Executing: 56%|█████▋ | 18/32 [14:30<09:12, 39.47s/cell]
[2025-09-16T21:40:27.806Z] Executing: 62%|██████▎ | 20/32 [14:43<06:07, 30.60s/cell]
[2025-09-16T21:40:30.448Z] Executing: 75%|███████▌ | 24/32 [14:45<02:19, 17.45s/cell]
[2025-09-16T21:40:30.448Z] Executing: 75%|███████▌ | 24/32 [14:48<04:56, 37.01s/cell]
[2025-09-16T21:40:30.734Z] /usr/local/lib/python3.10/dist-packages/papermill/iorw.py:149: UserWarning: the file is not specified with any extension : -
[2025-09-16T21:40:30.734Z] warnings.warn(f"the file is not specified with any extension : {os.path.basename(path)}")
[2025-09-16T21:40:30.734Z] Traceback (most recent call last):
[2025-09-16T21:40:30.734Z] File "/usr/local/bin/papermill", line 7, in <module>
[2025-09-16T21:40:30.734Z] sys.exit(papermill())
[2025-09-16T21:40:30.734Z] File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1157, in __call__
[2025-09-16T21:40:30.734Z] return self.main(*args, **kwargs)
[2025-09-16T21:40:30.734Z] File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1078, in main
[2025-09-16T21:40:30.734Z] rv = self.invoke(ctx)
[2025-09-16T21:40:30.734Z] File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1434, in invoke
[2025-09-16T21:40:30.734Z] return ctx.invoke(self.callback, **ctx.params)
[2025-09-16T21:40:30.734Z] File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 783, in invoke
[2025-09-16T21:40:30.734Z] return __callback(*args, **kwargs)
[2025-09-16T21:40:30.734Z] File "/usr/local/lib/python3.10/dist-packages/click/decorators.py", line 33, in new_func
[2025-09-16T21:40:30.734Z] return f(get_current_context(), *args, **kwargs)
[2025-09-16T21:40:30.734Z] File "/usr/local/lib/python3.10/dist-packages/papermill/cli.py", line 235, in papermill
[2025-09-16T21:40:30.734Z] execute_notebook(
[2025-09-16T21:40:30.734Z] File "/usr/local/lib/python3.10/dist-packages/papermill/execute.py", line 131, in execute_notebook
[2025-09-16T21:40:30.734Z] raise_for_execution_errors(nb, output_path)
[2025-09-16T21:40:30.734Z] File "/usr/local/lib/python3.10/dist-packages/papermill/execute.py", line 251, in raise_for_execution_errors
[2025-09-16T21:40:30.734Z] raise error
[2025-09-16T21:40:30.734Z] papermill.exceptions.PapermillExecutionError:
[2025-09-16T21:40:30.734Z] ---------------------------------------------------------------------------
[2025-09-16T21:40:30.734Z] Exception encountered at "In [12]":
[2025-09-16T21:40:30.734Z] ---------------------------------------------------------------------------
[2025-09-16T21:40:30.734Z] RuntimeError Traceback (most recent call last)
[2025-09-16T21:40:30.734Z] Cell In[12], line 28
[2025-09-16T21:40:30.734Z] 26 # Get model prediction
[2025-09-16T21:40:30.734Z] 27 noisy_img = scheduler.add_noise(images, noise, timesteps) # add t steps of noise to the input image
[2025-09-16T21:40:30.734Z] ---> 28 pred = classifier(noisy_img, timesteps)
[2025-09-16T21:40:30.734Z] 30 loss = F.cross_entropy(pred, classes.long())
[2025-09-16T21:40:30.734Z] 32 loss.backward()
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1736, in Module._wrapped_call_impl(self, *args, **kwargs)
[2025-09-16T21:40:30.734Z] 1734 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]
[2025-09-16T21:40:30.734Z] 1735 else:
[2025-09-16T21:40:30.734Z] -> 1736 return self._call_impl(*args, **kwargs)
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1747, in Module._call_impl(self, *args, **kwargs)
[2025-09-16T21:40:30.734Z] 1742 # If we don't have any hooks, we want to skip the rest of the logic in
[2025-09-16T21:40:30.734Z] 1743 # this function, and just call forward.
[2025-09-16T21:40:30.734Z] 1744 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
[2025-09-16T21:40:30.734Z] 1745 or _global_backward_pre_hooks or _global_backward_hooks
[2025-09-16T21:40:30.734Z] 1746 or _global_forward_hooks or _global_forward_pre_hooks):
[2025-09-16T21:40:30.734Z] -> 1747 return forward_call(*args, **kwargs)
[2025-09-16T21:40:30.734Z] 1749 result = None
[2025-09-16T21:40:30.734Z] 1750 called_always_called_hooks = set()
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /home/jenkins/agent/workspace/Monai-notebooks/MONAI/monai/networks/nets/diffusion_model_unet.py:2059, in DiffusionModelEncoder.forward(self, x, timesteps, context, class_labels)
[2025-09-16T21:40:30.734Z] 2055 if self.out is None:
[2025-09-16T21:40:30.734Z] 2056 self.out = nn.Sequential(
[2025-09-16T21:40:30.734Z] 2057 nn.Linear(h.shape[1], 512), nn.ReLU(), nn.Dropout(0.1), nn.Linear(512, self.out_channels)
[2025-09-16T21:40:30.734Z] 2058 )
[2025-09-16T21:40:30.734Z] -> 2059 output: torch.Tensor = self.out(h)
[2025-09-16T21:40:30.734Z] 2061 return output
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1736, in Module._wrapped_call_impl(self, *args, **kwargs)
[2025-09-16T21:40:30.734Z] 1734 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]
[2025-09-16T21:40:30.734Z] 1735 else:
[2025-09-16T21:40:30.734Z] -> 1736 return self._call_impl(*args, **kwargs)
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1747, in Module._call_impl(self, *args, **kwargs)
[2025-09-16T21:40:30.734Z] 1742 # If we don't have any hooks, we want to skip the rest of the logic in
[2025-09-16T21:40:30.734Z] 1743 # this function, and just call forward.
[2025-09-16T21:40:30.734Z] 1744 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
[2025-09-16T21:40:30.734Z] 1745 or _global_backward_pre_hooks or _global_backward_hooks
[2025-09-16T21:40:30.734Z] 1746 or _global_forward_hooks or _global_forward_pre_hooks):
[2025-09-16T21:40:30.734Z] -> 1747 return forward_call(*args, **kwargs)
[2025-09-16T21:40:30.734Z] 1749 result = None
[2025-09-16T21:40:30.734Z] 1750 called_always_called_hooks = set()
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/container.py:250, in Sequential.forward(self, input)
[2025-09-16T21:40:30.734Z] 248 def forward(self, input):
[2025-09-16T21:40:30.734Z] 249 for module in self:
[2025-09-16T21:40:30.734Z] --> 250 input = module(input)
[2025-09-16T21:40:30.734Z] 251 return input
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1736, in Module._wrapped_call_impl(self, *args, **kwargs)
[2025-09-16T21:40:30.734Z] 1734 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]
[2025-09-16T21:40:30.734Z] 1735 else:
[2025-09-16T21:40:30.734Z] -> 1736 return self._call_impl(*args, **kwargs)
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py:1747, in Module._call_impl(self, *args, **kwargs)
[2025-09-16T21:40:30.734Z] 1742 # If we don't have any hooks, we want to skip the rest of the logic in
[2025-09-16T21:40:30.734Z] 1743 # this function, and just call forward.
[2025-09-16T21:40:30.734Z] 1744 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
[2025-09-16T21:40:30.734Z] 1745 or _global_backward_pre_hooks or _global_backward_hooks
[2025-09-16T21:40:30.734Z] 1746 or _global_forward_hooks or _global_forward_pre_hooks):
[2025-09-16T21:40:30.734Z] -> 1747 return forward_call(*args, **kwargs)
[2025-09-16T21:40:30.734Z] 1749 result = None
[2025-09-16T21:40:30.734Z] 1750 called_always_called_hooks = set()
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /usr/local/lib/python3.10/dist-packages/torch/nn/modules/linear.py:125, in Linear.forward(self, input)
[2025-09-16T21:40:30.734Z] 124 def forward(self, input: Tensor) -> Tensor:
[2025-09-16T21:40:30.734Z] --> 125 return F.linear(input, self.weight, self.bias)
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /home/jenkins/agent/workspace/Monai-notebooks/MONAI/monai/data/meta_tensor.py:283, in MetaTensor.__torch_function__(cls, func, types, args, kwargs)
[2025-09-16T21:40:30.734Z] 281 if kwargs is None:
[2025-09-16T21:40:30.734Z] 282 kwargs = {}
[2025-09-16T21:40:30.734Z] --> 283 ret = super().__torch_function__(func, types, args, kwargs)
[2025-09-16T21:40:30.734Z] 284 # if `out` has been used as argument, metadata is not copied, nothing to do.
[2025-09-16T21:40:30.734Z] 285 # if "out" in kwargs:
[2025-09-16T21:40:30.734Z] 286 # return ret
[2025-09-16T21:40:30.734Z] 287 if _not_requiring_metadata(ret):
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] File /usr/local/lib/python3.10/dist-packages/torch/_tensor.py:1453, in Tensor.__torch_function__(cls, func, types, args, kwargs)
[2025-09-16T21:40:30.734Z] 1450 return NotImplemented
[2025-09-16T21:40:30.734Z] 1452 with _C.DisableTorchFunctionSubclass():
[2025-09-16T21:40:30.734Z] -> 1453 ret = func(*args, **kwargs)
[2025-09-16T21:40:30.734Z] 1454 if func in get_default_nowrap_functions():
[2025-09-16T21:40:30.734Z] 1455 return ret
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument mat1 in method wrapper_CUDA_addmm)
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z]
[2025-09-16T21:40:30.734Z] real 14m49.487s
[2025-09-16T21:40:30.734Z] user 14m4.541s
[2025-09-16T21:40:30.734Z] sys 6m29.186s
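The RuntimeError points at the lazily constructed classification head in `DiffusionModelEncoder.forward` (lines 2055-2059 of the traceback): `self.out` is built with `nn.Sequential(...)` during the first forward pass, so its `Linear` weights are allocated on the CPU, while the activations `h` arrive on `cuda:0`. A minimal sketch of the pattern and the likely fix — moving the freshly built head onto the input's device — using a simplified hypothetical module rather than the real MONAI class:

```python
import torch
import torch.nn as nn


class LazyHeadEncoder(nn.Module):
    """Hypothetical reduction of the lazy-head pattern seen in the traceback.

    The output head is built on the first forward pass, once the flattened
    feature size h.shape[1] is known. Without the `.to(h.device)` call the
    new Linear layers stay on the CPU and F.linear raises the same
    "Expected all tensors to be on the same device" error on CUDA inputs.
    """

    def __init__(self, out_channels: int):
        super().__init__()
        self.out_channels = out_channels
        self.out = None  # built lazily below

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        if self.out is None:
            self.out = nn.Sequential(
                nn.Linear(h.shape[1], 512),
                nn.ReLU(),
                nn.Dropout(0.1),
                nn.Linear(512, self.out_channels),
            ).to(h.device)  # fix: place the new head on the input's device
        return self.out(h)
```

A workaround on the notebook side, assuming the class itself is not patched, would be to run one warm-up forward pass on CPU (so the head gets built) and only then call `classifier.to(device)`.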