Description
I'm running inference on the pre-trained weights following the README.md and the notebook instructions in https://github.com/Project-MONAI/tutorials/blob/main/generation/maisi/maisi_inference_tutorial.ipynb
When running the maisi3d-rflow version, the data are generated without any problem:
python -m scripts.inference -c ./configs/config_maisi3d-rflow.json -i ./configs/config_infer.json -e ./configs/environment_maisi3d-rflow.json --random-seed 0 --version maisi3d-rflow
When running the maisi3d-ddpm version:
python -m scripts.inference -c ./configs/config_maisi3d-ddpm.json -i ./configs/config_infer.json -e ./configs/environment_maisi3d-ddpm.json --random-seed 0 --version maisi3d-ddpm
I get the following error:
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/ubuntu/anaconda3/envs/maisi/lib/python3.12/site-packages/torch/nn/modules/module.py", line 2629, in load_state_dict
    raise RuntimeError(
RuntimeError: Error(s) in loading state_dict for DiffusionModelUNetMaisi:
	Unexpected key(s) in state_dict: "down_blocks.2.attentions.0.attn.out_proj.weight", "down_blocks.2.attentions.0.attn.out_proj.bias", "down_blocks.2.attentions.1.attn.out_proj.weight", "down_blocks.2.attentions.1.attn.out_proj.bias", "down_blocks.3.attentions.0.attn.out_proj.weight", "down_blocks.3.attentions.0.attn.out_proj.bias", "down_blocks.3.attentions.1.attn.out_proj.weight", "down_blocks.3.attentions.1.attn.out_proj.bias", "middle_block.attention.attn.out_proj.weight", "middle_block.attention.attn.out_proj.bias", "up_blocks.0.attentions.0.attn.out_proj.weight", "up_blocks.0.attentions.0.attn.out_proj.bias", "up_blocks.0.attentions.1.attn.out_proj.weight", "up_blocks.0.attentions.1.attn.out_proj.bias", "up_blocks.0.attentions.2.attn.out_proj.weight", "up_blocks.0.attentions.2.attn.out_proj.bias", "up_blocks.1.attentions.0.attn.out_proj.weight", "up_blocks.1.attentions.0.attn.out_proj.bias", "up_blocks.1.attentions.1.attn.out_proj.weight", "up_blocks.1.attentions.1.attn.out_proj.bias", "up_blocks.1.attentions.2.attn.out_proj.weight", "up_blocks.1.attentions.2.attn.out_proj.bias".
There is a mismatch between the loaded weights and the instantiated model: the checkpoint contains extra attention projection keys that the model does not expect. As config files, I'm using the defaults provided in the repository; I only modified the spacing in config_infer.json:
{ "num_output_samples": 1, "body_region": ["chest"], "anatomy_list": ["lung tumor"], "controllable_anatomy_size": [], "num_inference_steps": 1000, "mask_generation_num_inference_steps": 1000, "output_size": [ 256, 256, 256 ], "image_output_ext": ".nii.gz", "label_output_ext": ".nii.gz", "spacing": [ 1.0, 1.0, 1.0 ], "autoencoder_sliding_window_infer_size": [48,48,48], "autoencoder_sliding_window_infer_overlap": 0.6666, "controlnet": "$@controlnet_def", "diffusion_unet": "$@diffusion_unet_def", "autoencoder": "$@autoencoder_def", "mask_generation_autoencoder": "$@mask_generation_autoencoder_def", "mask_generation_diffusion": "$@mask_generation_diffusion_def", "modality": 1 }
Environment
- OS: Linux
- Python version: 3.12
- MONAI version: 1.5.1
- CUDA version: 12.4
- GPU: A100, 40 GB VRAM