Hi, I want to test the WHU dataset. When I run:
python ddpm_cd.py --config config/whu_test.json --phase test -enable_wandb -log_eval
I get the following error. Could you tell me how to fix it? Thanks!
22-10-10 16:44:17.159 - INFO: Model [DDPM] is created.
22-10-10 16:44:17.159 - INFO: Initial Diffusion Model Finished
22-10-10 16:44:17.591 - INFO: Loading pretrained model for CD model [experiments/ddpm-RS-CDHead-WHU_221008_144806/checkpoint/cd_model_E79] ...
Traceback (most recent call last):
File "ddpm_cd.py", line 96, in
change_detection = Model.create_CD_model(opt)
File "/data/chengxi.han/Sigma124/ddpm-cd/model/init.py", line 13, in create_CD_model
m = M(opt)
File "/data/chengxi.han/Sigma124/ddpm-cd/model/cd_model.py", line 49, in init
self.load_network()
File "/data/chengxi.han/Sigma124/ddpm-cd/model/cd_model.py", line 161, in load_network
network.load_state_dict(torch.load(
File "/data/chengxi.han/anaconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1604, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for cd_head_v2:
size mismatch for decoder.0.block.0.weight: copying a param with shape torch.Size([1024, 3072, 1, 1]) from checkpoint, the shape in current model is torch.Size([1024, 2048, 1, 1]).
size mismatch for decoder.2.block.0.weight: copying a param with shape torch.Size([1024, 3072, 1, 1]) from checkpoint, the shape in current model is torch.Size([1024, 2048, 1, 1]).
size mismatch for decoder.4.block.0.weight: copying a param with shape torch.Size([512, 1536, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1024, 1, 1]).
size mismatch for decoder.6.block.0.weight: copying a param with shape torch.Size([256, 768, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 512, 1, 1]).
size mismatch for decoder.8.block.0.weight: copying a param with shape torch.Size([128, 384, 1, 1]) from checkpoint, the shape in current model is torch.Size([128, 256, 1, 1]).
wandb: Waiting for W&B process to finish... (failed 1). Press Control-C to abort syncing.
wandb: Synced jolly-bee-43: https://wandb.ai/sigmahan/ddpm-RS-CDHead/runs/3h7lld6h
wandb: Synced 6 W&B file(s), 0 media file(s), 0 artifact file(s) and 0 other file(s)
wandb: Find logs at: ./experiments/wandb/run-20221010_164400-3h7lld6h/logs
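
One observation that might help: every mismatched layer in the checkpoint is exactly 1.5x wider than in the current model (3072 = 3 x 1024 vs 2048 = 2 x 1024, 1536 vs 1024, and so on), so the checkpoint appears to have been trained with three concatenated diffusion feature maps per decoder scale while the model built from config/whu_test.json expects two. That is only a guess from the shapes, not a confirmed cause. The snippet below is a minimal diagnostic sketch, not part of ddpm-cd: it only assumes PyTorch is installed and that the checkpoint file from the log exists (the exact filename below is a placeholder; point it at the actual .pth saved under the checkpoint folder). It prints the decoder weight shapes stored in the checkpoint so they can be compared against the test config.

import torch

# Minimal diagnostic sketch (hypothetical, not part of ddpm-cd): print the
# shapes of the decoder weights stored in the checkpoint so they can be
# compared with the shapes the current model expects (see the RuntimeError).
# NOTE: placeholder path based on the log above; adjust it to the actual
# .pth file saved under experiments/.../checkpoint/.
ckpt_path = "experiments/ddpm-RS-CDHead-WHU_221008_144806/checkpoint/cd_model_E79.pth"

state_dict = torch.load(ckpt_path, map_location="cpu")
for name, param in state_dict.items():
    if "decoder" in name and name.endswith("block.0.weight"):
        # e.g. decoder.0.block.0.weight -> (1024, 3072, 1, 1)
        print(name, tuple(param.shape))

Running this should reproduce the [1024, 3072, 1, 1]-style shapes from the error, which can then be checked against the feature settings (for example, the list of diffusion time steps whose features are concatenated) in config/whu_test.json versus the config used when the checkpoint was trained.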