KeyError: 'gaussiannoise_channels' #17

Open
borisandeva opened this issue Dec 21, 2023 · 0 comments

Hi, I was trying to follow the semi-supervised example (AtriaSeg dataset). When I ran the following command, I got an error:

 pymic_train config/unet3d_r10_em.cfg
torch.backends.cuda.matmul.allow_tf32 = True by default.
  This value defaults to True when PyTorch version in [1.7, 1.11] and may affect precision.
  See https://docs.monai.io/en/latest/precision_accelerating.html#precision-and-accelerating
dataset tensor_type float
dataset task_type seg
dataset supervise_type semi_sup
dataset root_dir ../../PyMIC_data/AtriaSeg/TrainingSet_crop/
dataset train_csv config/data/image_train_r10_lab.csv
dataset train_csv_unlab config/data/image_train_r10_unlab.csv
dataset valid_csv config/data/image_valid.csv
dataset test_csv config/data/image_test.csv
dataset train_batch_size 2
dataset train_batch_size_unlab 2
dataset train_transform ['RandomCrop', 'RandomFlip', 'NormalizeWithMeanStd', 'GammaCorrection', 'GaussianNoise', 'LabelToProbability']

dataset train_transform_unlab ['RandomCrop', 'RandomFlip', 'NormalizeWithMeanStd', 'GammaCorrection', 'GaussianNoise']
dataset valid_transform ['NormalizeWithMeanStd', 'LabelToProbability']
dataset test_transform ['NormalizeWithMeanStd']
dataset randomcrop_output_size [72, 96, 112]
dataset randomcrop_foreground_focus False
dataset randomcrop_foreground_ratio None
dataset randomcrop_mask_label None
dataset randomflip_flip_depth False
dataset randomflip_flip_height True
dataset randomflip_flip_width True
dataset normalizewithmeanstd_channels [0]
dataset gammacorrection_channels [0]
dataset gammacorrection_gamma_min 0.7
dataset gammacorrection_gamma_max 1.5
network net_type UNet3D
network class_num 2
network in_chns 1
network feature_chns [32, 64, 128, 256]
network dropout [0.0, 0.0, 0.5, 0.5]
network trilinear True
network multiscale_pred False
training gpus [1]
training loss_type ['DiceLoss', 'CrossEntropyLoss']
training loss_weight [0.5, 0.5]
training optimizer Adam
training learning_rate 0.001
training momentum 0.9
training weight_decay 1e-05
training lr_scheduler ReduceLROnPlateau
training lr_gamma 0.5
training reducelronplateau_patience 2000
training early_stop_patience 5000
training ckpt_save_dir model/unet3d_r10_em
training iter_max 20000
training iter_valid 100
training iter_save [1000, 20000]
semi_supervised_learning method_name EntropyMinimization
semi_supervised_learning regularize_w 0.1
semi_supervised_learning rampup_start 1000
semi_supervised_learning rampup_end 15000
testing gpus [1]
testing ckpt_mode 1
testing output_dir result/unet3d_r10_em
testing post_process None
testing sliding_window_enable False
dataset tensor_type = float
dataset task_type = seg
dataset supervise_type = semi_sup
dataset root_dir = ../../PyMIC_data/AtriaSeg/TrainingSet_crop/
dataset train_csv = config/data/image_train_r10_lab.csv
dataset train_csv_unlab = config/data/image_train_r10_unlab.csv
dataset valid_csv = config/data/image_valid.csv
dataset test_csv = config/data/image_test.csv
dataset train_batch_size = 2
dataset train_batch_size_unlab = 2
dataset train_transform = ['RandomCrop', 'RandomFlip', 'NormalizeWithMeanStd', 'GammaCorrection', 'GaussianNoise', 'LabelToProbability']
dataset train_transform_unlab = ['RandomCrop', 'RandomFlip', 'NormalizeWithMeanStd', 'GammaCorrection', 'GaussianNoise']
dataset valid_transform = ['NormalizeWithMeanStd', 'LabelToProbability']
dataset test_transform = ['NormalizeWithMeanStd']
dataset randomcrop_output_size = [72, 96, 112]
dataset randomcrop_foreground_focus = False
dataset randomcrop_foreground_ratio = None
dataset randomcrop_mask_label = None
dataset randomflip_flip_depth = False
dataset randomflip_flip_height = True
dataset randomflip_flip_width = True
dataset normalizewithmeanstd_channels = [0]
dataset gammacorrection_channels = [0]
dataset gammacorrection_gamma_min = 0.7
dataset gammacorrection_gamma_max = 1.5
dataset labeltoprobability_class_num = 2
network net_type = UNet3D
network class_num = 2
network in_chns = 1
network feature_chns = [32, 64, 128, 256]
network dropout = [0.0, 0.0, 0.5, 0.5]
network trilinear = True
network multiscale_pred = False
training gpus = [1]
training loss_type = ['DiceLoss', 'CrossEntropyLoss']
training loss_weight = [0.5, 0.5]
training optimizer = Adam
training learning_rate = 0.001
training momentum = 0.9
training weight_decay = 1e-05
training lr_scheduler = ReduceLROnPlateau
training lr_gamma = 0.5
training reducelronplateau_patience = 2000
training early_stop_patience = 5000
training ckpt_save_dir = model/unet3d_r10_em
training iter_max = 20000
training iter_valid = 100
training iter_save = [1000, 20000]
semi_supervised_learning method_name = EntropyMinimization
semi_supervised_learning regularize_w = 0.1
semi_supervised_learning rampup_start = 1000
semi_supervised_learning rampup_end = 15000
testing gpus = [1]
testing ckpt_mode = 1
testing output_dir = result/unet3d_r10_em
testing post_process = None
testing sliding_window_enable = False

********** Semi Supervised Learning **********

deterministric is true
Traceback (most recent call last):
  File "/gpu_home/bori/miniconda3/envs/pymic3/bin/pymic_train", line 8, in <module>
    sys.exit(main())
  File "/gpu_home/bori/miniconda3/envs/pymic3/lib/python3.9/site-packages/pymic/net_run/train.py", line 95, in main
    agent.run()
  File "/gpu_home/bori/miniconda3/envs/pymic3/lib/python3.9/site-packages/pymic/net_run/agent_abstract.py", line 311, in run
    self.create_dataset()
  File "/gpu_home/bori/miniconda3/envs/pymic3/lib/python3.9/site-packages/pymic/net_run/semi_sup/ssl_abstract.py", line 64, in create_dataset
    super(SSLSegAgent, self).create_dataset()
  File "/gpu_home/bori/miniconda3/envs/pymic3/lib/python3.9/site-packages/pymic/net_run/agent_abstract.py", line 247, in create_dataset
    self.train_set = self.get_stage_dataset_from_config('train')
  File "/gpu_home/bori/miniconda3/envs/pymic3/lib/python3.9/site-packages/pymic/net_run/agent_seg.py", line 61, in get_stage_dataset_from_config
    one_transform = self.transform_dict[name](transform_param)
  File "/gpu_home/bori/miniconda3/envs/pymic3/lib/python3.9/site-packages/pymic/transform/intensity.py", line 103, in __init__
    self.channels = params['GaussianNoise_channels'.lower()]
KeyError: 'gaussiannoise_channels'
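
For context, the traceback shows GaussianNoise.__init__ looking up params['GaussianNoise_channels'.lower()], and the dataset section printed above has gammacorrection_* keys but no gaussiannoise_* keys at all, so the KeyError looks like missing config entries rather than a code bug. Below is my guess at the missing lines, written by analogy with the GammaCorrection ones; only GaussianNoise_channels is confirmed by the traceback, and the mean/std/probability names and values are assumptions I have not checked against the PyMIC docs or example configs:

[dataset]
# ... existing keys unchanged ...
GaussianNoise_channels    = [0]
# The entries below are guesses by analogy with GammaCorrection_*; please verify the exact names.
GaussianNoise_mean        = 0
GaussianNoise_std         = 0.05
GaussianNoise_probability = 0.5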

I had to upgrade PyTorch to a newer version because of my CUDA version (11.6). Now it is:
torch==1.11.0+cu113
torchvision==0.12.0+cu113
pymic==0.4.0

and Python is 3.9.
