Missing 1 required positional argument: 'temperature'? #28

Open
ZeroOneGame opened this issue Aug 26, 2021 · 3 comments

ZeroOneGame commented Aug 26, 2021

I got this error (full output below), but when I check the `temperature` argument in search.yaml, it is present. Could someone give me a hand?

My platform: Ubuntu 18.04 with 4x2080Ti GPUs and PyTorch 1.6.

(Tom_envs) Tom@Tom:~/code/Tom/Seg_AutoAlbument$ autoalbument-search --config-dir /home/Tom/code/Tom/Seg_AutoAlbument/sub_Seg_data_aug
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/autoalbument/cli/search.py:15: UserWarning: register_resolver() is deprecated.
See https://github.com/omry/omegaconf/issues/426 for migration instructions.

  OmegaConf.register_resolver("config_dir", get_config_dir)
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'hydra/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'logger/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'callbacks/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'optim/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'trainer/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'searcher/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'data/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'semantic_segmentation_model/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'classification_model/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/core/default_element.py:126: UserWarning: In 'policy_model/default': Usage of deprecated keyword in package header '# @package _group_'.
See https://hydra.cc/docs/next/upgrades/1.0_to_1.1/changes_to_package_header for more information
  See {url} for more information"""
_version: 2
task: semantic_segmentation
policy_model:
  task_factor: 0.1
  gp_factor: 10
  temperature: 0.05
  num_sub_policies: 25
  num_chunks: 4
  operation_count: 4
  operations:
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.ShiftRGB
    shift_r: true
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.ShiftRGB
    shift_g: true
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.ShiftRGB
    shift_b: true
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.RandomBrightness
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.RandomContrast
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.Solarize
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.HorizontalFlip
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.VerticalFlip
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.Rotate
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.ShiftX
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.ShiftY
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.Scale
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.CutoutFixedNumberOfHoles
  - _target_: autoalbument.faster_autoaugment.models.policy_operations.CutoutFixedSize
semantic_segmentation_model:
  _target_: autoalbument.faster_autoaugment.models.SemanticSegmentationModel
  num_classes: 3
  architecture: Unet
  encoder_architecture: resnet18
  pretrained: true
data:
  dataset:
    _target_: dataset.SearchDataset
  input_dtype: uint8
  preprocessing: null
  normalization:
    mean:
    - 0.485
    - 0.456
    - 0.406
    std:
    - 0.229
    - 0.224
    - 0.225
  dataloader:
    _target_: torch.utils.data.DataLoader
    batch_size: 4
    shuffle: true
    num_workers: 8
    pin_memory: true
    drop_last: true
searcher:
  _target_: autoalbument.faster_autoaugment.search.FasterAutoAugmentSearcher
trainer:
  _target_: pytorch_lightning.Trainer
  gpus: 3
  benchmark: true
  max_epochs: 20
  resume_from_checkpoint: null
optim:
  main:
    _target_: torch.optim.Adam
    lr: 0.001
    betas:
    - 0
    - 0.999
  policy:
    _target_: torch.optim.Adam
    lr: 0.001
    betas:
    - 0
    - 0.999
callbacks:
- _target_: autoalbument.callbacks.MonitorAverageParameterChange
- _target_: autoalbument.callbacks.SavePolicy
- _target_: pytorch_lightning.callbacks.ModelCheckpoint
  save_last: true
  dirpath: checkpoints
logger:
  _target_: pytorch_lightning.loggers.TensorBoardLogger
  save_dir: /home/Tom/code/Tom/Seg_AutoAlbument/sub_Seg_data_aug/outputs/2021-08-26/19-55-28/tensorboard_logs
seed: 42

Working directory: /home/Tom/code/Tom/Seg_AutoAlbument/sub_Seg_data_aug/outputs/2021-08-26/19-55-28
Error executing job with overrides: []
Traceback (most recent call last):
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 62, in _call_target
    return _target_(*args, **kwargs)
TypeError: __init__() missing 1 required positional argument: 'temperature'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/autoalbument/cli/search.py", line 54, in main
    searcher = instantiate(cfg.searcher, cfg=cfg)
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 180, in instantiate
    return instantiate_node(config, *args, recursive=_recursive_, convert=_convert_)
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 246, in instantiate_node
    value, convert=convert, recursive=recursive
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 267, in instantiate_node
    value, convert=convert, recursive=recursive
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 267, in instantiate_node
    value, convert=convert, recursive=recursive
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 225, in instantiate_node
    for item in node._iter_ex(resolve=True)
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 225, in <listcomp>
    for item in node._iter_ex(resolve=True)
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 249, in instantiate_node
    return _call_target(_target_, *args, **kwargs)
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 66, in _call_target
    ).with_traceback(sys.exc_info()[2])
  File "/home/Tom/anaconda3/envs/Tom_envs/lib/python3.7/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 62, in _call_target
    return _target_(*args, **kwargs)
TypeError: Error instantiating 'autoalbument.faster_autoaugment.models.policy_operations.ShiftRGB' : __init__() missing 1 required positional argument: 'temperature'

Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
(Tom_envs) Tom@Tom:~/code/Tom/Seg_AutoAlbument$

Here is my search.yaml:

# @package _global_

_version: 2  # An internal value that indicates a version of the config schema. This value is used by
# `autoalbument-search` and `autoalbument-migrate` to upgrade the config to the latest version if necessary.
# Please do not change it manually.


task: semantic_segmentation # Deep learning task. Should either be `classification` or `semantic_segmentation`.


policy_model:
  # Settings for Policy Model that searches augmentation policies.

  temperature: 0.05
  # Temperature for Relaxed Bernoulli distribution. The probability of applying a certain augmentation is sampled from
  # Relaxed Bernoulli distribution (because Bernoulli distribution is not differentiable). With lower values of
  # `temperature` Relaxed Bernoulli distribution behaves like Bernoulli distribution. In the paper, the authors
  # of Faster AutoAugment used 0.05 as a default value for `temperature`. (A short PyTorch illustration of this
  # parameter is included right after this config.)

  task_factor: 0.1
  # Multiplier for segmentation loss of a model. Faster AutoAugment uses segmentation loss to prevent augmentations
  # from transforming images of a particular class to another class.

  gp_factor: 10
  # Multiplier for the gradient penalty for WGAN-GP training. 10 is the default value that was proposed in
  # `Improved Training of Wasserstein GANs`.


  num_sub_policies: 25
  # Number of augmentation sub-policies. When an image passes through an augmentation pipeline, Faster AutoAugment
  # randomly chooses one sub-policy and uses augmentations from that sub-policy to transform an input image. A larger
  # number of sub-policies leads to a more diverse set of augmentations and better performance of a model trained on
  # augmented images. However, an increase in the number of sub-policies leads to the exponential growth of a search
  # space of augmentations, so you need more training data for Policy Model to find good augmentation policies.

  num_chunks: 4
  # Number of chunks in a batch. Faster AutoAugment splits each batch of images into `num_chunks` chunks. Then it
  # applies the same sub-policy with the same parameters to each image in a chunk. This parameter controls the tradeoff
  # between the speed of augmentation search and the diversity of augmentations. Larger `num_chunks` values lead to
  # faster searching but a less diverse set of augmentations. Note that this parameter is used only in the searching
  # phase. When you train a model with found sub-policies, Albumentations will apply a distinct set of transformations
  # to each image separately.

  operation_count: 4
  # Number of consecutive augmentations in each sub-policy. Faster AutoAugment will sequentially apply `operation_count`
  # augmentations from a sub-policy to an image. Larger values of `operation_count` lead to better performance of
  # a model trained on augmented images. At the same time, larger values of `operation_count` slow down the search
  # and increase the overall searching time.

semantic_segmentation_model:
# Settings for Semantic Segmentation Model that is used for two purposes:
# 1. As a model that performs semantic segmentation of input images.
# 2. As a Discriminator for Policy Model.
  _target_: autoalbument.faster_autoaugment.models.SemanticSegmentationModel
  # By default, AutoAlbument uses an instance of `autoalbument.faster_autoaugment.models.SemanticSegmentationModel` as
  # a semantic segmentation model.
  # This model takes four parameters: `num_classes`, `architecture`, `encoder_architecture` and `pretrained`.

  num_classes: 3
  # The number of classes in the dataset. The dataset implementation should return a mask as a NumPy array with
  # the shape [height, width, num_classes]. In the case of binary segmentation you can set `num_classes` to 1.

  architecture: Unet
  # The architecture of Semantic Segmentation Model. AutoAlbument uses models from
  # https://github.com/qubvel/segmentation_models.pytorch. Please refer to its documentation to get a list of available
  # models - https://github.com/qubvel/segmentation_models.pytorch#models-.

  encoder_architecture: resnet18
  # The architecture of encoder in Semantic Segmentation Model. Please refer to Segmentation Models' documentation to
  # get a list of available encoders - https://github.com/qubvel/segmentation_models.pytorch#encoders-

  pretrained: true
  # Either a boolean flag or a string that indicates whether the selected encoder architecture should load pretrained
  # weights or use randomly initialized weights.
  # - In the case of a boolean flag, `true` means using pretrained weights from ImageNet and `false` means using
  #   randomly initialized weights.
  # - In the case of a string, the value should specify the name of the weights. For the list of available weights
  #   please refer to https://github.com/qubvel/segmentation_models.pytorch#encoders-


data:
  dataset:
    _target_: dataset.SearchDataset
  # Class for instantiating a PyTorch dataset.

  normalization:
    mean: [0.485, 0.456, 0.406]
    std: [0.229, 0.224, 0.225]
  # Normalization values for images. For each image, the search pipeline will subtract `mean` and divide by `std`.
  # Normalization is applied after transforms defined in `preprocessing`. Note that regardless of `input_dtype`,
  # the normalization function will always receive a `float32` input with values in the range [0.0, 1.0], so you should
  # define `mean` and `std` values accordingly. ImageNet normalization is used by default.


  dataloader:
    batch_size: 4
trainer:
  gpus: 3
  # Number of GPUs to train on. Set to `0` or `None` to use CPU for training.
  # More detailed description - https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html#gpus

  benchmark: true
  # If true enables cudnn.benchmark.
  # More detailed description - https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html#benchmark

  max_epochs: 20
  # Number of epochs to search for augmentation parameters.
  # More detailed description - https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html#max-epochs
PickHub commented Sep 2, 2021

I think this is a hydra-core issue. It was fixed for me by:

pip install hydra-core==1.0.6
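
A likely reason why the downgrade helps (my reading of the traceback, stated as an assumption): Hydra 1.1 made `instantiate()` recursive by default, so every nested `_target_` node reached through `instantiate(cfg.searcher, cfg=cfg)` is built immediately, including the `policy_model.operations` entries, which carry no `temperature` of their own. The repro below only mimics that shape; the class names are made up.

```python
# Hypothetical minimal repro (my assumption about the root cause, not code from
# AutoAlbument). Run as a script so that "__main__.<name>" targets resolve.
from dataclasses import dataclass
from typing import Any

from hydra.utils import instantiate


@dataclass
class ShiftLikeOp:
    temperature: float           # required, like ShiftRGB in the traceback
    shift_r: bool = False


@dataclass
class SearcherLike:
    cfg: Any = None


nested_cfg = {
    "_target_": "__main__.SearcherLike",
    "cfg": {
        "operations": [{"_target_": "__main__.ShiftLikeOp", "shift_r": True}],
    },
}

# hydra-core 1.0.x did not recurse into nested configs, so (presumably) the
# caller could still build each operation itself and pass `temperature`
# explicitly. With hydra-core 1.1.x the call below already tries to build
# ShiftLikeOp and fails with the same error as in this issue:
instantiate(nested_cfg)
# TypeError: Error instantiating '__main__.ShiftLikeOp' :
# __init__() missing 1 required positional argument: 'temperature'
```
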

ihamdi commented Nov 23, 2021

This is what I had to do to get the cifar10 example to work (a quick version-check sketch follows the list):

  • Upgrade Python from 3.6.13 to 3.7.11
  • Downgrade PyTorch from 1.10.0 to 1.8.0
  • Downgrade hydra-core from 1.1.0 to 1.0.6
  • Change line 6 in envs/name/lib/site-packages/timm/models/layers/helper.py from "from torch._six import container_abcs" to "import collections.abc as container_abcs"
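
For reference, a quick sanity check (my own sketch; the pins are just the versions reported in this comment, not an official compatibility matrix):

```python
# Compares the installed versions against the ones reported as working above.
# `pkg_resources` is used because it is available on Python 3.7.
import platform
import pkg_resources

reported_working = {"torch": "1.8.0", "hydra-core": "1.0.6"}

print(f"python: {platform.python_version()} (3.7.11 reported working)")
for pkg, want in reported_working.items():
    try:
        have = pkg_resources.get_distribution(pkg).version
    except pkg_resources.DistributionNotFound:
        have = "not installed"
    print(f"{pkg}: installed={have}, reported working={want}")
```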

fbosshard commented

With the release of segmentation-models-pytorch version 0.3.0 on 2022-07-29, a new error occurs:

cannot import name 'ByoModelCfg' from 'timm.models' (/home/user/.venv/envname/lib/python3.8/site-packages/timm/models/__init__.py) when loading module 'autoalbument.faster_autoaugment.search.FasterAutoAugmentSearcher'

This can be fixed by pinning the older version:
pip install segmentation-models-pytorch==0.2.1
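
A small guard (my own sketch, not from AutoAlbument) that surfaces this pin when the import error above shows up:

```python
# Sketch only: catch the ImportError quoted above and point at the suggested pin.
# The import path is the one referenced by the `searcher` config in this issue.
try:
    from autoalbument.faster_autoaugment.search import FasterAutoAugmentSearcher  # noqa: F401
except ImportError as err:  # e.g. cannot import name 'ByoModelCfg' from 'timm.models'
    raise SystemExit(
        f"{err}\n"
        "Newer segmentation-models-pytorch / timm combinations break this import; "
        "try: pip install segmentation-models-pytorch==0.2.1"
    )
```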
