
[Enhance] Random generator refactor #1459

Merged · 44 commits · Dec 2, 2021

Conversation

@shijianjian (Member) commented Nov 16, 2021

Changes

This PR refactored the random generator module, mainly by moving all parameter validation and distribution generation into `__init__`, which removes unnecessary object reinitialization.
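The pattern behind the refactor can be sketched in plain Python. This is a hypothetical, simplified stand-in (stdlib `random` in place of torch distributions, and the class name is invented), not the actual kornia code:

```python
import random

class EagerAngleGenerator:
    """Illustrates the refactor pattern: parameter validation and
    sampler construction happen once in __init__, not on every call."""

    def __init__(self, degrees=(-45.0, 45.0)):
        lo, hi = degrees
        if lo > hi:  # validated once at construction time
            raise ValueError(f"degrees must be ordered, got {degrees}")
        self._lo, self._hi = lo, hi
        self._rng = random.Random(0)  # sampler built once, reused per call

    def forward(self, batch_size):
        # per-call work is now just drawing the samples
        return [self._rng.uniform(self._lo, self._hi) for _ in range(batch_size)]
```

Before the refactor, the equivalent validation and sampler setup would run inside every `forward` call; hoisting it into `__init__` amortizes that cost across batches.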

Benchmark

Here we benchmarked Torchvision's default CPU augmentations against Kornia's GPU augmentations, on a Google Colab K80 GPU with different batch sizes (unit = ms; bs = batch size).

The results record the mean per-sample augmentation speed. Kornia achieves better results at larger batch sizes.

| ops | torchvision | kornia (bs=1) |
|---|---|---|
| RandomPerspective | 4.88±1.82 | 4.74±2.84 |
| ColorJitter | 4.40±2.88 | 4.14±3.85 |
| RandomAffine | 3.12±5.80 | 3.01±7.80 |
| RandomVerticalFlip | 0.32±0.08 | 0.35±0.82 |
| RandomHorizontalFlip | 0.32±0.08 | 0.31±0.59 |
| RandomRotate | 1.82±4.70 | 1.58±4.44 |
| RandomCrop | 4.09±3.41 | 3.84±3.07 |
| RandomErasing | 2.31±1.47 | 2.32±3.31 |
| RandomGrayscale | 0.41±0.18 | 0.45±1.20 |
| RandomResizedCrop | 4.23±2.86 | 4.07±2.67 |
| RandomCenterCrop | 2.93±1.29 | 2.88±2.34 |
| ops | kornia (bs=128) | THIS_PR (bs=128) | kornia (bs=32) | THIS_PR (bs=32) |
|---|---|---|---|---|
| RandomPerspective | 0.19±2.31 | 0.20±27.00 | 0.35±2.06 | 0.37±2.67 |
| ColorJitter | 0.84±12.85 | 0.83±12.96 | 0.89±17.12 | 0.90±24.68 |
| RandomAffine | 0.18±7.34 | 0.18±6.30 | 0.30±4.23 | 0.30±4.39 |
| RandomVerticalFlip | 0.01±0.49 | 0.01±0.35 | 0.02±0.11 | 0.02±0.13 |
| RandomHorizontalFlip | 0.01±2.29 | 0.01±0.37 | 0.02±1.00 | 0.01±0.26 |
| RandomRotate | 0.18±28.17 | 0.17±5.69 | 0.25±3.33 | 0.25±2.09 |
| RandomCrop | 0.07±2.76 | 0.08±9.42 | 0.17±6.32 | 0.16±1.17 |
| RandomErasing | 0.56±7.02 | 0.57±9.74 | 0.45±4.78 | 0.44±2.82 |
| RandomGrayscale | 0.03±0.10 | 0.03±7.10 | 0.03±0.13 | 0.03±0.11 |
| RandomResizedCrop | 0.13±6.88 | 0.13±8.04 | 0.23±4.02 | 0.23±5.27 |
| RandomCenterCrop | 0.07±4.98 | 0.07±9.41 | 0.13±2.55 | 0.13±2.20 |
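The per-sample mean ± std numbers above could be gathered with a harness along these lines. This stdlib sketch (function name hypothetical; the actual benchmark script is not shown in this PR) illustrates how a batch-level timing is amortized into a per-sample figure in milliseconds:

```python
import statistics
import time

def per_sample_ms(op, batch, runs=20):
    """Hypothetical timing helper: returns (mean, std) of the
    per-sample latency of `op` over `batch`, in milliseconds."""
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        op(batch)  # augment the whole batch at once
        elapsed_ms = (time.perf_counter() - t0) * 1e3
        samples.append(elapsed_ms / len(batch))  # amortize over the batch
    return statistics.mean(samples), statistics.pstdev(samples)
```

Dividing the batch time by the batch size is what makes the bs=128 columns so much smaller than bs=1: the fixed per-call overhead is shared across all samples.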

Comment on lines 124 to 123
if torch_version_geq(1, 10) and "cuda" in str(device):
pytest.skip("AssertionError: Tensor-likes are not close!")
shijianjian (Member Author):
To investigate.

@shijianjian
@edgarriba We need to take flake8 down to < 4.0, check this: PyCQA/flake8#1419

@edgarriba (Member) left a comment:

Can we improve the interface in this PR to reuse parameters between calls? I think that's something that's continuously asked for.

docs/source/augmentation.rst — resolved
@shijianjian marked this pull request as ready for review on November 26, 2021 05:57
@shijianjian

@edgarriba Please approve and merge. Tests failed due to flake8 bugs.

@edgarriba (Member) left a comment:

What's the exact issue with flake8 here? We can pin the version eventually, but it's better to find the reason.

@shijianjian

shijianjian commented Nov 26, 2021

@edgarriba We need to take flake8 down to < 4.0, check this: PyCQA/flake8#1419

This is the issue. It will probably be fixed in the next flake8 version.
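One way to apply the suggested constraint, assuming the pin lives in a pip requirements file (the exact file in the repo is not shown here):

```
# hold flake8 back until PyCQA/flake8#1419 is fixed upstream
flake8<4.0
```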

@edgarriba

@shijianjian then downgrade the flake8 version

@edgarriba (Member) left a comment:

A few things:

  • For strings, we should stick to double quotes ("").
  • Coverage drops quite a bit in this PR.
  • Try also to reduce the DeepSource errors (there are too many).
  • It would be great if the random samplers could be customized.

kornia/augmentation/base.py — 4 resolved (outdated) review threads
To apply the exact augmentation again, you may take advantage of the previous parameter state:
>>> input = torch.randn(1, 3, 32, 32)
>>> aug = RandomPerspective(0.5, p=1.)
>>> (aug(input) == aug(input, params=aug._params)).all()
Member:

How is it working right now? Are the parameters generated before knowing the input shape?

Member Author:

No, it has to be the same-shaped inputs. But this code decoupled that logic, which allows easier refactoring later.

kornia/augmentation/augmentation.py — resolved (outdated)
kornia/augmentation/random_generator/random_generator.py — resolved (outdated)
else:
raise TypeError(f"Unsupported type: {type(self.kernel_size)}")

self.angle_sampler = Uniform(angle[0], angle[1], validate_args=False)
Member:

Is it possible to customise the different samplers ? Not only here

Member Author:

Yes. You can override the `make_samplers` function to do so.
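The hook pattern described here can be sketched in plain Python. The class names are hypothetical and stdlib `random` stands in for the torch distributions the real kornia generators build; the point is only the `make_samplers` override:

```python
import random

class AngleGenerator:
    """Hypothetical base class mirroring the hook described above:
    __init__ delegates sampler construction to make_samplers()."""

    def __init__(self, degrees=(-30.0, 30.0)):
        self.degrees = degrees
        self.make_samplers()  # subclass hook, called once at construction

    def make_samplers(self):
        # default: uniform over the configured range
        lo, hi = self.degrees
        self.angle_sampler = lambda: random.uniform(lo, hi)

    def forward(self):
        return self.angle_sampler()

class GaussianAngleGenerator(AngleGenerator):
    def make_samplers(self):
        # override just the sampler; the rest of the pipeline is reused
        self.angle_sampler = lambda: random.gauss(0.0, 10.0)
```

A subclass swaps the distribution without touching validation or the forward path, which is what makes the customization cheap.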

Member:

Worth mentioning that in the docs, or adding an example (or in a later PR).

kornia/augmentation/random_generator/random_generator3d.py — resolved (outdated)
kornia/augmentation/random_generator/random_generator.py — resolved (outdated)
@shijianjian

> • It would be great if the random samplers could be customized

This PR only refactored the whole code base, which allows easier customization later. As backlog, some ideas:

  • Customize distribution samplers.
  • Make geometric transformations shape-agnostic.
  • Easier lambda augmentations.
  • Refactor mix augmentations (removing the label) to support other data types like masks and keypoints, and improve their loss computation.
  • More 3D augmentations.

@edgarriba edgarriba merged commit e3f0068 into kornia:master Dec 2, 2021