
Weird behavior of LongestMaxSize #2123

Closed
Optimox opened this issue Dec 30, 2022 · 6 comments · Fixed by #2131
Labels
bug 🐛 (Something isn't working), help wanted (Extra attention is needed), module: augmentations

Comments


Optimox commented Dec 30, 2022

Describe the bug

Hello me again,

I might be doing something wrong with the way I use kornia augmentations, please let me know if it is the case.

I was expecting LongestMaxSize in kornia to behave similarly to the albumentations implementation, meaning that I can feed images of different shapes to the transformation function and get back images of different shapes but with the same aspect ratios, the longest side being equal to the value given to LongestMaxSize.

See below a small code sample that puzzles me.

Reproduction steps

import torch
import kornia.augmentation as K

a = torch.ones((512, 256))
b = torch.ones((512, 756))

print("first try")
transfo = K.LongestMaxSize(max_size=256, p=1.)

print(transfo(a).shape)
print(transfo(b).shape)

print("second try")

a = torch.ones((512, 256))
b = torch.ones((512, 756))

transfo = K.LongestMaxSize(max_size=256, p=1.)
print(transfo(b).shape)
print(transfo(a).shape)

Outputs:
first try
torch.Size([1, 1, 256, 128])
torch.Size([1, 1, 256, 128])
second try
torch.Size([1, 1, 173, 256])
torch.Size([1, 1, 173, 256])

Expected behavior

I would expect to have the same values for the transformations no matter the order of the elements.

i.e. transfo(a).shape == torch.Size([1, 1, 256, 128]) and transfo(b).shape == torch.Size([1, 1, 173, 256])

Am I missing something here?
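For reference, the expected shapes above follow from a simple aspect-ratio computation. A minimal, library-free sketch (the helper name `longest_max_size_shape` is hypothetical, not part of kornia or albumentations):

```python
def longest_max_size_shape(h, w, max_size):
    """Return (new_h, new_w), keeping the aspect ratio so the longest side == max_size."""
    scale = max_size / max(h, w)
    return round(h * scale), round(w * scale)

print(longest_max_size_shape(512, 256, 256))  # (256, 128)
print(longest_max_size_shape(512, 756, 256))  # (173, 256)
```

These match the two shapes expected above: each input should be scaled independently of the order in which it is passed to the transform.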

Environment

kornia='0.6.9'
torch='1.12.1+cu113'

Additional context

No response

@Optimox added the help wanted label Dec 30, 2022

Optimox commented Dec 30, 2022

Seems like Resize behaves similarly when size is an integer.


Optimox commented Jan 4, 2023

@johnnv1 @shijianjian Is this the expected behavior?

@shijianjian
Member

@twsl Can you take a look?

@twsl
Contributor

twsl commented Jan 4, 2023

Unfortunately I don't have much time right now, but a quick peek at the source code makes me wonder whether the resize generator in the resize augmentation includes any checks to recalculate the parameters.

@shijianjian
Member

@johnnv1 Can you help fix it if you have some time?

@johnnv1
Member

johnnv1 commented Jan 9, 2023

As @twsl commented, it seems that for this case we need to recompute the params. I'm not sure of the best way to avoid generating the params on each call, though.
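A library-free sketch of the suspected failure mode (class and helper names are hypothetical illustrations, not kornia's actual internals): if the parameter generator computes the output size once and reuses it without checking whether the input shape changed, whichever image is seen first fixes the output size for all later calls, which matches the order-dependent behavior reported above.

```python
def longest_max_size_shape(h, w, max_size):
    """Keep aspect ratio; scale so the longest side == max_size (hypothetical helper)."""
    scale = max_size / max(h, w)
    return round(h * scale), round(w * scale)

class StickyParamGenerator:
    """Sketch of a generator that caches params from the first call."""
    def __init__(self, max_size):
        self.max_size = max_size
        self._cached = None

    def __call__(self, shape):
        if self._cached is None:  # bug: no check that `shape` differs from last time
            self._cached = longest_max_size_shape(*shape, self.max_size)
        return self._cached

gen = StickyParamGenerator(256)
print(gen((512, 256)))  # (256, 128) -- correct for the first input
print(gen((512, 756)))  # (256, 128) -- stale params reused, mirroring the report
```

The fix johnnv1 describes would amount to invalidating or regenerating the cached params whenever the incoming batch shape differs from the one the params were computed for.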
