
Same lambdas(drawn from beta dist.) applied to multiple data in the same mini-batch #4

Closed
ildoonet opened this issue Jul 17, 2019 · 1 comment


@ildoonet

https://github.com/clovaai/CutMix-PyTorch/blob/master/train.py#L233

Could performance be improved by using a different (diverse) lambda for each sample?

@hellbell
Collaborator

@ildoonet
Thank you for the good suggestion.
Similar to the answer in #3 (comment), we considered data-loading efficiency and implementation simplicity, so we fix the lambda and the position of the random region for all images in a mini-batch.
We have not checked whether performance improves when the lambda and cropping position vary per sample, so we would be very happy if you trained CutMix with that mechanism.
After verifying, please open a PR with a configuration option (e.g. '--diverse-lambda' or '--diverse-position').
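For reference, the per-sample variant being discussed could look roughly like the sketch below. This is not code from this repository: `cutmix_per_sample` and `rand_bbox` are illustrative names, and NumPy stands in for PyTorch tensors. Each image draws its own lambda from a Beta distribution and its own crop box, instead of one shared lambda per mini-batch:

```python
import numpy as np

def rand_bbox(H, W, lam, rng):
    # CutMix box: pasted area is approximately (1 - lam) * H * W
    cut_rat = np.sqrt(1.0 - lam)
    cut_h, cut_w = int(H * cut_rat), int(W * cut_rat)
    cy, cx = rng.integers(H), rng.integers(W)
    y1, y2 = np.clip(cy - cut_h // 2, 0, H), np.clip(cy + cut_h // 2, 0, H)
    x1, x2 = np.clip(cx - cut_w // 2, 0, W), np.clip(cx + cut_w // 2, 0, W)
    return y1, y2, x1, x2

def cutmix_per_sample(images, labels, beta=1.0, rng=None):
    """Apply CutMix with a *different* lambda and box per sample.

    images: (N, C, H, W) array; labels: (N,) int array.
    Returns mixed images plus (labels_a, labels_b, lams) for the mixed loss.
    """
    rng = rng or np.random.default_rng()
    N, _, H, W = images.shape
    perm = rng.permutation(N)          # partner image for each sample
    mixed = images.copy()
    lams = np.empty(N)
    for i in range(N):
        lam = rng.beta(beta, beta)     # a fresh lambda per sample
        y1, y2, x1, x2 = rand_bbox(H, W, lam, rng)
        mixed[i, :, y1:y2, x1:x2] = images[perm[i], :, y1:y2, x1:x2]
        # re-derive lambda from the exact pasted area (clipping may shrink it)
        lams[i] = 1.0 - (y2 - y1) * (x2 - x1) / (H * W)
    return mixed, labels, labels[perm], lams
```

The loss would then be a per-sample weighted sum, e.g. `lams[i] * loss(pred[i], labels_a[i]) + (1 - lams[i]) * loss(pred[i], labels_b[i])`, rather than a single batch-wide weighting. Note this loses the data-loading efficiency the maintainers mention, since the batch can no longer share one crop.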
