
augmentation and regularization used in MLP-Mixer #5

Closed
Ga-Lee opened this issue Jun 16, 2022 · 2 comments

Comments

Ga-Lee commented Jun 16, 2022

Hello! Thank you for your work!
The MLP-Mixer paper says that when training Mixer-B/16 on ImageNet-1k from scratch, extra regularization is applied to reach 76% accuracy. Could you share the detailed augmentation and regularization strategy used for that experiment, and is there a config file available?
Thank you for your help! :)

akolesnikoff (Collaborator) commented

Hi!

I've just submitted the MLP-Mixer code, as well as the configuration that achieves 76% top-1 accuracy on ImageNet with the MLP-Mixer-B/16 model: https://github.com/google-research/big_vision/blob/main/big_vision/configs/mlp_mixer_i1k.py. Let me know if you have any further questions.
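For context: the "extra regularization" the paper refers to for from-scratch ImageNet-1k training is, as I understand it, a combination of data augmentation (e.g. RandAugment, mixup) and model regularization (e.g. dropout, stochastic depth). The snippet below is only a hypothetical sketch of what such a big_vision-style config could look like, written with ml_collections; every field name and value in it is an assumption made for illustration, and the authoritative settings are those in the linked mlp_mixer_i1k.py.

```python
# Hypothetical sketch only: field names and values are illustrative assumptions,
# not the actual big_vision settings. See mlp_mixer_i1k.py for the real recipe.
import ml_collections


def get_config():
  """Illustrative MLP-Mixer-B/16 ImageNet-1k training config sketch."""
  config = ml_collections.ConfigDict()

  config.model_name = 'Mixer-B/16'
  config.total_epochs = 300   # assumed value; check the real config
  config.batch_size = 4096    # assumed value; check the real config

  # Data augmentation (placeholder parameters).
  config.randaug = ml_collections.ConfigDict(dict(num_layers=2, magnitude=15))
  config.mixup = ml_collections.ConfigDict(dict(alpha=0.5))

  # Model regularization (placeholder parameters).
  config.dropout_rate = 0.0
  config.stochastic_depth = 0.1
  config.weight_decay = 0.1

  return config


if __name__ == '__main__':
  print(get_config())
```

Treat the numbers above purely as placeholders; the exact preprocessing pipeline and regularization coefficients should be read from mlp_mixer_i1k.py in the big_vision repository.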

Ga-Lee (Author) commented Jun 28, 2022


Got it!
Thank you for your reply!

@Ga-Lee Ga-Lee closed this as completed Jun 28, 2022
@google-research google-research locked and limited conversation to collaborators Nov 7, 2023
@lucasb-eyer lucasb-eyer converted this issue into discussion #71 Nov 7, 2023

