TTA details #24
Hi, I'll release the proper code for this soon. But here is the list of augmentations (and their inverses) for now.
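The author's original augmentation list did not survive in this thread, but the follow-up comment confirms it consisted of flip and rotate operations paired with their inverses. A minimal sketch of what such a TTA list and prediction loop might look like, assuming NumPy HWC arrays and a model that maps an image to a same-shaped prediction (the names `TTA_TRANSFORMS` and `tta_predict` are hypothetical, not from the repo):

```python
import numpy as np

# Hypothetical TTA setup: each entry pairs an augmentation with the inverse
# that maps the model's prediction back to the original orientation.
TTA_TRANSFORMS = [
    (lambda x: x,                  lambda x: x),                   # identity
    (lambda x: np.flip(x, axis=1), lambda x: np.flip(x, axis=1)),  # horizontal flip (self-inverse)
    (lambda x: np.flip(x, axis=0), lambda x: np.flip(x, axis=0)),  # vertical flip (self-inverse)
    (lambda x: np.rot90(x, k=1),   lambda x: np.rot90(x, k=-1)),   # rotate 90°, undo with -90°
    (lambda x: np.rot90(x, k=2),   lambda x: np.rot90(x, k=-2)),   # rotate 180°
    (lambda x: np.rot90(x, k=3),   lambda x: np.rot90(x, k=-3)),   # rotate 270°
]

def tta_predict(model, image):
    """Average the model's outputs over all augmented views,
    inverting each augmentation on the prediction before averaging."""
    outputs = []
    for aug, inv in TTA_TRANSFORMS:
        pred = model(aug(image))   # predict on the augmented view
        outputs.append(inv(pred))  # map the prediction back
    return np.mean(outputs, axis=0)
```

Note that a horizontal or vertical flip is its own inverse, while a k-step `np.rot90` is undone by rotating -k steps; averaging the back-mapped predictions is the usual way these augmented views are combined.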
From the code above, flip and rotate operations are the main augmentations used in TTA. Thanks for your quick reply.
The effect of your proposed model and training method is really amazing. “This, combined with longer training at 45 epochs and TTA, has a bigger impact than the choices of loss functions”. It seems TTA also plays an important role, so I want to know the details of TTA, such as example code or related info. Looking forward to your reply!