
Is it your idea to add batch normalization and replace upconv with bilinear upsample? #55

Closed
fanfanda opened this issue May 4, 2019 · 4 comments

Comments

@fanfanda

fanfanda commented May 4, 2019

No description provided.

@milesial
Owner

milesial commented May 4, 2019

Hi, yes, bilinear upsampling is a way to make the model use less memory, even if the results are not quite as good. You can use the transposed-convolution ("upconv") path instead by passing bilinear=False when constructing the net.
As for the batch norm, I believe there was some discussion about it on the Kaggle forums and kernels, so I decided to add it.
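To illustrate the memory trade-off being discussed, here is a minimal PyTorch sketch (not code from this repository; channel sizes are illustrative) comparing the two upsampling options: bilinear upsampling has no learnable parameters, while a transposed convolution carries its own weights.

```python
import torch
import torch.nn as nn

# Option 1: bilinear upsampling, as enabled by bilinear=True.
# It has zero learnable parameters, so it costs no extra weight memory.
bilinear_up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True)

# Option 2: transposed convolution ("upconv"), as enabled by bilinear=False.
# It learns its own upsampling filter and also halves the channel count here.
up_conv = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)

x = torch.randn(1, 64, 16, 16)
print(tuple(bilinear_up(x).shape))  # (1, 64, 32, 32): channels unchanged
print(tuple(up_conv(x).shape))      # (1, 32, 32, 32): channels halved

# Parameter counts show where the extra memory goes.
print(sum(p.numel() for p in bilinear_up.parameters()))  # 0
print(sum(p.numel() for p in up_conv.parameters()))      # 64*32*2*2 + 32 = 8224
```

Note that the savings from bilinear upsampling come not only from the missing weights but also from the smaller activation and gradient buffers during training, since no convolution gradients need to be stored for that step.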

@fanfanda
Author

fanfanda commented May 4, 2019

Thanks for your reply. In the experiments for a recent paper of mine, I kept both of your changes. Do I need to cite your project? I hope you can contact me by email at fanfanda@ict.ac.cn. Thank you again.

@milesial
Owner

milesial commented May 4, 2019

If you wish to cite me, you can link to the project URL, but I have to tell you that these changes are not backed by any theoretical study; I made them mainly because my GPU did not have enough memory :)
Make sure to follow the GNU GPL v3 license.

@fanfanda
Author

fanfanda commented May 5, 2019

ok, thanks~

@milesial milesial closed this as completed May 5, 2019