Transfer learning: 0 dropout doesn't remove/override existing layer dropout #4368

Closed
AlexDBlack opened this Issue Dec 5, 2017 · 2 comments

AlexDBlack commented Dec 5, 2017

    netToTest = new TransferLearning.Builder(net)
            .fineTuneConfiguration(new FineTuneConfiguration.Builder()
                    .dropOut(0.0)
                    .build())
            .build();

The original dropout is still present on the existing layers. Reason: .dropOut(0.0) internally calls dropOut((IDropout) null), so the override is treated as equivalent to "not set" and the layers' existing dropout is left in place.

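A minimal sketch of the underlying builder problem, assuming nothing about DL4J's actual implementation: when an override is stored in a plain nullable field, "explicitly cleared via 0.0" and "never set" collapse into the same null, so the fine-tune step cannot tell that it should remove the existing dropout. The class and member names below (FineTuneSketch, dropOutOverride, applyBuggy, applyFixed) are hypothetical, for illustration only.

    import java.util.Optional;

    // Hypothetical sketch of the "null means not set" problem; not DL4J code.
    class FineTuneSketch {

        // Buggy pattern: a single nullable field cannot distinguish "never set"
        // from "explicitly cleared", because dropOut(0.0) maps to null.
        private Double dropOut;                   // null == not set

        // Fixed pattern: the outer reference records whether an override happened,
        // the inner Optional records what that override is.
        private Optional<Double> dropOutOverride; // null == not set, empty == explicitly removed

        void dropOut(double d) {
            // Mirrors the reported behaviour: 0.0 collapses to null, i.e. "not set".
            this.dropOut = (d == 0.0) ? null : d;
            // Correct behaviour: record the override even when it clears dropout.
            this.dropOutOverride = (d == 0.0) ? Optional.empty() : Optional.of(d);
        }

        Double applyBuggy(Double existingLayerDropOut) {
            // dropOut(0.0) left the field null, so the layer's old dropout survives - the bug.
            return (dropOut != null) ? dropOut : existingLayerDropOut;
        }

        Double applyFixed(Double existingLayerDropOut) {
            // dropOut(0.0) produced Optional.empty(), so the old dropout is removed.
            return (dropOutOverride != null) ? dropOutOverride.orElse(null) : existingLayerDropOut;
        }
    }

Here, dropOut(0.0) leaves the nullable field at null, so applyBuggy keeps the layer's existing dropout (the behaviour reported above), while the Optional-based field records that an override happened and applyFixed removes it.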
AlexDBlack added the Bug label Dec 5, 2017

AlexDBlack commented Dec 5, 2017

This looks to impact more than just dropout: it also affects weight noise (DropConnect, etc.), and by the same logic I expect it affects a number of other settings as well...

AlexDBlack self-assigned this Dec 5, 2017

AlexDBlack added a commit that referenced this issue Dec 11, 2017

Merge pull request #4370 from deeplearning4j/ab_4368_transferlearning
#4368 Fix 'override with null' behaviour for FineTuneConfiguration

lock bot commented Sep 23, 2018

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

lock bot locked and limited conversation to collaborators Sep 23, 2018
