
java.lang.IllegalStateException: Cannot perform backprop: Dropout mask array is absent (already cleared?) #6756

wangfeng-skymind opened this issue Nov 23, 2018 · 3 comments



commented Nov 23, 2018

Description: deeplearning4j version 1.0.0-beta3.
Running the code below throws an exception; the snippet is only a small part of the program.
Code ==================================
ZooModel zooModel = ResNet50.builder().workspaceMode(WorkspaceMode.NONE).build();
ComputationGraph resnet50 = (ComputationGraph) zooModel.initPretrained();

    FineTuneConfiguration fineTuneConf = new FineTuneConfiguration.Builder()
            .updater(new Nesterovs(5e-5))
            .build();

    graph = new TransferLearning.GraphBuilder(resnet50)
            .setInputTypes(InputType.convolutional(height, width, channels))
            .addLayer("conv1", new ConvolutionLayer.Builder(new int[]{3, 3})
                    .nIn(channels).nOut(64).activation(Activation.RELU).build(), "input_1")
            .addLayer("fc2048", new DenseLayer.Builder().activation(Activation.TANH).nIn(1000).nOut(2048).build(), "fc1000")
            .addLayer("newOutput", new OutputLayer
            // ... (remainder of the configuration was omitted in the report)

Exception message=============================================
11:02:48.307 [AMDSI prefetch thread] DEBUG org.nd4j.linalg.memory.abstracts.Nd4jWorkspace - Steps: 4
Exception in thread "main" java.lang.IllegalStateException: Cannot perform backprop: Dropout mask array is absent (already cleared?)
at org.nd4j.base.Preconditions.throwStateEx(
at org.nd4j.base.Preconditions.checkState(
at org.deeplearning4j.nn.conf.dropout.Dropout.backprop(
at org.deeplearning4j.nn.layers.AbstractLayer.backpropDropOutIfPresent(
at org.deeplearning4j.nn.layers.convolution.subsampling.SubsamplingLayer.backpropGradient(
at org.deeplearning4j.nn.graph.vertex.impl.LayerVertex.doBackward(
at org.deeplearning4j.nn.graph.ComputationGraph.calcBackpropGradients(
at org.deeplearning4j.nn.graph.ComputationGraph.computeGradientAndScore(
at org.deeplearning4j.nn.graph.ComputationGraph.computeGradientAndScore(
at org.deeplearning4j.optimize.solvers.BaseOptimizer.gradientAndScore(
at org.deeplearning4j.optimize.solvers.StochasticGradientDescent.optimize(
at org.deeplearning4j.optimize.Solver.optimize(
at org.deeplearning4j.nn.graph.ComputationGraph.fitHelper(



commented Nov 24, 2018

Thanks for reporting. Fixed here: #6758
It will be merged once tests complete; a few hours after the merge, the fix should be available on snapshots.
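For readers who want to pick the fix up from snapshot builds before the next release, the usual approach is to point Maven at the Sonatype snapshot repository and depend on the `-SNAPSHOT` version. The coordinates below are illustrative; verify them against the current Deeplearning4j snapshots documentation:

```xml
<!-- Sketch of a pom.xml fragment for using DL4J snapshot builds.
     Repository URL and version are assumptions; check the official
     Deeplearning4j snapshots documentation for the current values. -->
<repositories>
    <repository>
        <id>sonatype-snapshots</id>
        <url>https://oss.sonatype.org/content/repositories/snapshots</url>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>

<dependencies>
    <dependency>
        <groupId>org.deeplearning4j</groupId>
        <artifactId>deeplearning4j-core</artifactId>
        <version>1.0.0-SNAPSHOT</version>
    </dependency>
</dependencies>
```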

Alternatively, here's a workaround for 1.0.0-beta3:

        for (Layer l : graph.getLayers()) {
            org.deeplearning4j.nn.conf.layers.Layer conf =
                    (org.deeplearning4j.nn.conf.layers.Layer) l.getConfig();
            if (conf instanceof org.deeplearning4j.nn.conf.layers.BaseLayer) {
                // Replace the dropout instance shared across layers with a
                // per-layer clone, so each layer keeps its own mask array
                IDropout d = conf.getIDropout();
                conf.setIDropout(d == null ? null : d.clone());
            }
        }
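The failure mode behind this workaround is a single mutable dropout object shared by several layers: whichever layer runs last overwrites or consumes the mask another layer still needs during backprop. A minimal, self-contained sketch (hypothetical `Dropout` and demo classes, not the DL4J API) illustrates why giving each layer its own clone helps:

```java
// Hypothetical illustration of the shared-mask bug, NOT the DL4J classes.
class Dropout implements Cloneable {
    double[] mask;                       // mutable state held on this object

    void applyForward(int layerId) {     // each forward pass overwrites the mask
        mask = new double[]{layerId};
    }

    double[] backprop() {                // backprop consumes (clears) the mask
        if (mask == null) {
            throw new IllegalStateException(
                "Cannot perform backprop: Dropout mask array is absent (already cleared?)");
        }
        double[] m = mask;
        mask = null;
        return m;
    }

    @Override
    public Dropout clone() {
        try {
            return (Dropout) super.clone();
        } catch (CloneNotSupportedException e) {
            throw new AssertionError(e);
        }
    }
}

public class DropoutShareDemo {
    public static void main(String[] args) {
        // One instance shared by two "layers": the second forward pass
        // overwrites the first layer's mask, and the first backprop call
        // consumes the only mask, so the second backprop call fails.
        Dropout shared = new Dropout();
        shared.applyForward(1);
        shared.applyForward(2);
        shared.backprop();
        try {
            shared.backprop();
        } catch (IllegalStateException e) {
            System.out.println("shared: " + e.getMessage());
        }

        // Per-layer clones (what the workaround installs): each layer owns
        // its own mask, so interleaved forward/backprop passes are safe.
        Dropout d1 = new Dropout();
        Dropout d2 = d1.clone();
        d1.applyForward(1);
        d2.applyForward(2);
        d2.backprop();
        d1.backprop();
        System.out.println("cloned: ok");
    }
}
```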



commented Dec 24, 2018

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

@lock lock bot locked and limited conversation to collaborators Dec 24, 2018
