
Integer overflow on large number of parameters #6611

DanielWieczorek opened this issue Oct 18, 2018 · 2 comments


commented Oct 18, 2018

While experimenting with my NN configuration, I stumbled upon a NullPointerException during initialization of the MultiLayerNetwork:

java.lang.NullPointerException: null
	at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.init( ~[deeplearning4j-nn-1.0.0-beta2.jar:?]
	at org.deeplearning4j.nn.multilayer.MultiLayerNetwork.init( ~[deeplearning4j-nn-1.0.0-beta2.jar:?]
	at ~[classes/:?]
	at de.wieczorek.nn.AbstractNeuralNetwork.train( ~[classes/:?]
	at$Proxy$_$$_WeldClientProxy.train(Unknown Source) ~[classes/:?]
	at [classes/:?]
	at$Proxy$_$$ Source) [classes/:?]
	at [classes/:?]
	at java.util.concurrent.Executors$ [?:?]
	at [?:?]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ [?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker( [?:?]
	at java.util.concurrent.ThreadPoolExecutor$ [?:?]
	at [?:?]

Apparently the number of parameters was too large. This code segment looks a bit fishy:

            int paramLength = 0;
            val nParamsPerLayer = new long[nLayers];
            for (int i = 0; i < nLayers; i++) {
                NeuralNetConfiguration conf = layerWiseConfigurations.getConf(i);
                nParamsPerLayer[i] = conf.getLayer().initializer().numParams(conf);
                paramLength += nParamsPerLayer[i];
            }
The per-layer parameter counts are collected in a long array, but the running total is accumulated in an int, so it can overflow for large networks. The subsequent handling also never checks for paramLength < 0, so flattenedParams is never initialized because the network is assumed to have no parameters.
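A minimal standalone sketch (not DL4J code; the layer sizes are made up for illustration) of the overflow: summing long per-layer counts into an int wraps past Integer.MAX_VALUE, producing a negative total, while a long accumulator keeps the true count.

```java
// Hypothetical demo of the bug: per-layer counts are longs, but the
// running total is an int, so the sum silently wraps negative.
public class OverflowDemo {
    public static void main(String[] args) {
        long[] nParamsPerLayer = {1_500_000_000L, 1_500_000_000L}; // 3.0e9 total
        int paramLength = 0;
        for (long n : nParamsPerLayer) {
            paramLength += n; // compound assignment silently narrows long -> int
        }
        System.out.println(paramLength); // prints -1294967296 (overflowed)

        long total = 0;
        for (long n : nParamsPerLayer) {
            total += n; // long accumulator preserves the true count
        }
        System.out.println(total); // prints 3000000000
    }
}
```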

Regardless of whether my config makes sense, I would expect an exception telling me to rethink my network design.


commented Oct 18, 2018

Yes, it should either work or provide a useful exception.
ND4J supports long indexing, so there's no technical reason why we can't have nets with more than Integer.MAX_VALUE (~2.1 billion) parameters. Whether that makes sense (should be allowed) or is usually the sign of an error (should be disallowed) is another question.
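A hypothetical guard (a sketch, not the actual fix that later landed in #6634) illustrating the "useful exception" option: accumulate the count in a long, and fail loudly only where a code path still requires int indexing.

```java
// Hypothetical sketch, not DL4J's actual fix: count parameters in a long,
// and throw a descriptive exception wherever an int index is still required.
public class ParamCountGuard {
    static long totalParams(long[] nParamsPerLayer) {
        long total = 0;
        for (long n : nParamsPerLayer) {
            total += n; // long accumulator: no overflow for realistic nets
        }
        return total;
    }

    // For code paths that still index with int, fail loudly instead of wrapping.
    static int asIntIndex(long total) {
        if (total > Integer.MAX_VALUE) {
            throw new IllegalStateException("Network has " + total
                    + " parameters, which exceeds Integer.MAX_VALUE; "
                    + "this code path cannot address them with int indices");
        }
        return (int) total;
    }

    public static void main(String[] args) {
        long total = totalParams(new long[]{1_500_000_000L, 1_500_000_000L});
        System.out.println(total); // prints 3000000000
        try {
            asIntIndex(total);
        } catch (IllegalStateException e) {
            System.out.println("Rejected: " + e.getMessage());
        }
    }
}
```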

@AlexDBlack AlexDBlack added the DL4J label Oct 18, 2018

@AlexDBlack AlexDBlack self-assigned this Oct 25, 2018

AlexDBlack added a commit that referenced this issue Oct 25, 2018
AlexDBlack added a commit that referenced this issue Oct 26, 2018
DL4J Issues/Fixes (#6634)
* #6611 Allow nets with more than Integer.MAX_VALUE params; fixes for indexing in large arrays

* Ignore annotation on new tests due to memory requirements

* #6609 Android compatibility for VersionCheck (remove use of SimpleFileVisitor)

* #6619 BarnesHutTsne.saveAsFile tweak

* #6632 Nd4j readTxt validation + extra tests

* Different fix for #6609

* Small stats listener tweak

* Small test fix

commented Nov 25, 2018

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

@lock lock bot locked and limited conversation to collaborators Nov 25, 2018

2 participants