DrChainsaw and AlexDBlack: Make weight initialization extendable (#6820)
* Added classes for legacy WeightInit variants
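
As context for the commits below, here is a minimal sketch of the extension point this PR builds on, assuming the fanIn/fanOut-as-double signature discussed later in this log; WeightInitXavierSketch is a hypothetical stand-in for one of the legacy variants:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// Sketch of the pluggable weight-init interface: implementations fill the
// flat parameter view in place and return it reshaped to the weight shape.
public interface IWeightInit {
    INDArray init(double fanIn, double fanOut, long[] shape, char order, INDArray paramView);
}

// Hypothetical legacy variant as a class: Xavier-style scaled Gaussian init.
class WeightInitXavierSketch implements IWeightInit {
    @Override
    public INDArray init(double fanIn, double fanOut, long[] shape, char order, INDArray paramView) {
        double scale = Math.sqrt(2.0 / (fanIn + fanOut));
        Nd4j.randn(paramView).muli(scale);      // fill the flat view in place
        return paramView.reshape(order, shape); // result shares storage with the view
    }
}
```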

* Forgot to make LegacyWeightInitTest pass

* Added test case for distributions
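
A hedged sketch of such a test, reusing the IWeightInit sketch above; using a constant value in place of a real distribution keeps the expected output exact, and the lambda stands in for a hypothetical distribution-backed initializer:

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class DistributionInitTest {

    @Test
    public void testConstantDistribution() {
        double value = 0.25;
        INDArray paramView = Nd4j.zeros(1, 6);

        // Stand-in for a distribution-backed initializer: every sample is `value`
        IWeightInit init = (fanIn, fanOut, shape, order, view) ->
                view.assign(value).reshape(order, shape);

        INDArray weights = init.init(2, 3, new long[]{2, 3}, 'c', paramView);
        assertEquals(value, weights.getDouble(0, 0), 0.0);
        assertEquals(value, weights.meanNumber().doubleValue(), 1e-9);
    }
}
```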

* Optimized imports

* Add test for output shape of IWeightInit
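
A hedged sketch of what such a shape test might look like, again reusing the sketched interface above; the key property is that init returns an array of exactly the requested shape, regardless of the flat view it was handed:

```java
import static org.junit.Assert.assertArrayEquals;

import org.junit.Test;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class WeightInitShapeTest {

    @Test
    public void testOutputShape() {
        long[] shape = {4, 3};
        // Flat row vector, as a layer's parameter buffer would supply it
        INDArray paramView = Nd4j.zeros(1, 4 * 3);

        IWeightInit init = new WeightInitXavierSketch(); // hypothetical class from above
        INDArray weights = init.init(4, 3, shape, 'f', paramView);

        assertArrayEquals(shape, weights.shape());
    }
}
```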

* Remove temp code

* Add ser/deser
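
Since layer configurations round-trip through JSON, ser/deser of the new weight-init functions can be exercised with a config round trip along these lines (a sketch, assuming the builder still accepts the WeightInit enum as in released DL4J versions):

```java
import static org.junit.Assert.assertEquals;

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.junit.Test;

public class WeightInitSerDeTest {

    @Test
    public void testJsonRoundTrip() {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .weightInit(WeightInit.XAVIER)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(4).nOut(3).build())
                .build();

        // Serialize to JSON and back; the deserialized config, including the
        // weight-init function, should equal the original
        String json = conf.toJson();
        MultiLayerConfiguration fromJson = MultiLayerConfiguration.fromJson(json);
        assertEquals(conf, fromJson);
    }
}
```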

* Change weight initialization to the new type in NeuralNetConfiguration and BaseLayer (and some subclasses), and rename the field to weightInitFn. Also remove the Distribution field.

A consequence of the latter is that an exception is thrown if the legacy weightInit method is called with WeightInit.DISTRIBUTION as the argument; see the sketch after this commit.

SameDiff is not changed.
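
A hedged sketch of what the change means for user code; the builder calls match released DL4J APIs, but the exact exception type thrown for WeightInit.DISTRIBUTION is an assumption:

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.distribution.NormalDistribution;
import org.deeplearning4j.nn.weights.WeightInit;

public class WeightInitFnExample {
    public static void main(String[] args) {
        // Plain enum values still work; internally they are mapped to the new
        // function type stored in the weightInitFn field
        new NeuralNetConfiguration.Builder().weightInit(WeightInit.XAVIER);

        // DISTRIBUTION alone no longer identifies an initializer, since the
        // Distribution field is gone, so the legacy enum method rejects it
        try {
            new NeuralNetConfiguration.Builder().weightInit(WeightInit.DISTRIBUTION);
        } catch (IllegalArgumentException e) { // exception type assumed
            System.out.println("Rejected as expected: " + e.getMessage());
        }

        // Supply the distribution explicitly instead (dist is deprecated by a
        // later commit in this PR)
        new NeuralNetConfiguration.Builder().dist(new NormalDistribution(0.0, 1.0));
    }
}
```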

* Change API to IWeightInit to use long for fanIn and fanOut instead of double.

* Revert "Change API to IWeightInit to use long for fanIn and fanOut instead of double."

This reverts commit f0c1bec

* Fix javadoc

* Deprecate method dist

* Add a no-args version of WeightInit.getWeightInitFunction and replace calls that passed null
Add a null check for the distribution in WeightInitDistribution
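
A hedged sketch of the two changes in this commit; the enum body is abbreviated and the class shapes are assumptions based on the commit message:

```java
import org.deeplearning4j.nn.conf.distribution.Distribution;

// Sketch of the enum-to-function mapping with the new no-args overload
enum WeightInitEnumSketch {
    XAVIER /* , ... */;

    // No-args convenience overload, equivalent to passing null: valid for
    // every value except DISTRIBUTION, which requires an actual distribution
    public IWeightInit getWeightInitFunction() {
        return getWeightInitFunction(null);
    }

    public IWeightInit getWeightInitFunction(Distribution distribution) {
        switch (this) {
            case XAVIER:
                return new WeightInitXavierSketch(); // hypothetical class from above
            default:
                throw new IllegalStateException("No mapping for " + this);
        }
    }
}

// Sketch of the null check: a distribution-backed initializer without a
// distribution is a programming error, so fail fast in the constructor
class WeightInitDistributionSketch {
    private final Distribution distribution;

    WeightInitDistributionSketch(Distribution distribution) {
        if (distribution == null) {
            throw new IllegalArgumentException("Distribution may not be null");
        }
        this.distribution = distribution;
    }
}
```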

* Add ID mapping for convolution layers

* Clarify the use of doubles in the API description

* Fix weightInit toString