MLJFlux v0.5.0
- (new model) Add `NeuralNetworkBinaryClassifier`, an optimised form of `NeuralNetworkClassifier` for the special case of two target classes. Uses `Flux.σ` instead of `softmax` for the default finaliser (#248) (see the usage sketch after this list)
- (internals) Switch from implicit to explicit differentiation (#251)
- (breaking) Use optimisers from Optimisers.jl instead of Flux.jl (#251). Note that the new optimisers are immutable.
- (RNG changes) Change the default value of the model field `rng` from `Random.GLOBAL_RNG` to `Random.default_rng()`. Change the seeded RNG, obtained by specifying an integer value for `rng`, from `MersenneTwister` to `Xoshiro` (#251)
- (RNG changes) Update the `Short` builder so that the `rng` argument of `build(::Short, rng, ...)` is passed on to the `Dropout` layer, as these layers now support this on a GPU, at least for `rng=Random.default_rng()` (#251)
- (weakly breaking) Change the implementation of L1/L2 regularization from explicit loss penalization to weight/sign decay, internally chained with the user-specified optimiser (illustrated after this list). The only breakage for users is that the losses reported in the history will no longer be penalized, because the penalty is not explicitly computed (#251)
Merged pull requests:
- Fix metalhead breakage (#250) (@ablaom)
- Omnibus PR, including switch to explicit style differentiation (#251) (@ablaom)
- 🚀 Instate documentation for MLJFlux (#252) (@EssamWisam)
- Update examples/MNIST Manifest, including Julia 1.10 (#254) (@ablaom)
- ✨ Add 7 workflow examples for MLJFlux (#256) (@EssamWisam)
- Add binary classifier (#257) (@ablaom)
- For a 0.5.0 release (#259) (@ablaom)
- Add check that Flux optimiser is not being used (#260) (@ablaom)
Closed issues: