MLJ v0.12.0

Diff since v0.11.5

This release accommodates breaking changes in MLJBase 0.14.0, MLJModels 0.11.0 and MLJTuning 0.4.0. For a complete list of changes, closed issues and pull requests, refer to the release notes of those packages.

It also updates the MLJ documentation to reflect these changes.

Summary

Main breaking changes:

  • Adds restrictions to acceleration options: nesting distributed
    processes within a multithreaded process is now disallowed.

  • Adds a more user-friendly interface for inspecting the training
    reports and fitted parameters of composite models. For example, if
    composite = @pipeline OneHotEncoder KNNRegressor and
    mach = machine(composite, X, y), then the fitted parameters of the
    machine associated with KNNRegressor are accessed as
    fitted_params(mach).knn_regressor (see the sketch after this list).

  • The @from_network syntax has been changed to make it more
    expressive. In particular, through the new concept of learning
    network machines, it is possible to export a learning network to a
    composite type supporting multiple operations (e.g., predict and
    transform, as in clustering); a rough sketch appears after the
    "Other enhancements" list below. See the manual for details. The
    old syntax is no longer supported.
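
The following is a minimal sketch of the new access pattern (it also uses the condensed @pipeline syntax described under the next heading). It assumes the KNNRegressor model has already been loaded, e.g. via @load KNNRegressor; the toy data and the report field name are illustrative:

```julia
using MLJ

# assumes `@load KNNRegressor` (or similar) has already been run
composite = @pipeline OneHotEncoder KNNRegressor

# toy data: one continuous and one categorical feature
X = (height = rand(50), group = categorical(rand(["a", "b"], 50)))
y = rand(50)

mach = machine(composite, X, y)
fit!(mach)

# component results are accessed by automatically generated field names:
fitted_params(mach).knn_regressor
report(mach).one_hot_encoder    # training reports are inspected analogously
```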

Other enhancements of note:

  • Adds MLJFlux models to the registry, making neural network models
    available through the MLJ interface.

  • A more economical @pipeline syntax has been introduced, as in the
    sketch above. For example, pipe = @pipeline OneHotEncoder PCA(maxoutdim=3)
    defines a model pipe with automatically generated field names and
    model type name. Target inverse transformations now occur
    immediately after the supervised model in a @pipeline, instead of
    at the end, unless invert_last=true. The old syntax is available
    but deprecated.

  • It is now possible to simultaneously load model code for models
    having the same name but from different packages, when using @load
    or load (see the snippet after this list).

  • Removes the requirement to specify the kind of a source node, as in
    source(y, kind=:target). The role of source nodes in a learning
    network is now inferred from the order in which they appear in
    learning network machine constructors (see above and the sketch
    below).
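
A rough sketch of a learning network wrapped in a learning network machine, also showing source nodes constructed without a kind specification. This is loosely based on the manual of this release; it assumes KNNRegressor has been loaded, and the exact surrogate and constructor signatures should be checked against the manual:

```julia
using MLJ

X = (x1 = rand(100), x2 = rand(100))
y = rand(100)

Xs = source(X)    # no kind=:input needed any more
ys = source(y)    # role inferred from its position below

stand = machine(Standardizer(), Xs)
W = transform(stand, Xs)

knn = machine(KNNRegressor(), W, ys)   # assumes KNNRegressor is loaded
yhat = predict(knn, W)

# wrap the network in a learning network machine; the Deterministic()
# surrogate declares what kind of composite is being defined:
mach = machine(Deterministic(), Xs, ys; predict=yhat)
fit!(mach)
predict(mach, X)

# a machine like this is what the new @from_network syntax exports to a
# stand-alone composite model type (see the manual)
```

And a sketch of loading two same-named models from different packages by specifying the package (the package names here are illustrative; check the registry for the correct ones):

```julia
ridge1 = @load RidgeRegressor pkg=MultivariateStats
ridge2 = @load RidgeRegressor pkg=MLJLinearModels
```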

Deprecations:

  • The old @pipeline syntax.

  • The specification of kind when constructing a source node (see the
    before/after snippet below).

  • The use of fitresults() when exporting learning networks "by
    hand". See the manual for the new way to do this.

Closed issues:

  • Integrate flux models (#33)
  • Conflict w Scikitlearn.jl (#502)
  • Accessing nested machines is (still) awkward (#553)
  • Add note at MLJ landing page that MLJ wraps a majority of sk-learn models (#573)
  • OpenML integration: Kmeans is not fitting (#580)

Merged pull requests: