Code Organization

Dependency chart for MLJ repositories. Repositories with dashed connections do not currently exist but are planned/proposed.

Repositories that may be of interest outside of MLJ, or beyond its conventional use, are marked with a ⟂ symbol:

  • MLJ.jl is the general user's point of entry for choosing, loading, composing, evaluating and tuning machine learning models (a minimal workflow sketch appears after this list). It pulls in most code from the other repositories described below. (A current exception is the homogeneous ensembles code, which is to be migrated to MLJBase or to its own repository, MLJEnsembles.) MLJ also hosts the MLJ manual, which documents functionality across the repositories, with the exception of ScientificTypes and MLJScientificTypes, which host their own documentation. (The MLJ manual and MLJTutorials do provide overviews of scientific types.)

  • MLJModelInterface.jl is a lightweight package imported by packages implementing MLJ's interface for their machine learning models (see the interface sketch after this list). Its sole dependency is ScientificTypes, which is a tiny package with no dependencies.

  • (⟂) MLJBase.jl is a large repository with two main purposes: (i) to give "dummy" methods defined in MLJModelInterface their intended functionality (which depends on third-party packages, such as Tables.jl, Distributions.jl and CategoricalArrays.jl); and (ii) to provide functionality essential to the MLJ user that has not, for one reason or another, been relegated to its own "satellite" repository. See the MLJBase.jl readme for a detailed description of MLJBase's contents.

  • MLJModels.jl hosts the MLJ registry, which contains metadata on all the models the MLJ user can search for and load from within MLJ. It also provides the functionality for loading model code on demand, and it furnishes some commonly used transformers for data pre-processing, such as ContinuousEncoder and Standardizer (see the registry and transformer sketch after this list).

  • MLJTuning.jl provides MLJ's TunedModel wrapper for hyper-parameter optimization, including the extendable API for tuning strategies, and selected in-house implementations, such as Grid and RandomSearch (see the tuning sketch after this list).

  • MLJIteration.jl provides the IteratedModel wrapper for controlling iterative models (snapshots, early-stopping criteria, etc.); see the iteration sketch after this list.

  • MLJSerialization.jl provides functionality for saving MLJ machines to file (see the serialization sketch after this list).

  • MLJOpenML.jl provides integration with the OpenML data science exchange platform.

  • (⟂) MLJLinearModels.jl is an experimental package providing a wide range of Julia-native penalized linear models, such as lasso, elastic net, robust regression and LAD regression (see the linear-model sketch after this list).

  • MLJFlux.jl is an experimental package for using neural-network models, built with Flux.jl, in MLJ (see the MLJFlux sketch after this list).

  • (⟂) ScientificTypes.jl is an ultra-lightweight package providing "scientific" types, such as Continuous, OrderedFactor, Image and Table. Its purpose is to formalize conventions around the scientific interpretation of ordinary machine types, such as Float32 and DataFrame (see the scientific-types sketch after this list).

  • (⟂) MLJScientificTypes.jl articulates MLJ's own convention for the scientific interpretation of data.

  • (⟂) StatisticalTraits.jl is an ultra-lightweight package defining fall-back implementations for a collection of traits possessed by statistical objects.

  • (⟂) DataScienceTutorials collects tutorials on how to use MLJ, which are deployed here.
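
The minimal sketches below illustrate typical usage of several of the repositories above. They are indicative only, not excerpted from the respective manuals, and the data sets, hyper-parameters and file names appearing in them are chosen purely for illustration.

First, a sketch of the basic MLJ.jl workflow (choosing, loading, evaluating a model), assuming the DecisionTree.jl package providing DecisionTreeClassifier is installed:

```julia
using MLJ

# a small built-in dataset (features as a table, target as a categorical vector)
X, y = @load_iris

# load model code on demand; DecisionTree.jl must be in the environment
Tree = @load DecisionTreeClassifier pkg=DecisionTree
tree = Tree()                         # instantiate with default hyper-parameters

mach = machine(tree, X, y)            # bind the model to data
evaluate!(mach, resampling=CV(nfolds=5), measure=log_loss)
```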
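
A sketch of the MLJModelInterface.jl contract, as seen by a package implementing a model. The model `MyRidge` and its `lambda` hyper-parameter are invented here for illustration; the `fit`/`predict` signatures and the `MMI.matrix` helper belong to the interface:

```julia
import MLJModelInterface
const MMI = MLJModelInterface
using LinearAlgebra: I

# a hypothetical deterministic supervised model with one hyper-parameter
mutable struct MyRidge <: MMI.Deterministic
    lambda::Float64
end
MyRidge(; lambda=0.1) = MyRidge(lambda)

function MMI.fit(model::MyRidge, verbosity, X, y)
    A = MMI.matrix(X)                           # convert any Tables.jl table to a matrix
    fitresult = (A'A + model.lambda*I) \ (A'y)  # ridge normal equations
    cache, report = nothing, NamedTuple()
    return fitresult, cache, report
end

MMI.predict(::MyRidge, fitresult, Xnew) = MMI.matrix(Xnew) * fitresult
```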
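
A sketch of what MLJModels.jl enables from the user's side: querying the registry for models matching some data, and applying one of the bundled transformers, here Standardizer:

```julia
using MLJ

X, y = @load_iris

# query the registry for models that can handle this data
models(matching(X, y))

# a commonly used transformer furnished by MLJModels
mach = machine(Standardizer(), X) |> fit!
Xstd = transform(mach, X)
```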
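
A sketch of hyper-parameter optimization with MLJTuning.jl's TunedModel wrapper, using the Grid strategy over a single range (the model and range are illustrative):

```julia
using MLJ

X, y = @load_iris
Tree = @load DecisionTreeClassifier pkg=DecisionTree
tree = Tree()

r = range(tree, :max_depth, lower=1, upper=6)
tuned_tree = TunedModel(model=tree,
                        tuning=Grid(resolution=6),
                        range=r,
                        resampling=CV(nfolds=3),
                        measure=log_loss)

mach = machine(tuned_tree, X, y) |> fit!
fitted_params(mach).best_model       # the optimal model found
```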
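
A sketch of MLJIteration.jl's IteratedModel wrapper, here controlling an EvoTrees booster (assumed installed). Step, Patience and NumberLimit are among the provided controls, though the particular combination shown is illustrative:

```julia
using MLJ

X, y = @load_boston
Booster = @load EvoTreeRegressor pkg=EvoTrees
booster = Booster()

iterated_booster = IteratedModel(model=booster,
                                 resampling=Holdout(fraction_train=0.8),
                                 measure=rms,
                                 controls=[Step(10), Patience(5), NumberLimit(50)])

mach = machine(iterated_booster, X, y) |> fit!   # trains until a control triggers a stop
```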
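
A sketch of the machine-saving functionality provided by MLJSerialization.jl, as exposed through MLJ at the time of writing (JLSO file format); the file name is illustrative:

```julia
using MLJ

X, y = @load_iris
Tree = @load DecisionTreeClassifier pkg=DecisionTree
mach = machine(Tree(), X, y) |> fit!

MLJ.save("tree.jlso", mach)   # serialize the fitted machine to file
mach2 = machine("tree.jlso")  # restore it in a later session
predict(mach2, X)
```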
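
A sketch of using MLJLinearModels.jl through MLJ, here fitting a lasso model (the regularization strength is arbitrary):

```julia
using MLJ

X, y = @load_boston
Lasso = @load LassoRegressor pkg=MLJLinearModels
lasso = Lasso(lambda=0.5)

mach = machine(lasso, X, y) |> fit!
fitted_params(mach)           # coefficients and intercept
```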
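
A sketch of MLJFlux.jl usage, assuming MLJFlux and Flux are installed; Short is one of MLJFlux's built-in network builders, and the hyper-parameters shown are illustrative:

```julia
using MLJ
import MLJFlux

X, y = @load_iris
NNClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
clf = NNClassifier(builder=MLJFlux.Short(n_hidden=8), epochs=20)

mach = machine(clf, X, y) |> fit!
predict(mach, X)
```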
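
Finally, a sketch of the scientific-types idea articulated by ScientificTypes.jl and MLJScientificTypes.jl: distinguishing machine types from their intended scientific interpretation, and coercing data to that interpretation:

```julia
using MLJ    # re-exports scitype, coerce and schema

scitype(4.7)       # Continuous
scitype(3)         # Count
scitype("male")    # Textual

# coerce a column of a table to the intended scientific type
X = (age = [23, 41, 30], gender = ["m", "f", "f"])
Xc = coerce(X, :gender => Multiclass)
schema(Xc)         # machine types alongside scientific types
```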