
v0.3.0

@julia-tagbot julia-tagbot released this 23 Aug 11:23
b64d2b0

v0.3.0 (2019-08-21)

Diff since v0.2.5

  • Introduction of traits for measures (loss functions, etc.); see the top
    of /src/measures.jl for definitions. This:

    • allows the user to apply loss functions from LossFunctions.jl,
    • enables improved checks and error reporting for measures,
    • allows evaluate! to report per-observation measures when
      available (for later use by Bayesian optimisers, for example),
    • allows sample-weighted measures to play nicely with the rest of
      the API.
  • Improvements to resampling:

    • the evaluate! method now reports per-observation measures when
      available,
    • sample weights can be passed to evaluate! for use by measures
      that support them,
    • the user can pass a list of train/evaluation pairs of row indices
      directly to evaluate!, in place of a ResamplingStrategy object,
    • implementing a new ResamplingStrategy is now straightforward
      (see the docs),
    • one can call evaluate (no exclamation mark) directly on model +
      data, without first constructing a machine, if desired.
  • Doc strings and the manual have been revised and updated. The
    manual includes a new section "Tuning models", and extra material
    under "Learning networks" explaining how to export learning
    networks as stand-alone models using the @from_network macro.

  • Improved checks and error-reporting for binding models to data in
    machines.

  • (Breaking) CSV is now an optional dependency, which means you must
    now import CSV before you can load tasks with load_boston(),
    load_iris(), load_crabs(), load_ames(), or load_reduced_ames().

  • Added a schema method for tables (re-exported from
    ScientificTypes.jl). It returns a named tuple with keys :names,
    :types, :scitypes and :nrows.

  • (Breaking) The scitypes method has been eliminated. The scientific
    types of a table are now returned as part of the ScientificTypes
    schema method (see above).
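
The resampling and schema changes above can be sketched as follows. This is a
minimal, illustrative example, assuming MLJ v0.3.0 and a registered model such
as DecisionTreeRegressor are installed; the data, the row-index pairs, and the
keyword names are assumptions based on the notes above, not verified against
the v0.3.0 API.

```julia
using MLJ

# Any Tables.jl-compatible table will do; this one is synthetic:
X = (x1 = rand(100), x2 = rand(100))
y = rand(100)

# New schema method: returns a named tuple with keys
# :names, :types, :scitypes and :nrows.
schema(X)

@load DecisionTreeRegressor
model = DecisionTreeRegressor()

# Call evaluate directly on model + data, without first
# constructing a machine:
evaluate(model, X, y, resampling=CV(nfolds=5), measure=rms)

# Or pass explicit train/evaluation pairs of row indices in place
# of a ResamplingStrategy object, with sample weights for measures
# that support them:
pairs = [(1:80, 81:100), (21:100, 1:20)]
w = rand(100)
mach = machine(model, X, y)
evaluate!(mach, resampling=pairs, measure=rms, weights=w)
```

When per-observation measures are available, evaluate! reports them alongside
the aggregated scores, which is what enables downstream consumers such as
Bayesian optimisers to make use of them.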

Closed issues:

  • Migrate @load macro to MLJBase.jl? (#208)
  • Loss functions in MLJ (#205)
  • Missing package dependency? (#204)
  • Test for MLJModels/Clustering.jl gives warning "implicit dims=2... (#202)
  • TunedModel objects not displaying correctly (#197)
  • DecisionTreeRegressor fails to predict (#193)
  • Control verbosity of @load macro (#192)
  • How to know which models are regression models? (#191)
  • Error loading the package (#190)
  • Data science and ML ontologies in MLJ (#189)
  • Machine fit! from the model not working (#187)
  • How can I extract a fitresult that does not contain any of the original data? (#186)
  • @from_network not working if MLJBase not in load path. (#184)
  • Improve nested parameter specification in tuning (#180)
  • Resampling strategies should have option for independent RNG (#178)
  • Fitting SVC machine changes hyperparameters (#172)
  • range(SVC, :gamma, ...) returns NominalRange instead of NumericRange (#170)
  • Local support at the ATI? (#169)
  • range(tree, :n, ...) not working for tree=DecisionTreeRegressorClassfier (#168)
  • Add some transformers from MultivariateStats.jl (#167)
  • Use of MLJRegistry (#165)
  • No method matching build_tree (#164)
  • Multiple learning curves just repeating the first curve (#163)
  • Register v0.2.5 (#162)
  • Register v0.2.4 (#160)
  • Issue with Documentation/Example - DecisionTreeClassifier again... (#156)
  • Convert example/xgboost.jl into notebook (#148)
  • GSoC Project Proposal (#78)
  • Implement MLJ interface for linear models (#35)

Merged pull requests: