v0.3.0 (2019-08-21)
- Introduction of traits for measures (loss functions, etc.); see the top
  of /src/measures.jl for definitions. This:
  - allows the user to use loss functions from LossFunctions.jl
  - enables improved measure checks and error-message reporting for measures
  - allows `evaluate!` to report per-observation measures when
    available (for later use by Bayesian optimisers, for example)
  - allows support for sample-weighted measures, playing nicely
    with the rest of the API (see the sketch below)
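  A minimal sketch of the new measures behaviour. The `measure` and
  `weights` keyword arguments shown here, the model-loading step, and the
  toy data are assumptions for illustration only:

  ```julia
  using MLJ
  import LossFunctions: L2DistLoss      # a loss from LossFunctions.jl, usable as a measure

  X = (x1 = rand(100), x2 = rand(100))  # toy column table
  y = 2 .* X.x1 .- X.x2 .+ 0.1 .* randn(100)
  w = rand(100)                         # per-observation sample weights

  @load DecisionTreeRegressor           # bring the model code into scope
  mach = machine(DecisionTreeRegressor(), X, y)

  # per-observation measurements are included in the result when the measure
  # reports them; `weights` is forwarded to measures that support weights
  evaluate!(mach,
            resampling = Holdout(fraction_train = 0.7),
            measure    = [rms, L2DistLoss()],
            weights    = w)
  ```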
- Improvements to resampling:
  - the `evaluate!` method now reports per-observation measures when
    available
  - sample weights can be passed to `evaluate!` for use by measures
    that support weights
  - the user can pass a list of train/evaluation pairs of row indices
    directly to `evaluate!`, in place of a `ResamplingStrategy` object
  - implementing a new `ResamplingStrategy` is now straightforward (see docs)
  - one can call `evaluate` (no exclamation mark) directly on
    model + data without first constructing a machine, if desired
    (see the sketch below)
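  A sketch of the two new entry points; keyword names and the toy data are
  assumptions for illustration:

  ```julia
  using MLJ

  X = (x1 = rand(100), x2 = rand(100))
  y = 2 .* X.x1 .- X.x2 .+ 0.1 .* randn(100)

  @load DecisionTreeRegressor
  model = DecisionTreeRegressor()

  # a list of (train, evaluation) row-index pairs in place of a
  # ResamplingStrategy object:
  train_test_pairs = [(1:70, 71:100), (31:100, 1:30)]
  mach = machine(model, X, y)
  evaluate!(mach, resampling = train_test_pairs, measure = rms)

  # evaluate (no exclamation mark) called directly on model + data,
  # without constructing a machine first:
  evaluate(model, X, y, resampling = CV(nfolds = 5), measure = rms)
  ```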
- Doc strings and the manual have been revised and updated. The manual
  includes a new section "Tuning models", and extra material under
  "Learning networks" explaining how to export learning networks as
  stand-alone models using the `@from_network` macro.
- Improved checks and error-reporting for binding models to data in
  machines.
- (Breaking) CSV is now an optional dependency, which means you now
  need to import CSV before you can load tasks with `load_boston()`,
  `load_iris()`, `load_crabs()`, `load_ames()` and `load_reduced_ames()`
  (see the example below).
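  For example, the only change from before is the explicit `import CSV`:

  ```julia
  import CSV   # now required: CSV is an optional dependency of MLJ
  using MLJ

  task = load_boston()   # likewise load_iris(), load_crabs(), load_ames(), load_reduced_ames()
  ```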
- Added a `schema` method for tables (re-exported from
  ScientificTypes.jl). Returns a named tuple with keys `:names`,
  `:types`, `:scitypes` and `:nrows` (see the example below).
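  An illustrative use of `schema` on a column table; the scitypes shown
  assume the usual Float64 => Continuous, Int => Count conventions:

  ```julia
  using MLJ    # schema is re-exported from ScientificTypes.jl

  X = (height = [1.85, 1.67, 1.5],
       nkids  = [0, 2, 3])

  s = schema(X)
  s.names      # (:height, :nkids)
  s.types      # (Float64, Int64)
  s.scitypes   # (Continuous, Count)
  s.nrows      # 3
  ```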
- (Breaking) Eliminate the `scitypes` method. The scientific types of a
  table are now returned as part of the ScientificTypes `schema` method
  (see above).
Closed issues:
- Migrate @load macro to MLJBase.jl? (#208)
- Loss functions in MLJ (#205)
- Missing package dependency? (#204)
- Test for MLJModels/Clustering.jl gives warning "implicit `dims=2`" ... (#202)
- TunedModel objects not displaying correctly (#197)
- DecisionTreeRegressor fails to predict (#193)
- Control verbosity of @load macro (#192)
- How to know which models are regression models? (#191)
- Error loading the package (#190)
- Data science and ML ontologies in MLJ (#189)
- Machine `fit!` from the model not working (#187)
- How can I extract a `fitresult` that does not contain any of the original data? (#186)
- @from_network not working if MLJBase not in load path. (#184)
- Improve nested parameter specification in tuning (#180)
- Resampling strategies should have option for independent RNG (#178)
- Fitting SVC machine changes hyperparameters (#172)
- range(SVC, :gamma, ...) returns NominalRange instead of NumericRange (#170)
- Local support at the ATI? (#169)
- range(tree, :n, ...) not working for tree=DecisionTreeRegressorClassfier (#168)
- Add some transformers from MultivariateStats.jl (#167)
- Use of MLJRegistry (#165)
- No method matching build_tree (#164)
- Multiple learning curves just repeating the first curve (#163)
- Register v0.2.5 (#162)
- Register v0.2.4 (#160)
- Issue with Documentation/Example - DecisionTreeClassifier again... (#156)
- Convert example/xgboost.jl into notebook (#148)
- GSoC Project Proposal (#78)
- Implement MLJ interface for linear models (#35)
Merged pull requests:
- Update to MLJBase 0.4.0 (#212) (ablaom)
- Improve resampling/evaluation; add measures API (incl. LossFunctions) (#206) (ablaom)
- Fix link to ipynb tour (#203) (Kryohi)
- Fix197b (#201) (tlienart)
- Fix #192: Add verbosity option to @load macro (#196) (juliohm)
- Update bug_report.md (#194) (juliohm)
- Make CSV a test dependency (#185) (DilumAluthge)
- Minor fixes + docstrings (#183) (tlienart)
- typo fix in learning networks docs (#182) (tlienart)
- Add rng to resampling methods (cv and holdout) (#179) (ayush-1506)
- Change get_type implementation (#171) (oleskiewicz)