Wrap Scikitlearn.jl Elastic Net algorithms #112
Comments
Some extra details: ElasticNetCV | Plain ElasticNet
Just out of curiosity: why wrap scikit-learn when Julia already has a wrapper around the original Fortran code (https://github.com/JuliaStats/GLMNet.jl) as well as a native Julia implementation (https://github.com/JuliaStats/Lasso.jl)?
Yes, I am aware of Lasso.jl, having implemented Koala's interface for it. We are doing the sklearn wrap because it looked quick and easy for @ysimillides, who has already done some sklearn wraps, and I am busy with other things. It also struck me that Lasso.jl needed a little TLC, although maybe that's changed? If you were interested in doing implementations for GLMNet or Lasso, I'd be very happy to provide guidance.
done
sklearn.linear_model.ElasticNet: this will be `ElasticNet <: Deterministic{Any}`, with `target_type(::ElasticNet) = MLJBase.Continuous` and `input_types(::ElasticNet) = MLJBase.Continuous`.

sklearn.linear_model.ElasticNetCV: this will be `ElasticNetCV <: Deterministic{Any}`, with the same `target_type` and `input_types` as above. Note the sklearn model has `verbose` as a hyperparameter, which we will drop from the MLJ model. The MLJ `fit` will pass either 0 or 1 to the sklearn `fit`, according to whether the MLJ `verbosity` is <= 0 or > 0.
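For reference, here is a minimal Python sketch of the two scikit-learn estimators being wrapped and of the verbosity mapping described above. The helper `mlj_to_sklearn_verbose` is a hypothetical illustration of the proposed rule, not part of either library:

```python
import numpy as np
from sklearn.linear_model import ElasticNet, ElasticNetCV

def mlj_to_sklearn_verbose(verbosity: int) -> int:
    # Proposed mapping: MLJ verbosity <= 0 -> sklearn verbose=0,
    # MLJ verbosity > 0 -> sklearn verbose=1.
    return 1 if verbosity > 0 else 0

# Toy regression data with a continuous target, matching the
# MLJBase.Continuous target/input types of the MLJ wrappers.
rng = np.random.RandomState(0)
X = rng.randn(50, 3)
y = X @ np.array([1.0, 0.5, -2.0]) + 0.01 * rng.randn(50)

# Plain ElasticNet: fixed regularization hyperparameters.
model = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)

# ElasticNetCV: selects alpha by cross-validation; this is the model
# whose `verbose` hyperparameter the MLJ wrapper drops and drives
# from the MLJ verbosity instead.
cv_model = ElasticNetCV(cv=5, verbose=mlj_to_sklearn_verbose(0)).fit(X, y)

print(model.coef_.shape)  # one coefficient per input feature: (3,)
print(mlj_to_sklearn_verbose(-1), mlj_to_sklearn_verbose(2))
```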