
Bandits regressors for model selection (new PR to use Github CI/CD) #397

Merged
merged 30 commits on Jan 4, 2021
Changes from 27 commits
Commits (30)
da7a823
[WIP] first layout for bandits classes
etiennekintzler Nov 15, 2020
13154a6
improve docstring, rm class not in PR, clean some
etiennekintzler Nov 22, 2020
b0fefc6
delete nb from pr, clean some
etiennekintzler Nov 22, 2020
e128e61
enumerate all parameters in __init__, fix \epsilon
etiennekintzler Nov 23, 2020
a50693f
align on convention: single quote and import
etiennekintzler Nov 23, 2020
22c7923
rm print_every, change print_info->__repr__, skip line after class
etiennekintzler Nov 23, 2020
afe1cac
substitute stdlib for numpy
etiennekintzler Nov 23, 2020
3c25c93
rm metrics tracing, add _learn_one for powerusers
etiennekintzler Nov 24, 2020
860e779
forget to rm object tracing in class __init__ signature
etiennekintzler Nov 24, 2020
9fe14f1
add type to models, _default_params, use '+=' for list append
etiennekintzler Nov 24, 2020
f0b1b6c
change parameters in _default_params
etiennekintzler Nov 25, 2020
9cb720c
intercept_lr instead of lr in LinearRegression
etiennekintzler Nov 25, 2020
d3169dc
mv argmax to utils.math
etiennekintzler Nov 25, 2020
c7b20ca
fix mistake: EpsilonGreedyRessor didnt inherit from base.Regressor
etiennekintzler Nov 25, 2020
bf1fcf5
Merge branch 'master' into bandits_regressors_PR
etiennekintzler Nov 26, 2020
35d37f8
add seed arg for reproducibility/tests
etiennekintzler Nov 29, 2020
0ba1cc1
add typing for seed parameter, rm seed from _default_params
etiennekintzler Dec 1, 2020
02097ae
first draft for docstring's example
etiennekintzler Dec 11, 2020
a954fb7
raw: sigmoid scaler, warm_up, mv explore_each_arm in Bandit,cut Examp…
etiennekintzler Dec 19, 2020
af32a24
Merge branch 'master' into bandits_regressors_PR
etiennekintzler Dec 19, 2020
7bea5bb
chg classmethod's name _default_params to _unit_test_params, fix star…
etiennekintzler Dec 19, 2020
513a982
run black, test commit hook
etiennekintzler Dec 21, 2020
929c035
more docstring; add randomize argmax, default value for metric; rm _n…
etiennekintzler Dec 24, 2020
99926a0
possibility to add function with seed in argmax
etiennekintzler Dec 27, 2020
69b0d11
fix docstring Example output
etiennekintzler Dec 27, 2020
8e6963a
small modif to docstring
etiennekintzler Dec 27, 2020
14420ef
Merge branch 'master' into bandits_regressors_PR
etiennekintzler Dec 27, 2020
8599730
shorten some lines
etiennekintzler Dec 27, 2020
0cb664e
docstring effort (not finished), rm c=1 parameter from _compute_scale…
etiennekintzler Dec 28, 2020
8d5c0a1
cosmetic
etiennekintzler Jan 4, 2021
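
The commit messages above describe the selection mechanics: an epsilon-greedy policy that explores a random candidate model with probability epsilon and otherwise exploits the best-scoring one, using a seeded argmax with random tie-breaking (moved to utils.math) for reproducibility. The following is a minimal illustrative sketch of that selection step only, not river's actual implementation; epsilon_greedy_select and the reward bookkeeping are hypothetical names.

import random

def epsilon_greedy_select(rewards, epsilon, rng):
    """Hypothetical helper: explore a random arm with probability epsilon,
    otherwise exploit the arm with the highest average reward, breaking
    ties at random (in the spirit of the 'randomize argmax' commit)."""
    if rng.random() < epsilon:
        return rng.randrange(len(rewards))
    best = max(rewards)
    candidates = [i for i, r in enumerate(rewards) if r == best]
    return rng.choice(candidates)

rng = random.Random(42)           # seed passed in for reproducibility, as in commit 35d37f8
avg_rewards = [0.20, 0.55, 0.55]  # one running reward per candidate model
arm = epsilon_greedy_select(avg_rewards, epsilon=0.1, rng=rng)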
river/expert/__init__.py: 4 additions & 0 deletions
@@ -16,15 +16,19 @@

"""

+from .bandit import EpsilonGreedyRegressor
+from .bandit import UCBRegressor
from .ewa import EWARegressor
from .sh import SuccessiveHalvingClassifier
from .sh import SuccessiveHalvingRegressor
from .stacking import StackingClassifier


__all__ = [
+    "EpsilonGreedyRegressor",
    "EWARegressor",
    "SuccessiveHalvingClassifier",
    "SuccessiveHalvingRegressor",
    "StackingClassifier",
+    "UCBRegressor",
]
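
For context, here is a hedged usage sketch of the newly exported classes. Only EpsilonGreedyRegressor, UCBRegressor, and the standard river predict_one/learn_one loop come from the source; the exact constructor arguments (a list of candidate models, epsilon, seed) are an assumption based on the commit history.

from river import expert, linear_model, optim, preprocessing

# Assumed signature: candidate models, an exploration rate, and a seed.
models = [
    preprocessing.StandardScaler() | linear_model.LinearRegression(optim.SGD(lr))
    for lr in (0.01, 0.05, 0.1)
]
bandit = expert.EpsilonGreedyRegressor(models=models, epsilon=0.1, seed=42)

# Standard river online loop: predict, then learn. The bandit routes each
# observation to one arm and updates that arm's reward internally.
for x, y in [({"x": 1.0}, 2.0), ({"x": 2.0}, 4.0)]:
    y_pred = bandit.predict_one(x)
    bandit.learn_one(x, y)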