baseline reduction: separate learning of additive regression baseline #1336
Conversation
The Windows build failure here: https://ci.appveyor.com/project/JohnLangford/vowpal-wabbit/build/1.0.2255#L2307 is presumably because the Windows build doesn't include the new file.
vowpalwabbit/baseline.cc (outdated)

```cpp
void predict_or_learn(baseline& data, base_learner& base, example& ec)
{ if (is_learn)
  { // do a full prediction, for safety in accurate predictive validation
    base.predict(ec);
```
You can factor base.predict() out of the if/else for simplicity.
This looks good to go other than the minor refactoring and fixing the Windows build around here: https://github.com/JohnLangford/vowpal_wabbit/blob/master/vowpalwabbit/vw_dynamic.vcxproj#L436 . Can you tweak?
Some comments:
Merged in, thanks.
This reduction allows a regression learner to separately learn an additive baseline prediction from only "constant" features (taken from the constant_namespace), and the residual on top of that. This seems to make it faster to learn a possibly large constant offset in practice.
cc @JohnLangford