Add eta (shrinkage parameter) to xgbLinear #372
Comments
Also, both lambda and alpha are labeled L2 Regularization, but I think one of them (I can't remember which, maybe lambda?) should be L1.
I just looked it up, and alpha is L1 regularization. There is also a lambda_bias term for L2 regularization on the bias, but I've never used it.
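For reference, the linear booster's regularized objective can be written roughly as follows (this is a paraphrase of the xgboost parameter docs, not a quote; \(\ell\) is the training loss, \(w\) the feature weights, \(b\) the bias):

$$\text{obj} = \sum_i \ell(y_i, \hat{y}_i) \;+\; \lambda \lVert w \rVert_2^2 \;+\; \alpha \lVert w \rVert_1 \;+\; \lambda_{\text{bias}}\, b^2$$

So alpha controls the L1 penalty, lambda the L2 penalty on the weights, and lambda_bias the L2 penalty on the bias term.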
I'll make the change to the labels. What do you suggest for a candidate range for eta?
Most of what I've seen suggests eta between 0.05 and 0.3, but other xgboosters may have a different opinion.
It will be in the next CRAN version.
For some reason, the xgbLinear method sets the eta parameter to 0.3 in all training situations. This should be a parameter that varies. I was able to vary it using the custom code provided before xgbLinear officially became part of caret, but you should be able to change eta in the official code as well.
To wit:
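The original snippet isn't shown here, but a workaround along those lines can be sketched with caret's custom model interface: clone the built-in xgbLinear definition and add eta to its tunable parameters. This is a hypothetical, regression-only sketch (the name `xgbLinearEta` and the grid values are my own, not from the issue):

```r
library(caret)

## Clone caret's built-in xgbLinear model and expose eta as a tunable
## parameter instead of the hard-coded 0.3.
xgbLinearEta <- getModelInfo("xgbLinear", regex = FALSE)[[1]]

xgbLinearEta$parameters <- rbind(
  xgbLinearEta$parameters,
  data.frame(parameter = "eta", class = "numeric", label = "Learning Rate")
)

## Regression-only fit for brevity; the stock model also handles classification.
xgbLinearEta$fit <- function(x, y, wts, param, lev, last, classProbs, ...) {
  xgboost::xgboost(
    data    = as.matrix(x), label = y,
    booster = "gblinear",
    nrounds = param$nrounds,
    lambda  = param$lambda,
    alpha   = param$alpha,
    eta     = param$eta,   # tunable now, rather than fixed at 0.3
    objective = "reg:squarederror",
    verbose = 0
  )
}

## Supplying tuneGrid bypasses the model's default grid function.
etaGrid <- expand.grid(
  nrounds = c(50, 100),
  lambda  = 0,
  alpha   = 0,
  eta     = c(0.05, 0.1, 0.3)
)

## fit <- train(x, y, method = xgbLinearEta, tuneGrid = etaGrid)
```

Passing the list to `train(method = ...)` makes caret treat it as a custom model, so eta is tuned over the grid like any other parameter.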