lars cookbook #3248
Merged
51 changes: 51 additions & 0 deletions
doc/cookbook/source/examples/regression/least_angle_regression.rst
@@ -0,0 +1,51 @@
=======================
Least Angle Regression
=======================

Least Angle Regression (LARS) is an algorithm used to fit a linear regression model. LARS is similar to forward stagewise regression, but less greedy: instead of including variables at each step, the estimated parameters are increased in a direction equiangular to each one's correlation with the residual. LARS can be used to solve LASSO, which is :math:`\ell_1`-regularized least squares regression:

.. math::
    \min \|X^T\beta - y\|^2 + \lambda\|\beta\|_{1}

    \|\beta\|_1 = \sum_i|\beta_i|

where :math:`X` is the feature matrix with explanatory features and :math:`y` is the dependent variable to be predicted.
Pre-processing of :math:`X` and :math:`y` is needed to ensure the correctness of this algorithm:
:math:`X` needs to be normalized: each feature should have zero mean and unit norm;
:math:`y` needs to be centered: its mean should be zero.
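The two pre-processing requirements, and the objective itself, can be sketched in plain numpy. This is only an illustrative stand-in for the :sgclass:`CPruneVarSubMean` and :sgclass:`CNormOne` preprocessors, using made-up toy data; note numpy's row-major convention means the model term is written :math:`X\beta` rather than :math:`X^T\beta`.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))   # 20 samples, 3 features (toy data)
y = rng.normal(size=20)

# Normalize X: each feature (column) gets zero mean and unit norm
X = X - X.mean(axis=0)
X = X / np.linalg.norm(X, axis=0)

# Center y: its mean should be zero
y = y - y.mean()

# The penalized least-squares cost for some candidate beta
beta = np.array([0.5, -0.2, 0.0])
lam = 0.01
cost = np.sum((X @ beta - y) ** 2) + lam * np.sum(np.abs(beta))
```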

-------
Example
-------

Imagine we have files with training and test data. We create :sgclass:`CDenseFeatures` (here 64 bit floats aka RealFeatures) and :sgclass:`CRegressionLabels` as

.. sgexample:: least_angle_regression.sg:create_features

To normalize and center the features, we create instances of the preprocessors :sgclass:`CPruneVarSubMean` and :sgclass:`CNormOne` and apply them to the feature matrices.

.. sgexample:: least_angle_regression.sg:preprocess_features

We create an instance of :sgclass:`CLeastAngleRegression` by disabling the LASSO solution, setting the penalty :math:`\lambda` for the :math:`\ell_1` norm and setting training data and labels.

.. sgexample:: least_angle_regression.sg:create_instance

Then we train the regression model and apply it to test data to get the predicted :sgclass:`CRegressionLabels`.

.. sgexample:: least_angle_regression.sg:train_and_apply

After training, we can extract :math:`{\bf w}`.

.. sgexample:: least_angle_regression.sg:extract_w

Finally, we can evaluate the :sgclass:`CMeanSquaredError`.

.. sgexample:: least_angle_regression.sg:evaluate_error

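For readers without Shogun at hand, the shape of the whole example can be sketched in numpy. This is only an illustration on synthetic data, and ordinary least squares stands in for the LARS solver itself; the pipeline (preprocess, train, apply, evaluate) is what the sketch mirrors.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the CSV training/test files
X_train = rng.normal(size=(50, 4))
X_test = rng.normal(size=(20, 4))
w_true = np.array([1.0, 0.0, -2.0, 0.0])   # sparse ground truth
y_train = X_train @ w_true + 0.1 * rng.normal(size=50)
y_test = X_test @ w_true + 0.1 * rng.normal(size=20)

# Preprocess: zero-mean/unit-norm features, centered labels
# (test data is transformed with the training statistics)
mu = X_train.mean(axis=0)
X_train = X_train - mu
nrm = np.linalg.norm(X_train, axis=0)
X_train = X_train / nrm
X_test = (X_test - mu) / nrm
y_mean = y_train.mean()
y_train = y_train - y_mean

# "Train": least squares stands in for LeastAngleRegression here
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# Apply to test data and evaluate the mean squared error
y_pred = X_test @ w + y_mean
mse = np.mean((y_pred - y_test) ** 2)
```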
----------
References
----------
:wiki:`Least-angle_regression`

:wiki:`Stepwise_regression`
@@ -0,0 +1,47 @@
CSVFile f_feats_train("../../data/regression_1d_linear_features_train.dat")
CSVFile f_feats_test("../../data/regression_1d_linear_features_test.dat")
CSVFile f_labels_train("../../data/regression_1d_linear_labels_train.dat")
CSVFile f_labels_test("../../data/regression_1d_linear_labels_test.dat")

#![create_features]
RealFeatures features_train(f_feats_train)
RealFeatures features_test(f_feats_test)
RegressionLabels labels_train(f_labels_train)
RegressionLabels labels_test(f_labels_test)
#![create_features]

#![preprocess_features]
PruneVarSubMean SubMean()
NormOne Normalize()
SubMean.init(features_train)
SubMean.apply_to_feature_matrix(features_train)
SubMean.apply_to_feature_matrix(features_test)
Normalize.init(features_train)
Normalize.apply_to_feature_matrix(features_train)
Normalize.apply_to_feature_matrix(features_test)
#![preprocess_features]

#![create_instance]
real lambda1 = 0.01
LeastAngleRegression lars(False)
lars.set_features(features_train)
lars.set_labels(labels_train)
lars.set_max_l1_norm(lambda1)
#![create_instance]

#![train_and_apply]
lars.train()
RegressionLabels labels_predict = lars.apply_regression(features_test)
#![train_and_apply]

#![extract_w]
RealVector weights = lars.get_w()
#![extract_w]

#![evaluate_error]
MeanSquaredError eval()
real mse = eval.evaluate(labels_predict, labels_test)
#![evaluate_error]

# integration testing variables
RealVector output = labels_test.get_labels()
we also want the regularisation path extraction
It's a std vector of vectors right now; it can be copied into an SGMatrix and extracted. Will do that.
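The copy described above (a vector of per-iteration weight vectors into a matrix) is, in numpy terms, just stacking the regularisation path into a 2-D array. A hypothetical sketch with made-up path values:

```python
import numpy as np

# Hypothetical regularisation path: one weight vector per LARS iteration,
# starting from all-zero weights and activating one feature at a time
path = [np.array([0.0, 0.0, 0.0]),
        np.array([0.3, 0.0, 0.0]),
        np.array([0.5, -0.2, 0.0]),
        np.array([0.6, -0.3, 0.1])]

# Stack into an (n_iterations x n_features) matrix, analogous to
# copying the std::vector of vectors into an SGMatrix
path_matrix = np.vstack(path)
```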