In the documentation for ARDRegression (https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.ARDRegression.html), the shape of sigma_ is given as:

sigma_ : array-like of shape (n_features, n_features)

However, if parameters are pruned from the model, the sigma_ matrix shrinks: for example, if 10 parameters are pruned, sigma_ will be of shape (n_features - 10, n_features - 10).

This could easily be fixed in the documentation, but it would of course be nice to have access to the full sigma_ matrix.
Hi, I'm new here and I would like to contribute to this issue. I just want to make sure I fully understand you: could you explain a bit more what you mean by "to have access to the full sigma_ matrix"?
Thank you
Hi,

Yes, in my linear model I have 50 parameters (weights), and sigma_ is the covariance matrix for these parameters. So I would expect sigma_[i][j] to be the covariance between parameter i and parameter j, and hence sigma_ to be of shape (50, 50).

However, if ARDRegression prunes 10 parameters, the shape of sigma_ shrinks to (40, 40).

So if I now want to find the covariance between parameters 32 and 48, I don't know how to do it. I'm guessing I would have to re-index them, e.g. 32 --> 25 and 48 --> 39, based on which parameters were pruned.

What I meant by the full sigma_ matrix was just an idea: if it would be possible to keep the size (50, 50) and fill in zeros for pruned parameters, no re-indexing would be necessary.
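A minimal sketch of the behavior and one possible workaround. The synthetic data and the number of pruned features are assumptions; the workaround rebuilds a full-size covariance matrix from the keep mask implied by threshold_lambda (depending on the scikit-learn version, sigma_ may already have the full size, which the code also handles):

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.RandomState(0)
n_samples, n_features = 100, 10
X = rng.randn(n_samples, n_features)
# only the first 3 features carry signal; ARD should prune the rest
w = np.zeros(n_features)
w[:3] = [1.0, 2.0, 3.0]
y = X @ w + 0.01 * rng.randn(n_samples)

reg = ARDRegression().fit(X, y)
print(reg.coef_.shape)   # always (n_features,)
print(reg.sigma_.shape)  # may be smaller than (n_features, n_features) after pruning

# parameters are kept while their precision stays below threshold_lambda
keep = reg.lambda_ < reg.threshold_lambda

# embed the (possibly shrunken) sigma_ into a full-size matrix with
# zero rows/columns for pruned parameters, so sigma_full[i, j] is the
# covariance between parameters i and j without any re-indexing
if reg.sigma_.shape[0] == keep.sum() and keep.sum() < n_features:
    sigma_full = np.zeros((n_features, n_features))
    sigma_full[np.ix_(keep, keep)] = reg.sigma_
else:
    # this scikit-learn version already reports the full-size matrix
    sigma_full = reg.sigma_
print(sigma_full.shape)
```

With this, the covariance between parameters 32 and 48 in the 50-parameter case would just be sigma_full[32, 48], regardless of how many parameters were pruned.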