Correct transformation of second-order tensors in adapters #304
Comments
For non-linear transformations, I see little hope that we can make it work generally for point estimates. There are some combinations of point estimates and transformations that work, such as quantiles and strictly monotonic transforms. But I think that, for now, we should warn somewhere when a point estimate is passed through a non-linear transformation and recommend to just not use these transformations with PointInferenceNetworks.

Standardize is a bit of a different situation, both because it is linear and because it is so important for the networks to work in a stable manner in lots of situations. So means and quantiles can just be back-transformed through standardize and all is well. Now, for a covariance matrix the situation is different, as you highlight above. The same actually holds for the variance alone, since we need to square the scaling.

I suggest something like the following solution to this general problem (the covariance matrix is even harder than this): every point estimation loss implemented natively by us (e.g. squared loss) knows which point estimate it produces (e.g., mean). So we can enrich it with a structure telling us what to do with inverse transforms. This in turn could be a dict of 3 options.
This approach of course needs a bit of work for each point estimation loss, but I believe it can be done in a structured and tidy way such that the maintenance effort of this feature would be small. What do you think @han-ol and @stefanradev93?
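A minimal sketch of what such an enrichment could look like, assuming hypothetical names (`MeanScore`, `QuantileScore`, and `inverse_affine` are illustrative, not the actual BayesFlow API) and assuming the dict of options amounts to a tag describing how each estimate behaves under the inverse of an affine adapter transform such as standardize:

```python
import numpy as np


def inverse_affine(estimate, rule, shift, scale):
    """Undo a standardize-style transform x = scale * z + shift for a point estimate.

    The rule tag comes from the point estimation loss and says how the estimate
    transforms: like a location, like a scale, or like a covariance matrix.
    """
    if rule == "location":      # means, medians, quantiles: scale and shift
        return scale * estimate + shift
    if rule == "scale":         # standard deviations: scale only, no shift
        return scale * estimate
    if rule == "covariance":    # covariance matrices: scale from both sides
        d = np.diag(scale)
        return d @ estimate @ d
    raise ValueError(f"Unknown inverse-transform rule: {rule}")


class MeanScore:
    """Squared loss; its optimal point estimate is the posterior mean."""
    point_estimate = "mean"
    inverse_rule = "location"


class QuantileScore:
    """Quantile (pinball) loss; quantiles also transform like locations."""
    point_estimate = "quantiles"
    inverse_rule = "location"


# Usage: back-transform a mean estimated in standardized coordinates.
shift, scale = np.array([1.0, -2.0]), np.array([0.5, 3.0])
estimate = np.array([0.2, -0.1])
print(inverse_affine(estimate, MeanScore.inverse_rule, shift, scale))
```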
Until we resolve this issue fully in the future, the changes introduced in PR #380 warn the user in this situation. As it stands, the warning is raised even if the adapter does not contain a problematic transformation. To be more specific in raising a warning, we would need a reliable way to determine whether (a part of) a variable that is eventually renamed to "inference_variables" is standardized at some point.
@han-ol do you think we can close this issue as resolved for now? Both Stefan and I are fine with the warning as it is for now.
Currently, only first-order tensors, a.k.a. vectors, can be transformed in both the forward and the inverse direction through an adapter object. Can we also transform tensors of other orders, particularly second-order tensors?
The question arises because a PointInferenceNetwork can estimate the covariance matrix of the inference_variables.

Some adapters represent a change of basis and origin; for example, standardize is a linear coordinate transformation from "original coordinates" to "standardized coordinates". Thus, the inference_variables live in the standardized coordinates, and the covariance matrix needs to be transformed in the inverse direction to relate to the unstandardized coordinates.
When a covariance matrix is estimated in the standardized coordinates, the inverse of the coordinate transformation is different from naively transforming the matrix columns as if they were vectors in standardized coordinates. Rather, a covariance matrix $\Sigma$ transforms as an order-2 tensor: the basis change matrix $A$ is multiplied from both sides, $\Sigma' = A \Sigma A^\top$. For standardize, $A = \mathrm{diag}(\sigma_1, \ldots, \sigma_d)$ with the standard deviations $\sigma_i$ for each dimension, so $\Sigma_{\text{original}} = \mathrm{diag}(\sigma)\, \Sigma_{\text{standardized}}\, \mathrm{diag}(\sigma)$.

Solutions to this issue are likely related to a separate issue on keeping track of Jacobians of adapter transforms, as mentioned in #245.
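A minimal NumPy sketch of the back-transformation described above, assuming a standardize transform x = sigma * z + mu applied per dimension (variable names are illustrative, not part of the adapter API):

```python
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([1.0, -2.0])       # per-dimension shift used by standardize
sigma = np.array([0.5, 3.0])     # per-dimension standard deviation

cov_std = np.array([[1.0, 0.3],  # covariance estimated in standardized coordinates
                    [0.3, 1.0]])

# Correct order-2 tensor transform: multiply the diagonal basis-change matrix
# from both sides.
D = np.diag(sigma)
cov_orig = D @ cov_std @ D

# Sanity check by sampling: standardized draws pushed through the inverse of
# standardize have (approximately) the covariance computed above.
z = rng.multivariate_normal(np.zeros(2), cov_std, size=200_000)
x = sigma * z + mu
print(np.round(np.cov(x, rowvar=False), 2))  # close to cov_orig
print(np.round(cov_orig, 2))
```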