I'm a little confused by the following line. K has dimensions (target_dim * n * n), where n is the number of training points.
line 71:

```python
L = tf.cholesky(K + self.noise[:, None, None]*batched_eye)
```
In the reference paper and implementation, they do moment matching for every target dimension.
line 50:

```matlab
for i=1:E % compute K and inv(K)
  inp = bsxfun(@rdivide,gpmodel.inputs,exp(X(1:D,i)'));
  K(:,:,i) = exp(2*X(D+1,i)-maha(inp,inp)/2);
  if isfield(gpmodel,'nigp')
    L = chol(K(:,:,i) + exp(2*X(D+2,i))*eye(n) + diag(gpmodel.nigp(:,i)))';
  else
    L = chol(K(:,:,i) + exp(2*X(D+2,i))*eye(n))';
  end
  iK(:,:,i) = L'\(L\eye(n));
  beta(:,i) = L'\(L\gpmodel.targets(:,i));
end
```
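Just to make sure I'm reading that loop correctly, here is my own rough numpy paraphrase of what it computes per target dimension (the names are mine, this is not code from either repo, and I'm assuming `noise[i]` already holds the noise variance `exp(2*X(D+2,i))`):

```python
import numpy as np

def per_dim_ik_beta(K, noise, targets):
    """K: (E, n, n) kernel matrices, noise: (E,), targets: (n, E) -- my notation."""
    E, n, _ = K.shape
    iK = np.zeros_like(K)
    beta = np.zeros((n, E))
    for i in range(E):
        # Cholesky of K_i + sigma_i^2 * I, as in the MATLAB chol(...)'
        L = np.linalg.cholesky(K[i] + noise[i] * np.eye(n))
        # iK(:,:,i) = L'\(L\eye(n))  ->  (K_i + sigma_i^2 I)^{-1}
        iK[i] = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(n)))
        # beta(:,i) = L'\(L\targets(:,i))
        beta[:, i] = np.linalg.solve(L.T, np.linalg.solve(L, targets[:, i]))
    return iK, beta
```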
Is computing the inverse of K + noise for all target dimensions at once the same as stacking the inverses computed one target dimension at a time inside the for loop? I know you tested the code, but I'm still confused and couldn't find any information on TensorFlow's Cholesky decomposition and solve for tensors with 3 or more dimensions.
The same question applies to the rest of the moment matching implementation.
Regards
I believe that the documentation of tf.linalg.cholesky answers your question. Calling the Cholesky on an n-dimensional tensor simply computes the Cholesky decompositions of all the 2-dimensional tensors (i.e. matrices) [..., :, :] contained in the original tensor.
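Here's a quick numerical sketch of that (toy data, TF2 eager execution, not code from this repo) showing that the batched cholesky/cholesky_solve calls give the same iK and beta as a per-dimension loop:

```python
import numpy as np
import tensorflow as tf

E, n = 3, 5                                          # target dims, training points
rng = np.random.default_rng(0)
A = rng.standard_normal((E, n, n))
K = A @ np.transpose(A, (0, 2, 1)) + n * np.eye(n)   # (E, n, n), symmetric positive definite
noise = rng.uniform(0.1, 1.0, size=E)                # per-dimension noise variances
Y = rng.standard_normal((n, E))                      # training targets

batched_eye = tf.eye(n, batch_shape=[E], dtype=tf.float64)
Kn = tf.constant(K) + tf.constant(noise)[:, None, None] * batched_eye

# One batched call: Cholesky of each (n, n) slice K[i] + noise[i]*I.
L = tf.linalg.cholesky(Kn)
iK = tf.linalg.cholesky_solve(L, batched_eye)                     # (E, n, n)
beta = tf.linalg.cholesky_solve(L, tf.constant(Y.T)[:, :, None])  # (E, n, 1)

# Same quantities computed one target dimension at a time, as in the MATLAB loop.
iK_loop = np.stack([np.linalg.inv(K[i] + noise[i] * np.eye(n)) for i in range(E)])
beta_loop = np.stack([np.linalg.solve(K[i] + noise[i] * np.eye(n), Y[:, i]) for i in range(E)])

print(np.allclose(iK.numpy(), iK_loop))               # True
print(np.allclose(beta.numpy()[:, :, 0], beta_loop))  # True
```

So the single batched call and the MATLAB-style loop compute the same thing; the batched version just vectorizes over the leading target-dimension axis instead of looping over it.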