Commit
The existing code performs dim separate subtractions, each of a term that is a product of two values. By reordering, we can first accumulate the products (which is a dot product) and then subtract the result once. This should allow for some vectorization. The performance gain is almost certainly negligible, but it makes the code marginally easier to read. The indices involved allow this because 'jacobian_pushed_forward_grads[i]' happens to be a Tensor<3,dim> and 'shape_gradients[k][i]' is a Tensor<1,dim>, so their product contracts over the last index of the first operand, which is exactly the summation that was written out explicitly before.
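The reordering can be illustrated with a minimal sketch. This is not the actual deal.II code or its tensor types; it uses plain `std::array`s of length `dim` as hypothetical stand-ins for one slice of `jacobian_pushed_forward_grads[i]` and for `shape_gradients[k][i]`, to show how the per-term subtractions become a single dot-product accumulation followed by one subtraction:

```cpp
#include <array>
#include <cstddef>

constexpr std::size_t dim = 3;

// Before: subtract each product term individually inside the loop.
double subtract_termwise(double value,
                         const std::array<double, dim> &a,
                         const std::array<double, dim> &b)
{
  for (std::size_t d = 0; d < dim; ++d)
    value -= a[d] * b[d];
  return value;
}

// After: accumulate the products first (a plain dot product, which the
// compiler can vectorize), then subtract the accumulated result once.
double subtract_dot(double value,
                    const std::array<double, dim> &a,
                    const std::array<double, dim> &b)
{
  double dot = 0.;
  for (std::size_t d = 0; d < dim; ++d)
    dot += a[d] * b[d];
  return value - dot;
}
```

Both variants compute the same quantity; the second merely groups the summation so it reads as "value minus a dot product", mirroring the contraction of the tensors' indices described above.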