Avoid a couple FP subtractions. #8574

Merged (1 commit, Aug 20, 2019)

Commits on Aug 19, 2019

  1. Avoid a couple FP subtractions.

    The existing code performs dim separate subtractions, each of a
    term that is the product of two values. We can reorder the
    computation so that we first accumulate the products (which is a
    dot product) and then subtract the result in a single step. This
    should allow for some vectorization.
    
    The performance gain is almost certainly completely negligible,
    but the change makes the code marginally easier to read. The
    indices involved allow for this because
    'jacobian_pushed_forward_grads[i]' happens to be a Tensor<3,dim>
    while 'shape_gradients[k][i]' is a Tensor<1,dim>: the product of
    these two types is exactly the contraction over the last index
    that was previously written out by hand (see the sketch below).
    bangerth committed Aug 19, 2019
    Commit b0edf30
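The commit message describes the before/after forms only in words. The following is a minimal sketch of the transformation, not the actual deal.II code that was changed: the function names and the surrounding loop structure are invented for illustration, and only the tensor ranks (Tensor<3,dim> and Tensor<1,dim>) come from the message itself.

```cpp
#include <deal.II/base/tensor.h>

using namespace dealii;

// Before: dim subtractions per entry, each of a single product,
// i.e. the contraction over the last index is written out by hand.
template <int dim>
void
subtract_written_out(Tensor<2, dim>       &shape_hessian,
                     const Tensor<3, dim> &jacobian_pushed_forward_grad,
                     const Tensor<1, dim> &shape_gradient)
{
  for (unsigned int i = 0; i < dim; ++i)
    for (unsigned int j = 0; j < dim; ++j)
      for (unsigned int k = 0; k < dim; ++k)
        shape_hessian[i][j] -=
          jacobian_pushed_forward_grad[i][j][k] * shape_gradient[k];
}

// After: contract first (deal.II's operator* contracts the last index
// of the rank-3 tensor with the rank-1 tensor, i.e. a dot product per
// entry), then perform a single subtraction of the resulting
// Tensor<2,dim>.
template <int dim>
void
subtract_contracted(Tensor<2, dim>       &shape_hessian,
                    const Tensor<3, dim> &jacobian_pushed_forward_grad,
                    const Tensor<1, dim> &shape_gradient)
{
  shape_hessian -= jacobian_pushed_forward_grad * shape_gradient;
}
```

Both forms compute the same entries; the second expresses the inner loop as a tensor contraction, so the dot product is accumulated first and subtracted once, which is what gives the compiler a chance to vectorize it.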