Tensor polynomials: Remove nonsensical comments #15915

Merged
merged 1 commit into dealii:master on Aug 23, 2023

Conversation

@kronbichler (Member) commented Aug 22, 2023

Looking at #15913, I realized that all classes derived from TensorPolynomialsBase refer to compute_value(), compute_grad(), and compute_grad_grad() functions that are not even present in these classes. I guess this is a copy-paste effect from the scalar tensor product polynomial class, whose documentation reads:

* If you need values or derivatives of all tensor product polynomials then
* use this function, rather than using any of the compute_value(),
* compute_grad() or compute_grad_grad() functions, see below, in a loop
* over all tensor product polynomials.
That class actually does have these functions:
double
compute_value(const unsigned int i, const Point<dim> &p) const override;
etc. While I would be tempted to introduce such a function to evaluate a single basis function at a point, at least for the class PolynomialsRaviartThomas, in order to make the matrix-free initialization reasonably fast (the current method has quadratic complexity in ShapeInfo::reinit() because it evaluates the complete basis for every basis function we request via FE_PolyTensor::shape_value_component()), that would rest on a wrong assumption, and it is in fact better to work towards #9655.
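
To make the complexity argument concrete, here is a minimal standalone sketch (placeholder names such as AllAtOnceBasis and shape_value; this is not the actual deal.II code): when the only available interface fills the values of the complete basis at a point, picking out the basis functions one at a time costs a full basis evaluation per request, i.e. quadratic work in the number of basis functions.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Stand-in for a vector-valued polynomial space that, like
// TensorPolynomialsBase::evaluate(), can only fill the values of *all*
// basis functions at a given point in one call.
struct AllAtOnceBasis
{
  std::size_t n_dofs;

  void evaluate(const double point, std::vector<double> &values) const
  {
    values.resize(n_dofs);
    for (std::size_t j = 0; j < n_dofs; ++j)
      values[j] = std::pow(point, static_cast<double>(j)); // dummy basis
  }
};

// Stand-in for a shape_value_component()-style accessor: it returns a
// single entry, but has to evaluate the complete basis to obtain it.
double shape_value(const AllAtOnceBasis &basis,
                   const std::size_t     i,
                   const double          point)
{
  std::vector<double> values;
  basis.evaluate(point, values);
  return values[i];
}

int main()
{
  const AllAtOnceBasis      basis{100};
  const std::vector<double> unit_points = {0.1, 0.5, 0.9};

  double sum = 0.;
  // Tabulating every basis function at every point through the one-at-a-time
  // accessor performs n_dofs complete-basis evaluations per point, i.e.
  // O(n_dofs^2) work per point, which is the cost pattern described above.
  for (const double p : unit_points)
    for (std::size_t i = 0; i < basis.n_dofs; ++i)
      sum += shape_value(basis, i, p);

  return sum > 0. ? 0 : 1;
}
```

A per-function compute_value()-style entry point would remove the inner full-basis evaluation, but as noted above the cleaner route is the restructuring discussed in #9655.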

@bangerth merged commit 40a6383 into dealii:master on Aug 23, 2023
11 of 15 checks passed
@kronbichler deleted the remove_outdated_documentation branch on August 29, 2023