Adolc tensor refactor #14205

Closed
drwells wants to merge 6 commits

Conversation

drwells (Member) commented on Aug 20, 2022

More progress on #13949: this time for ADOL-C instead of Sacado.

I tested this on my machine with and without advanced branching.

@drwells added the "Automatic differentiation" and "ready to test" labels on Aug 20, 2022
@drwells force-pushed the adolc-tensor-refactor branch 2 times, most recently from e7d8ebc to 8c8bfb4 on August 20, 2022 at 21:41
Review thread on these lines of the diff:

  for (unsigned int i = 0; i < dim; ++i)
    sum += std::fabs(t[i][j]);          // accumulate |t[i][j]| over the column

  condassign(max, (sum > max), sum, max);  // max = (sum > max) ? sum : max

Member
@drwells For these taped types, one needs to use the condassign() function to switch between branches. An if statement won't work if the evaluation point changes which branch is taken.
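
For context, here is a minimal standalone sketch of the point being made. It is not code from this PR; the tape id 0, the variable names, and the `tape_max()` helper are made up for illustration.

```cpp
#include <adolc/adolc.h>

// Record a tape (arbitrary tape id 0) that computes max(x, y).
void tape_max(const double x0, const double y0)
{
  trace_on(0);

  adouble x, y, result;
  double  out;
  x <<= x0; // mark x and y as independent variables
  y <<= y0;

  // Wrong for taped types: whichever branch is taken at taping time is
  // frozen into the tape, so a later evaluation at a point where the other
  // branch should be taken silently gives the wrong result.
  //
  //   if (x > y) result = x; else result = y;

  // Correct: condassign() records the condition and both candidate values
  // on the tape, so the selection happens at evaluation time.
  condassign(result, x - y, x, y); // result = (x - y > 0) ? x : y

  result >>= out; // mark the dependent variable
  trace_off();
}
```

If I recall the ADOL-C documentation correctly, this is also where advanced branching (ADOLC_ADVANCED_BRANCHING) comes in: with it enabled, comparisons such as sum > max themselves return an adouble that is recorded on the tape, which is presumably why the PR description mentions testing with and without it.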

Member Author

I'll write a test so that I understand this better.

Member

I had the ADOL-C manual in hand when decomposing and writing these functions, so whenever you come across an unusual implementation I'd suggest consulting the manual. The requirements of the taping algorithm often dictate how code with branches has to be written in order to be compatible with taped AD. It would not surprise me if there is not a test for every single specialised function -- the implementation was probably "best effort".

@jppelteret (Member)

Thanks for working towards optimising all things related to the AD types, @drwells. I'll do my best to give you feedback for both this and your other PRs on the weekend.

@drwells (Member Author) commented Mar 25, 2023

These functions really need to be in the header for the reasons stated above (condassign() is used inside a template function). Furthermore, we don't gain much by trying to excise ADOL-C from this file: ADOL-C itself, when installed, only has about 6k total lines of header files. Hence, let's close this.
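
To illustrate the header point: a sketch, not the actual deal.II code. The function name max_column_sum, the array-reference signature, and the exact norm being computed (maximum column sum, in the spirit of the reviewed snippet) are assumptions.

```cpp
#include <adolc/adolc.h>

// Illustrative only: because this is a template, its body (and therefore
// the condassign() call) must be visible at every point of instantiation,
// which in practice means it has to live in a header.
template <int dim>
adouble
max_column_sum(const adouble (&t)[dim][dim])
{
  adouble max = 0.0;
  for (unsigned int j = 0; j < dim; ++j)
    {
      adouble sum = 0.0;
      for (unsigned int i = 0; i < dim; ++i)
        sum += fabs(t[i][j]); // ADOL-C provides fabs() for adouble

      condassign(max, sum - max, sum, max); // max = (sum > max) ? sum : max
    }
  return max;
}
```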

@drwells closed this on Mar 25, 2023