
[CORE] New compilation options #11790

Merged
roigcarlo merged 42 commits into master from core/new-compile-pipeline on Mar 18, 2024
Conversation

@roigcarlo (Member) commented Nov 10, 2023

📝 Description

This adds some new options for Kratos compilation and changes how the CI performs the compilation. Some of these changes should ease the process of converting Kratos and its applications into discoverable CMake "modules" (@mpentek).

This also allows us to fix the disk space problems in the CI/FullDebug.

List of new compilation options (see the example call after the list):

  • EXCLUDE_KRATOS_CORE=[ON/OFF]: If set to ON, the core is excluded from the compilation. Default: OFF.
  • EXCLUDE_AUTOMATIC_DEPENDENCIES=[ON/OFF]: If set to ON, CMake ignores the application dependencies specified in the CMake files of the applications. Default: OFF.
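
For illustration, a minimal sketch of a configure call that uses both options; the paths are placeholders, not taken from this PR:

```shell
# Hypothetical configure call using the two new options. KRATOS_SOURCE and
# KRATOS_BUILD are placeholder paths, not the actual CI variables.
KRATOS_SOURCE="${HOME}/Kratos"
KRATOS_BUILD="${KRATOS_SOURCE}/build"

# Skip rebuilding the core (it will be searched for in the default install
# directory) and ignore the dependency lists declared by the applications.
cmake -S "${KRATOS_SOURCE}" -B "${KRATOS_BUILD}" \
      -DEXCLUDE_KRATOS_CORE=ON \
      -DEXCLUDE_AUTOMATIC_DEPENDENCIES=ON
```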

List of changes made to the cmake files:

  • The Kratos core can now be included as a pre-compiled library, which will be searched for in the default install directory.
  • Trilinos can now be found even if the Trilinos application is not selected.
  • MappingMPIExtension was renamed to MeshTrilinosExtension, mainly because it relies on the EPETRA_EV vector. Note that with this change CoSimApplication is the only application with a purely non-Trilinos MPI extension.
  • Explicit Trilinos include directories were added to the extension projects.
  • The RomApplication now correctly adds the includes from Eigen/MKL/LinearSolvers.

List of changes in the CI:

  • Compilation has been layered into stages. At the end of each stage, the C++ artifacts are cleaned; with the new compilation options it is possible to keep the compiled libraries between stages (see the sketch after this list).
    • The first stage compiles the core and the direct core components (Linear Solvers, Metis and Trilinos).
    • The second stage compiles utilities (non-physics apps such as Mapping, Meshing, HDF5, MED, etc.).
    • The third stage compiles core applications (the list is not final).
    • The fourth stage compiles the rest.
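
The sketch below illustrates the staged idea; it is not the actual ci.yml. KRATOS_SOURCE, KRATOS_BUILD and KRATOS_APP_DIR are placeholder variables following the convention of the configure scripts quoted later in this thread, and the application groups are abbreviated:

```shell
# Stage 1: compile and install the core plus its direct components.
export KRATOS_APPLICATIONS="${KRATOS_APP_DIR}/LinearSolversApplication;${KRATOS_APP_DIR}/MetisApplication;${KRATOS_APP_DIR}/TrilinosApplication;"
cmake -S "${KRATOS_SOURCE}" -B "${KRATOS_BUILD}"
cmake --build "${KRATOS_BUILD}" --target install -- -j4
rm -rf "${KRATOS_BUILD}"   # clean the C++ artifacts; the installed libraries stay

# Stages 2-4: reuse the installed core instead of recompiling it, building one
# group of applications per stage and cleaning the build tree after each one.
export KRATOS_APPLICATIONS="${KRATOS_APP_DIR}/MappingApplication;${KRATOS_APP_DIR}/MeshingApplication;"
cmake -S "${KRATOS_SOURCE}" -B "${KRATOS_BUILD}" \
      -DEXCLUDE_KRATOS_CORE=ON -DEXCLUDE_AUTOMATIC_DEPENDENCIES=ON
cmake --build "${KRATOS_BUILD}" --target install -- -j4
rm -rf "${KRATOS_BUILD}"
```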

@loumalouomega (Member)

> 📝 Description
> Live testing for the new compilation options for linux
>
> @loumalouomega Do not approve or merge this, it's only for testing against the real CI.

It is not fair :(

@roigcarlo (Member, Author)

Closing as I finally managed to test the CI locally :D

@roigcarlo roigcarlo closed this Nov 17, 2023
@roigcarlo roigcarlo reopened this Feb 29, 2024
@roigcarlo roigcarlo marked this pull request as ready for review March 5, 2024 13:13
@roigcarlo roigcarlo requested review from a team as code owners March 5, 2024 13:13
@roigcarlo (Member, Author)

@ddiezrod @aaunnam, if you want, you can keep using the old mechanism in your CI: if you don't enable EXCLUDE_KRATOS_CORE=ON and EXCLUDE_AUTOMATIC_DEPENDENCIES=ON, there should be no change (same for the user scripts).

@loumalouomega (Member)

> pinging @aaunnam (@KratosMultiphysics/altair devops) just for you to be aware of this. We might need to check that this works in our CI.

I modified your comment because you were mentioning a random guy whose username is altair.

@roigcarlo (Member, Author)

ouch sorry

@roigcarlo roigcarlo requested a review from a team as a code owner March 8, 2024 09:30
@roigcarlo (Member, Author)

@rfaasse It seems that one of the problems with the nightly is due to the UPwSmallStrainInterfaceElement.

I don't want to merge this without having fixed all the pipelines, so I added a change in the GeoMechanicsApp as well.

As far as I can see, this problem comes from the fact that InterpolateOutputValues is templated but its definition is not visible where it is used, and there is no explicit instantiation for the container types, so depending on the instantiation order while compiling the files that use it, the symbol may end up unreferenced.

I've fixed it by explicitly instantiating them:

```cpp
// Explicit instantiations for the two container types used by the callers,
// so the symbols are emitted regardless of the instantiation order.
template void UPwSmallStrainInterfaceElement<2, 4>::InterpolateOutputValues<array_1d<double, 3>>(std::vector<array_1d<double, 3>>& rOutput, const std::vector<array_1d<double, 3>>& GPValues);
template void UPwSmallStrainInterfaceElement<3, 6>::InterpolateOutputValues<array_1d<double, 3>>(std::vector<array_1d<double, 3>>& rOutput, const std::vector<array_1d<double, 3>>& GPValues);
template void UPwSmallStrainInterfaceElement<3, 8>::InterpolateOutputValues<array_1d<double, 3>>(std::vector<array_1d<double, 3>>& rOutput, const std::vector<array_1d<double, 3>>& GPValues);

template void UPwSmallStrainInterfaceElement<2, 4>::InterpolateOutputValues<Matrix>(std::vector<Matrix>& rOutput, const std::vector<Matrix>& GPValues);
template void UPwSmallStrainInterfaceElement<3, 6>::InterpolateOutputValues<Matrix>(std::vector<Matrix>& rOutput, const std::vector<Matrix>& GPValues);
template void UPwSmallStrainInterfaceElement<3, 8>::InterpolateOutputValues<Matrix>(std::vector<Matrix>& rOutput, const std::vector<Matrix>& GPValues);
```

Is that OK with you?

@roigcarlo (Member, Author)

@philbucher It's alive! Finally.

@ddiezrod If you give me the green light from Altair and there is no other problem, we can merge 👍

@ddiezrod (Contributor) commented Mar 8, 2024

@aaunnam Can you please check if our CI passes with these changes?

@philbucher (Member)

Cool!
I will take a close look; I would appreciate it if you could hold off on merging.

@roigcarlo (Member, Author)

> Cool! I will take a close look; I would appreciate it if you could hold off on merging.

sure np

@roigcarlo (Member, Author) commented Mar 13, 2024

@philbucher, @aaunnam did you have time to look over the changes?

@loumalouomega (Member)

> @philbucher, @aaunnam did you have time to look over the changes?

A priori it does not break our CI.

@philbucher (Member) left a comment:

only preliminary comments, still need a bit more time

Review threads on .github/workflows/ci-dummy.yml and .github/workflows/ci.yml (resolved).
@philbucher (Member) left a comment:

Nice! Great job!
Only one small correction required.

I don't like the code duplication that is required (GitLab CI is soooo much nicer for this), but that's not for this PR. Unless you have an idea/solution for it.

Comment on lines +23 to +25:

```shell
add_app ${KRATOS_APP_DIR}/LinearSolversApplication;
add_app ${KRATOS_APP_DIR}/MetisApplication;
add_app ${KRATOS_APP_DIR}/TrilinosApplication;
```
@philbucher (Member):

shouldn't this be in dependencies?
configure_core.sh is a bit misleading otherwise

@roigcarlo (Member, Author) replied:

Yes... I agree, but for reasons I need those three to be compiled in the first stage no matter what; otherwise the list of dependencies and when to activate them becomes a mess. I may change the name of the script to reflect that :S

(basically, moving Trilinos and Linear Solvers retriggers the compilation of the core no matter what, so ....)

Review thread on .github/workflows/ci.yml (outdated, resolved).
@philbucher (Member) left a comment:

Let's gooo
Clap 👏🏻

Btw is this mainly to solve the space issue?
Is it faster or slower now?

And will you also do this for the other apps?

@roigcarlo (Member, Author)

It solves the space issue; no change in speed, I'm afraid, but since we use 4 cores by default maybe it's faster.

As for other apps, we can split further if we run out of space again; we can even create a single job for every app and compile selectively, if that's what you are asking :)

@roigcarlo roigcarlo changed the title [CORE] [Do not merge] Testing new compilation options [CORE] New compilation options Mar 18, 2024
@roigcarlo roigcarlo merged commit 3052722 into master Mar 18, 2024
15 checks passed
@roigcarlo roigcarlo deleted the core/new-compile-pipeline branch March 18, 2024 10:12