Update Uberenv and introduce Gitlab CI [feature/uberenv_ci] #502
Conversation
scripts/uberenv/project.json (outdated)

    "package_final_phase" : "configure",
    "package_source_dir" : "../..",
    "spack_url": "https://github.com/spack/spack",
    "spack_branch": "develop",
I know we need to use a newer Spack, but I am a bit worried about doing this without testing in a few places (NERSC).
I would also want it pinned to some version to avoid random failures when Spack develop moves on.
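Sketched against the project.json excerpt above, pinning could look like the fragment below. This assumes uberenv passes `spack_branch` straight to the clone (where a release tag is also accepted); treat the exact key semantics as an assumption, not documented behavior.

```json
{
  "spack_url": "https://github.com/spack/spack",
  "spack_branch": "v0.13.4"
}
```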
OK
I can create a fork in the alpine space that points to a newer spack
I am testing with the v0.13.4 tag. In addition to being a fixed point in the history, it also includes the pipeline feature. That means we can keep relying on it for a while without blocking new developments. (The combination of Spack pipelines with child pipelines in GitLab will allow cool things in the next weeks/months.)
v0.13.4 is in fact too old (it does not include the latest conduit).
So I will try using the merge commit of this branch into develop. Then we can either keep the commit in the project.json, or use a clone as in Ascent.
This looks cool!
Let's see how the tests go and I'll think about a path forward.
One option may be to merge to a special branch we can use to test against Spack develop continually.
@cyrush I re-introduced the conduit package in the project.json.
@cyrush I need to be added as a "developer" (I'm not sure of the terminology) to create the mirroring in GitLab.
I see -- I think my latest Spack package does have configure: https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/conduit/package.py -- can you see if that is compatible?
Great, I'll do the testing.
@adrienbernede was there some change that increased the output of the spack builds? The Travis jobs failed because they hit the max log file size.
##############################################################################
# Copyright (c) 2016-19, Lawrence Livermore National Security, LLC and Serac
# project contributors. See the COPYRIGHT file for details.
##############################################################################
switch to (2014-2020) and Conduit project contributors
Does this mean all the sources in conduit should switch to 2020, or is it done incrementally when a given file is modified?
Serac to Conduit: OK.
# Copyright (c) 2016-19, Lawrence Livermore National Security, LLC and Umpire
# project contributors. See the COPYRIGHT file for details.
#
# SPDX-License-Identifier: (MIT)
again, switch to (2014-2020) and Conduit project contributors
We found that Azure will tolerate log files of 0.5 GB...
@cyrush The reason is that dev-build is verbose by default on dependency builds. I will make it quiet (there's an option for it).
(force-pushed from 3dffca0 to f5e684e)
Is uberenv still copying the resulting host-config.cmake to the root uberenv_libs dir? (It seems that it is not, and that is what is causing the Travis CI error.)
I agree that this is what is happening. I am focusing on inconsistencies I have with the GitLab tests, but I will address that afterwards. I know the location changed several times when we worked on Serac; I want to recall the reasoning before taking any action. I also have an error related to the Python configuration.
Sounds good, I didn't catch that this had changed in uberenv. We do need the host-config file to exist in that location -- the primary way I configure when developing (across projects) involves using the file in that location.
I understand. As far as I remember, we still had at least a link there.
Yes -- a symlink is fine.
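The symlink idea above could look something like this minimal sketch: after `spack dev-build` generates the host-config in the build directory, expose it at the legacy location in the uberenv `--prefix`. The function name and paths are illustrative, not uberenv's actual code.

```python
import os


def link_host_config(build_dir, prefix, name="host-config.cmake"):
    """Symlink the generated host-config back into the uberenv prefix.

    Hypothetical helper: build_dir is where dev-build wrote the file,
    prefix is the directory passed to uberenv via --prefix.
    """
    src = os.path.abspath(os.path.join(build_dir, name))
    dest = os.path.join(prefix, name)
    # Replace any stale host-config (or dangling link) at the old location.
    if os.path.islink(dest) or os.path.exists(dest):
        os.remove(dest)
    os.symlink(src, dest)
    return dest
```

Tools that read the file from the prefix (like the CI scripts mentioned below) keep working, while the real file stays with the build tree.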
Now I remember:
In this case, what is the project root directory? Most of our existing CI infrastructure is going to look for it in the prior location. (Also true for Ascent.)
In fact, what is happening is:
I could copy the generated host-config file from
So dev-build builds in directories outside of the prefix passed to uberenv? If you want to share uberenv_libs dirs between CI builds and the host-config.cmake could conflict, could your CI move the generated host-config instead of changing the behavior of uberenv?
dev-build builds the dependencies in the prefix, but performs the build at the source location, either in a spack-build directory for CMake or in the source tree for Makefile projects. That makes sense, since the idea is to build with the current version of the sources. Is that an issue? Maybe because it breaks the rule "everything generated by Spack goes in uberenv_libs"? I will find a way to preserve the original behavior.
I see -- the host-config generation process (since it is part of the final package) will happen in a directory outside of the prefix. I didn't realize the behavior was changing. If at all possible, I'd like to preserve the old semantics of uberenv: the default mode creates a working set of dependencies and a host-config file in --prefix.
This is what I did in the last commit, with one detail:
Remaining decision: do you want to get rid of this
I also have to address a limitation: Spack will refuse to
Thanks for your efforts on this -- you are immersed in the details, but I admit I am a little lost with the changes. My hope is that we could use the newer Spack features without really changing much of anything related to the workflow of uberenv, including keeping any build artifacts limited to --prefix, etc. I don't have a good handle on the limitations of dev-build. Are there any aspects of it that might need to be addressed by the Spack folks before we should consider using it?
It's OK to deal with the details, and these are no big changes we are talking about. It's not a Spack limitation proper. And I wish it was already the case before:
Make spack build of dependencies quiet
Copy the host-config file into uberenv_libs in build mode
Remove spack-build directory
Remove existing host-config file in install mode
(force-pushed from 781a8e7 to 62ab9e9)
Update and clean branch
NEXT: allow capturing the output while printing it.
@cyrush with the latest changes, I'm addressing two things:
Notes on 1.: To do so, I had to change the behavior of the command executor sexe() in order to allow capturing the output while also displaying it.
Notes on 2.: Paradoxically, these new tests would fail since uberenv is currently newer in conduit than in the uberenv repo... The next step is to add "what if I update spack". At the moment I don't see any obstacle to introducing these tests, and my goal is to have them triggered only periodically and/or on changes in master.
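The sexe() change described in the note on 1. can be sketched roughly as below: stream each line of a subprocess's output to the console as it arrives while also keeping a copy. This is a hedged sketch of the general technique, not uberenv's actual implementation.

```python
import subprocess
import sys


def sexe(cmd):
    """Run a shell command, echoing output live while capturing it.

    Returns (returncode, captured_output). Illustrative signature only;
    the real sexe() in uberenv.py takes additional options.
    """
    proc = subprocess.Popen(cmd,
                            shell=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT,
                            universal_newlines=True)
    captured = []
    for line in proc.stdout:
        sys.stdout.write(line)   # display immediately
        captured.append(line)    # and keep a copy for later inspection
    proc.wait()
    return proc.returncode, "".join(captured)
```

The trade-off versus `subprocess.check_output` is that nothing is buffered invisibly: long Spack builds stay visible in the CI log while the script can still parse the result.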
Also: I still need dev access to LLNL/conduit in order to set up mirroring on LC.
@adrienbernede I added you to the Conduit Devs Team
For the uberenv check, one way we have handled things like this (in other projects) is to record a hash for the current version in a text file. Then when you update the script you also have to update the hash, or the test fails. This is a good strategy to prevent surprise updates, but it doesn't help you test and vet a new version of uberenv -- which I think is what you are aiming for? Like with Spack, I think we will be intentional about when we update uberenv. Staying up-to-date with the latest and greatest isn't necessary, since we can always easily and explicitly update and vet through our CI process.
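The hash-pinning strategy described above can be sketched in a few lines: record the SHA-256 of the vendored script in a text file, and have CI fail when the script changes without the recorded hash being updated. File names and function names here are illustrative, not from any of the projects mentioned.

```python
import hashlib


def file_sha256(path):
    """Return the hex SHA-256 digest of a file's contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def check_pinned(script_path, hash_file):
    """True if the script still matches its recorded hash.

    hash_file holds one line: the expected hex digest. A CI job would
    assert this and fail the build on a mismatch.
    """
    with open(hash_file) as f:
        expected = f.read().strip()
    return file_sha256(script_path) == expected
```

Updating the vendored script then becomes a deliberate two-file change (script plus recorded hash), which is exactly the "intentional update" property described above.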
LGTM |
(Previous comment is a sign-off to trigger GitLab CI) |
The goal here is not to force the adoption of the latest uberenv or Spack, but to make it possible to test Conduit with both of them. It would ease the "test and vet" phase and "document" how to do that. It also means I will be able to trigger such a test from the uberenv CI (or RADIUSS CI), to check whether the latest version of uberenv or Spack would break something in Conduit. Even if this isn't the best example, I think it's an interesting demonstrator, and it allows me to experiment with the best way to do this in our context. (By the way, if you feel the need for such a cross-project integration between two RADIUSS projects, just let me know. I would like to promote this for projects using submodules, like BLT.)
@adrienbernede that sounds good, knowing the answer to the "what if we use the latest version" question will be helpful.
Closed and transferred to #517.
In this PR I updated uberenv to the latest version (https://github.com/LLNL/uberenv/tree/master plus LLNL/uberenv#20) and introduced a very basic GitLab CI pipeline.
The CI simply compiles 4 conduit targets using uberenv.
I will also include LLNL/uberenv#19, which should help preserve the current workflow.
This PR is ready for review.