Telcon: 2024 03 27
Peter Scheibel edited this page Mar 27, 2024
Wednesday March 27th, 9am PT (UTC -7:00)
- Peter Scheibel (host)
- Davide DelVento
- Bernhard
- Jakov Petrina
(This week the meeting is for Q&A, there are no pre-planned topics)
- Davide: how to add -L flags to a package build?
- Peter: is this a case where you have a dependency with a directory that you want recognized as a library directory?
- If so, you can implement `.libs` in the dependency
- See: https://spack.readthedocs.io/en/latest/packaging_guide.html#blas-lapack-and-scalapack-libraries
- See: https://spack.readthedocs.io/en/latest/packaging_guide.html#custom-attributes
- Peter: neither of these fully explains that every file returned by `.libs` is automatically converted into a `-L` flag
- There are examples of `.libs` in the builtin packages (generally they use `find_libraries`)
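To illustrate the point above, here is a rough sketch of how the library files returned by `.libs` become `-L` search flags. The `LibraryList` class below is a simplified stand-in for Spack's internal one, and the paths are hypothetical:

```python
# Minimal sketch (not Spack's actual implementation): the parent
# directories of the files in a package's `.libs` are what end up
# as -L flags on the compiler command line.
import os


class LibraryList:
    def __init__(self, files):
        self.files = list(files)

    @property
    def directories(self):
        # Unique parent directories of the library files, in order
        seen, dirs = set(), []
        for f in self.files:
            d = os.path.dirname(f)
            if d not in seen:
                seen.add(d)
                dirs.append(d)
        return dirs

    @property
    def search_flags(self):
        # Each directory containing a returned library becomes a -L flag
        return " ".join("-L" + d for d in self.directories)


libs = LibraryList([
    "/opt/mylib/lib/libfoo.so",    # hypothetical install paths
    "/opt/mylib/lib64/libbar.so",
])
print(libs.search_flags)  # prints: -L/opt/mylib/lib -L/opt/mylib/lib64
```

In a real package, a `libs` property would typically call `find_libraries` against the package prefix and return the result; dependents then pick up the flags automatically.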
- Davide: in my case I needed to add `-D...` CMake options
- Could there be documentation in Spack about recommendations for this?
- e.g. how to influence CMake's `find_package`
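As a sketch of what passing `-D` options from a package might look like: `define` below is a simplified standalone version of the helper Spack's CMake package class provides (the real one emits typed cache entries like `-DVAR:BOOL=ON`); the variable names and prefix path are made up for illustration:

```python
# Simplified stand-in for the kind of helper Spack's CMake build
# system offers for formatting -D cache definitions.
def define(variable, value):
    if isinstance(value, bool):
        value = "ON" if value else "OFF"
    return "-D{}={}".format(variable, value)


# Flags like these, returned from a package's cmake_args(), can steer
# CMake's find_package() via CMAKE_PREFIX_PATH (paths hypothetical):
args = [
    define("CMAKE_PREFIX_PATH", "/opt/spack/opt/foo"),
    define("ENABLE_FOO", True),
]
print(args)
```

Putting a dependency's prefix on `CMAKE_PREFIX_PATH` is usually how `find_package` is pointed at the right installation.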
- Davide: there is no CI for testing on macOS
- Bernhard: CI is building the E4S environment
- Also, what about packages outside of this set?
- i.e. not every package that is changed is built
- Peter: yes - only packages mentioned in `share/spack/gitlab/cloud_pipelines` (or packages they depend on) are built as part of CI
- Can we dynamically build everything?
- Not every variant combination, but rather, one configuration of each package
- This could be time consuming
- Bernhard: maybe it would be less time-consuming to generate a build dynamically for a package PR
- Peter: one-off instances of builds can suffer from drift; also, users may not generally be willing to fix build problems with other users
- There is also a potential security issue w/ generating builds dynamically
- Difficult to create code coverage of PR based on this
- Davide: can we talk about the possibility of outside compute sites running pipelines and submitting results to PRs to verify that they work?
- Peter: e.g. this could push a PR to a site and they build on it (so instead of Spack running build from PR, we alert the external site to the PR's existence)
- Bernhard: Can we automate aspects of Python package review?
- Harmen is working on automatically generating the updates themselves, and that same logic might be useful for verifying
- Conda has Grayskull