Massimiliano Culpo edited this page Dec 7, 2023 · 6 revisions

Wednesday December 6th, 9am PT (UTC -8:00)

Attendees

  • Massimiliano Culpo (host)
  • Nick Romero (Samsung)
  • Pariksheet Nanda
  • Jakov Petrina
  • Dom Heinzeller
  • Prentice Bisbal
  • Chris Green

Agenda

  • Nick: There's been previous talk about bootstrapping issues on RISC-V. The corresponding issue is https://github.com/spack/spack/issues/38622. The main problem ends up being the system Python looking in the wrong folder and failing to load the clingo native extension. If the problem really is clingo, would `pip install clingo` solve these issues?

    • Massimiliano: Yes, that would avoid the need for bootstrapping (if clingo is packaged for RISC-V). There is https://github.com/spack/spack/pull/40773, which tries to interpose a virtual environment between the actual Python being used and any Python package. That should help us treat tree structures in a more homogeneous way and avoid special-casing external Python executables that, e.g., come from a specific distro.
    • Nick: is CI passing on that PR?
    • Massimiliano: CI is passing, but the change has a wide impact on the code base, so we are trying to test it thoroughly.
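The "looking in the wrong folder" failure Nick describes can be checked directly from the interpreter Spack runs under. A minimal diagnostic sketch (the printed messages are mine, not Spack's):

```python
# Check whether this interpreter can locate the clingo module that
# Spack's concretizer needs, without actually importing its native extension.
import importlib.util

spec = importlib.util.find_spec("clingo")
if spec is None:
    # Nothing on sys.path provides clingo; a `pip install clingo`
    # into this interpreter may make Spack's bootstrapping unnecessary.
    print("clingo is not importable from this Python")
else:
    # spec.origin shows which folder the module would be loaded from.
    print(f"clingo would load from: {spec.origin}")
```

Running this with the same Python that launches Spack shows whether clingo resolves at all, and if so, from which directory, which is the information needed to tell a wrong-folder problem from a missing-wheel problem.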
  • Prentice: I'm building an environment with an external Python, and I end up with two different Python instances. Is there a way to avoid that? My environment is currently unify:false, but ideally it should be "unified per compiler being used".

    • Massimiliano: Using unify:false might lead to duplicates for certain packages, since each root spec is concretized on its own. We are currently considering more fine-grained control over unification than true/false, but the feature is not there yet. As a workaround you can either over-specify / pin specs, or use several unified environments and collect the common parts in YAML files that are included in each (this is a bit unwieldy).
    • Prentice: What if I build Python with my core compiler and use the caret (^) in my matrices to make that Python build a dependency? Will that cause an error if nothing depends on Python?
    • Massimiliano: Currently the behavior differs depending on how the environment is unified. We have a bug open on that: https://github.com/spack/spack/issues/40791. With unify:false you'll get into an inconsistent state where the environment will not error, but the dependency will still show up in your root spec.
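As a sketch of the pin-the-spec workaround mentioned above, an environment can declare the external Python once and reference it from every root spec. The version, prefix, and package names here are hypothetical:

```yaml
# spack.yaml sketch (assumed paths/versions): pin one external Python so
# each independently concretized root reuses it under unify:false.
spack:
  concretizer:
    unify: false
  packages:
    python:
      buildable: false        # never build a second Python
      externals:
      - spec: python@3.11.2   # hypothetical version
        prefix: /usr          # hypothetical install prefix
  specs:
  - py-numpy ^python@3.11.2   # caret pins the dependency explicitly
```

Marking the package `buildable: false` is what prevents the concretizer from quietly building a duplicate Python when a root's constraints don't match the external.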
  • Prentice: If I encounter a package that is out of date, or with wrong recipe, what is the best way to fix that / interact with maintainers?

    • Massimiliano: The best way, in my opinion, is to submit a PR. In most cases maintainers are responsive and helpful, and you can interact with them until the PR is merged. If a maintainer is silent, core devs or people with merge rights will step in after a grace period (usually ~1 week).
  • Chris: (Reported a CI error, `PermissionError: [Errno 13] Permission denied: '/home/software'`, that wasn't related to the changes in the PR. The report was forwarded to the GitLab CI team.)

  • Prentice: (Reported an issue with `spack config edit` failing to open a file that has YAML syntax errors; see https://github.com/spack/spack/issues/41470)

  • Pariksheet: Asked about package review / merge permissions. What do you expect people to know when handling these? Are there different teams for certain categories or languages of packages?

    • Massimiliano: This varies a lot depending on the maintainers and packages we're talking about. We have teams, in the GitHub sense, for reviewers with merge-to-develop permissions. Regular "maintainers" don't have merge permissions, but are asked for a review whenever the package they maintain has changes. What we usually expect from maintainers is some knowledge of Spack's domain-specific language for packages, plus domain knowledge of the package they maintain. There is no fixed workflow, so we have people signing up for a single package and people signing up for an entire ecosystem (like Adam J. Stewart with Python packages). Getting merge privileges to develop usually goes through Todd.
    • Chris Green: Speaking as a contributor, I add myself as a maintainer for packages I care about for work. I follow the same procedure that many people do. If I have time I'll look at it, otherwise I'll assume it's okay till I find out it's not.
    • Massimiliano: As core developers, we have no written rules or fixed procedures. The rule of thumb is always to give maintainers one week to look at PRs. But if, for any reason, a maintainer is not responsive, we proceed with the review and merge.
  • Dom: (Pointed to an interesting discussion of Python version "epochs": https://github.com/spack/spack/pull/41415#issuecomment-1843208278. The idea might also be interesting for Spack, for packages that change their version format.)

Items for next week (delayed again)

  • (delayed) Spack PR reviewer onboarding process:
    • Currently, maintainers are invited to get triage permissions on PRs
    • Do we have other mechanisms for adding reviewers?
    • What about for merge permissions?