
OpenHPC repos #37

Closed
lwilson opened this issue Mar 20, 2020 · 9 comments
Labels: enhancement (New feature or request), stale (This issue is in danger of being automatically closed)

lwilson (Collaborator) commented on Mar 20, 2020

Is your feature request related to a problem? Please describe.
Use OpenHPC RPM repositories to provide HPC libraries and tools

Describe the solution you'd like
Enable the OpenHPC RPM repos in /etc/yum.repos.d/

Describe alternatives you've considered
We could rebuild the packages ourselves, but that seems like duplicated work.

Additional context
N/A
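Assuming the deployment is driven by Ansible, as elsewhere in this project, enabling the repos by dropping a vendor-provided `.repo` file into `/etc/yum.repos.d/` could be sketched roughly as follows. The variable name and file path are illustrative placeholders, not final values:

```yaml
# Hypothetical sketch: fetch a vendor .repo file into /etc/yum.repos.d/.
# The openhpc_repo_url variable is a placeholder to be defined per release.
- name: Enable the OpenHPC RPM repository
  hosts: all
  become: true
  tasks:
    - name: Download the OpenHPC .repo file into /etc/yum.repos.d/
      ansible.builtin.get_url:
        url: "{{ openhpc_repo_url }}"   # placeholder, set per OpenHPC release
        dest: /etc/yum.repos.d/OpenHPC.repo
        mode: "0644"
```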

lwilson added the enhancement (New feature or request) label on Mar 20, 2020
lwilson self-assigned this and unassigned j0hnL on Jun 3, 2020
lwilson added this to the v0.2 milestone on Jun 3, 2020
lwilson (Collaborator, Author) commented on Jun 3, 2020

I'll take this one. I think we can integrate OpenHPC as a separate playbook.

Let's provide some documentation noting that it currently delivers good MPI performance on OFED-based networks, but does not yet have UCX-based support for RoCE/InfiniBand with Mellanox. This will be fixed in OpenHPC 2.0 (openhpc/ohpc#1203).

lwilson modified the milestone: v0.2 → v0.3 on Jun 3, 2020
lwilson (Collaborator, Author) commented on Jun 8, 2020

The OpenHPC 2.0 repos are now available on OBS (http://obs.openhpc.community:82/OpenHPC:/2.0:/Factory/).

It appears that all RPMs are built for CentOS 8. @j0hnL, should we consider a base OS version bump?

lwilson (Collaborator, Author) commented on Jun 29, 2020

The .repo file is at http://obs.openhpc.community:82/OpenHPC:/2.0:/Factory/CentOS_8/OpenHPC:2.0:Factory.repo

```ini
[OpenHPC_2.0_Factory]
name=Rolling development build (CentOS_8)
type=rpm-md
baseurl=http://obs.openhpc.community:82/OpenHPC:/2.0:/Factory/CentOS_8/
gpgcheck=1
gpgkey=http://obs.openhpc.community:82/OpenHPC:/2.0:/Factory/CentOS_8/repodata/repomd.xml.key
enabled=1
```
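The same repo definition could be expressed declaratively with Ansible's `yum_repository` module — a sketch mirroring the stanza above; the task name is illustrative:

```yaml
# Sketch: equivalent of the OpenHPC_2.0_Factory .repo stanza above,
# expressed as an Ansible yum_repository task.
- name: Add the OpenHPC 2.0 Factory repository
  ansible.builtin.yum_repository:
    name: OpenHPC_2.0_Factory
    description: Rolling development build (CentOS_8)
    baseurl: http://obs.openhpc.community:82/OpenHPC:/2.0:/Factory/CentOS_8/
    gpgcheck: true
    gpgkey: http://obs.openhpc.community:82/OpenHPC:/2.0:/Factory/CentOS_8/repodata/repomd.xml.key
    enabled: true
```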

lwilson (Collaborator, Author) commented on Oct 20, 2020

OpenHPC 2.0 is now officially released. The repo is available at: http://repos.openhpc.community/OpenHPC/2/CentOS_8/

lwilson removed this from the v0.3 milestone on Oct 29, 2020
stale bot commented on Apr 29, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label on Apr 29, 2021
j0hnL (Collaborator) commented on Apr 29, 2021

Rather than integrating the repos, we should look into using the OpenHPC container: https://github.com/openhpc/ohpc/tree/2.x/containers

stale bot removed the stale label on Apr 29, 2021
j0hnL added this to the v1.1.0 milestone on Apr 29, 2021
lwilson (Collaborator, Author) commented on May 3, 2021

Completely agree, @j0hnL. It also appears that we can take Slurm directly from EPEL: https://centos.pkgs.org/8/epel-x86_64/slurm-20.11.5-1.el8.x86_64.rpm.html

Since that is the case, I would suggest we use the EPEL version of Slurm, and then use Singularity (#300) to deploy NGC, oneContainer, and OpenHPC containers for user applications.
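Pulling Slurm from EPEL could look roughly like this in a playbook. This is a sketch: the exact set of Slurm packages to install is an assumption and would differ per node role (e.g. slurmctld on the head node, slurmd on compute nodes):

```yaml
# Sketch: install Slurm from EPEL instead of rebuilding it.
# Package names beyond "slurm" itself are assumptions.
- name: Install Slurm from EPEL
  hosts: all
  become: true
  tasks:
    - name: Enable the EPEL repository
      ansible.builtin.dnf:
        name: epel-release
        state: present

    - name: Install Slurm packages
      ansible.builtin.dnf:
        name:
          - slurm          # base package from EPEL
          - slurm-slurmd   # assumed compute-node daemon package
        state: present
```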

stale bot commented on Jul 2, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label on Jul 2, 2021
lwilson (Collaborator, Author) commented on Jul 6, 2021

@j0hnL I'm closing this ticket. Since we are deploying the OpenHPC application environment via container, I think we can consider this task complete.

lwilson closed this as completed on Jul 6, 2021

No branches or pull requests

2 participants