Conversation

@hppritcha

Implement an almost-do-nothing alps oob component.
When using aprun to launch a job on a Cray system,
there is no need for an oob component, since ompi
relies on Cray PMI for oob communication.

Fixes #484

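For context, the shape of such a component is simple: it registers itself with the OOB framework but implements no real transport, because Cray PMI already provides the out-of-band wireup when the job is launched by aprun. The sketch below illustrates that shape only; the struct and function names are placeholders, not the actual ORTE MCA interface (the real code lives in orte/mca/oob/alps/oob_alps_component.c, as the build log below shows).

```c
/*
 * Illustrative sketch only -- NOT the real ORTE OOB MCA interface.
 * The point of the alps component: under aprun, Cray PMI already
 * provides out-of-band wireup, so the component can be selected on
 * Cray systems while doing essentially nothing itself.
 */
#include <stdbool.h>
#include <stdlib.h>

typedef struct {
    const char *name;
    bool (*available)(void);  /* should this component be selected? */
    int  (*startup)(void);    /* no-op: nothing to initialize */
    void (*shutdown)(void);   /* no-op: nothing to tear down */
} oob_component_sketch_t;

static bool alps_available(void)
{
    /* Placeholder check: claim availability only when running under
     * aprun (ALPS typically exports ALPS_APP_DEPTH to launched
     * processes). */
    return getenv("ALPS_APP_DEPTH") != NULL;
}

static int  alps_startup(void)  { return 0; }  /* nothing to do */
static void alps_shutdown(void) { }            /* nothing to do */

const oob_component_sketch_t oob_alps_sketch = {
    .name      = "alps",
    .available = alps_available,
    .startup   = alps_startup,
    .shutdown  = alps_shutdown,
};
```

Presumably the point of registering even a do-nothing component is that the OOB framework still has something to select on Cray systems, which is what issue #484 was about.
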
@mellanox-github

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/371/

Build Log (last 50 lines):

[...truncated 8292 lines...]
make[3]: Entering directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/orte/mca/odls/default'
make[3]: Nothing to be done for `install-exec-am'.
 /bin/mkdir -p '/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/share/openmpi'
 /usr/bin/install -c -m 644 help-orte-odls-default.txt '/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/share/openmpi'
 /bin/mkdir -p '/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/lib/openmpi'
 /bin/sh ../../../../libtool   --mode=install /usr/bin/install -c   mca_odls_default.la '/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/lib/openmpi'
libtool: install: /usr/bin/install -c .libs/mca_odls_default.so /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/lib/openmpi/mca_odls_default.so
libtool: install: /usr/bin/install -c .libs/mca_odls_default.lai /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/lib/openmpi/mca_odls_default.la
libtool: finish: PATH="/hpc/local/bin::/usr/local/bin:/bin:/usr/bin:/usr/sbin:/hpc/local/bin:/hpc/local/bin/:/hpc/local/bin/:/sbin:/usr/sbin:/bin:/usr/bin:/usr/local/sbin:/opt/ibutils/bin:/sbin" ldconfig -n /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/lib/openmpi
----------------------------------------------------------------------
Libraries have been installed in:
   /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/lib/openmpi

If you ever happen to want to link against installed libraries
in a given directory, LIBDIR, you must either use libtool, and
specify the full pathname of the library, or use the `-LLIBDIR'
flag during linking and do at least one of the following:
   - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
     during execution
   - add LIBDIR to the `LD_RUN_PATH' environment variable
     during linking
   - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
   - have your system administrator add LIBDIR to `/etc/ld.so.conf'

See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[3]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/orte/mca/odls/default'
make[2]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/orte/mca/odls/default'
Making install in mca/oob/alps
make[2]: Entering directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/orte/mca/oob/alps'
make[2]: *** No rule to make target `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/orte/mca/common/alps/libmca_common_alps.la', needed by `mca_oob_alps.la'.  Stop.
make[2]: *** Waiting for unfinished jobs....
  CC       oob_alps_component.lo
make[2]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/orte/mca/oob/alps'
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/orte'
make: *** [install-recursive] Error 1
Build step 'Execute shell' marked build as failure
TAP Reports Processing: START
Looking for TAP results report in workspace using pattern: **/*.tap
Did not find any matching files.
Anchor chain: could not read file with links: /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/jenkins_sidelinks.txt (No such file or directory)
[copy-to-slave] The build is taking place on the master node, no copy back to the master will take place.
Setting commit status on GitHub for https://github.com/open-mpi/ompi/commit/44be995bf03b0a0d39a4bfd73b9625dd7e5fe395
[BFA] Scanning build for known causes...

[BFA] Done. 0s
Setting status of b1f31a436413649ffeee8e1e3593d90613aa054d to FAILURE with url http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr/371/ and message: Merged build finished.

Test FAILed.

Have to have alps rpms installed on a system
for the alps component to build, even if separated
by a level of indirection.

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
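
The "level of indirection" here is the shared orte/mca/common/alps library (the libmca_common_alps.la the earlier build failed to find): the oob component does not call Cray's ALPS libraries itself, but it links against that common layer, which does, so the alps RPMs still have to be present at build time. A rough sketch of that layering, with invented names (common_alps_launched_by_aprun is not a real Open MPI function):

```c
/*
 * Hypothetical sketch of the layering only; every name here is invented.
 * The oob/alps component sits one step away from Cray's ALPS libraries:
 * it calls the shared common/alps code (libmca_common_alps), and that
 * library is what actually links against the ALPS libraries shipped in
 * the alps RPMs -- which is why they must be installed just to build it.
 */
#include <stdbool.h>
#include <stdio.h>

/* Stand-in for the common/alps layer. */
static bool common_alps_launched_by_aprun(void)
{
    /* Stub: the real common/alps code would query ALPS through its
     * libraries (the reason the alps RPMs are a build requirement). */
    return true;
}

/* The oob component consumes the common layer's answer and nothing more. */
static bool oob_alps_component_available(void)
{
    return common_alps_launched_by_aprun();
}

int main(void)
{
    printf("oob/alps would %sbe selected here\n",
           oob_alps_component_available() ? "" : "not ");
    return 0;
}
```
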
@mellanox-github

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/373/
Test PASSed.

hppritcha added a commit that referenced this pull request Mar 20, 2015
orte/oob: implement alps oob component
@hppritcha hppritcha merged commit 990e9b4 into open-mpi:master Mar 20, 2015
@hppritcha hppritcha deleted the topic/issue_484 branch April 17, 2015 17:53
jsquyres pushed a commit to jsquyres/ompi that referenced this pull request Nov 10, 2015