
Conversation

@hppritcha
Member

This commit fixes things broken by commit ea35e47.

Fixes #616

Signed-off-by: Howard Pritchard <howardp@lanl.gov>

@lanl-ompi
Contributor

Refer to this link for build results (access rights to CI server needed):
http://jenkins.open-mpi.org/job/hopper_ompi_master/75/

Build Log
last 20 lines

[...truncated 5212 lines...]
ln: failed to create symbolic link `pwin_attach_f.c': File exists
ln: failed to create symbolic link `pwin_detach_f.c': File exists
make[3]: *** [pwin_attach_f.c] Error 1
make[3]: *** Waiting for unfinished jobs....
make[3]: *** [pwin_detach_f.c] Error 1
ln: failed to create symbolic link `pwin_create_dynamic_f.c': File exists
make[3]: *** [pwin_create_dynamic_f.c] Error 1
  LN_S     pwin_get_info_f.c
ln: failed to create symbolic link `pwin_get_info_f.c': File exists
make[3]: *** [pwin_get_info_f.c] Error 1
make[3]: Leaving directory `/global/u2/h/hpp/jenkins_hopper/workspace/hopper_ompi_master/ompi/mpi/fortran/mpif-h/profile'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/global/u2/h/hpp/jenkins_hopper/workspace/hopper_ompi_master/ompi/mpi/fortran/mpif-h'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/global/u2/h/hpp/jenkins_hopper/workspace/hopper_ompi_master/ompi'
make: *** [all-recursive] Error 1
Build step 'Execute shell' marked build as failure
GCM: Sending notification to: hpp
Setting status of 05325b113ef9db34a2fc9cd83c94f4aaf0cef81f to FAILURE with url http://jenkins.open-mpi.org/job/hopper_ompi_master/75/ and message: Build finished.

Test FAILed.
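The `File exists` failures in the log above are a stale-workspace symptom: a previous run had already created the `pwin_*_f.c` symlinks, and a plain `ln -s` refuses to overwrite an existing link, so every rerun of the `LN_S` rule fails. A minimal sketch of both the failure and the idempotent `-sf` form (the file names are taken from the log; the demo directory and stub content are illustrative assumptions, not ompi's build rules):

```shell
# Demonstrates why rerunning a plain "ln -s" rule fails in a dirty
# workspace, and the "-sf" form that replaces an existing link instead.
set -e
dir=/tmp/lns_demo
rm -rf "$dir" && mkdir -p "$dir" && cd "$dir"
echo '/* stub source */' > win_attach_f.c

ln -s win_attach_f.c pwin_attach_f.c          # first run: link created
ln -s win_attach_f.c pwin_attach_f.c 2>/dev/null \
    || echo "second run fails: File exists"   # plain -s errors on rerun
ln -sf win_attach_f.c pwin_attach_f.c         # -sf: replace, never fail
readlink pwin_attach_f.c
```

Run twice in a row, the `-sf` variant keeps succeeding, which is exactly what a `make` rerun over a half-built tree needs.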

@mellanox-github

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/579/

Build Log
last 50 lines

[...truncated 12037 lines...]
  CC       punpack_external_f.lo
  CC       punpack_f.lo
  CC       punpublish_name_f.lo
  CC       pwaitall_f.lo
  CC       pwaitany_f.lo
  CC       pwait_f.lo
  CC       pwaitsome_f.lo
  CC       pwtick_f.lo
  CC       pwtime_f.lo
  CC       paccumulate_f.lo
  CC       praccumulate_f.lo
  CC       pget_f.lo
  CC       prget_f.lo
  CC       pget_accumulate_f.lo
  CC       prget_accumulate_f.lo
  CC       pput_f.lo
  CC       prput_f.lo
  CC       pcompare_and_swap_f.lo
  CC       pfetch_and_op_f.lo
  CC       pwin_allocate_f.lo
  CC       pwin_allocate_shared_f.lo
  CC       pwin_attach_f.lo
  CC       pwin_call_errhandler_f.lo
  CC       pwin_complete_f.lo
  CC       pwin_create_dynamic_f.lo
  CC       pwin_create_errhandler_f.lo
  CC       pwin_create_f.lo
  CC       pwin_create_keyval_f.lo
gcc: pwin_attach_f.c: No such file or directory
gcc: no input files
make[3]: *** [pwin_attach_f.lo] Error 1
make[3]: *** Waiting for unfinished jobs....
gcc: pwin_create_dynamic_f.c: No such file or directory
gcc: no input files
make[3]: *** [pwin_create_dynamic_f.lo] Error 1
make[3]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/ompi/mpi/fortran/mpif-h/profile'
make[2]: *** [install-recursive] Error 1
make[2]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/ompi/mpi/fortran/mpif-h'
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/ompi'
make: *** [install-recursive] Error 1
Build step 'Execute shell' marked build as failure
[htmlpublisher] Archiving HTML reports...
[htmlpublisher] Archiving at BUILD level /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/cov_build to /var/lib/jenkins/jobs/gh-ompi-master-pr/builds/579/htmlreports/Coverity_Report
Setting commit status on GitHub for https://github.com/open-mpi/ompi/commit/65d7918eb4837e2d62366389230113cbb67b6df3
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Setting status of 05325b113ef9db34a2fc9cd83c94f4aaf0cef81f to FAILURE with url http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr/579/ and message: Build finished.

Test FAILed.

@mike-dubman
Member

@hppritcha - how do you set name "LANL - Cray XC" for jenkins? :)

@rhc54
Contributor

rhc54 commented Jun 2, 2015

This has nothing to do with the Cray - the master is broken for everyone:

gcc: error: pwin_attach_f.c: No such file or directory
gcc: fatal error: no input files
compilation terminated.
make[3]: *** [pwin_attach_f.lo] Error 1
make[3]: *** Waiting for unfinished jobs....
gcc: error: pwin_create_dynamic_f.c: No such file or directory
gcc: error: pwin_detach_f.c: No such file or directory
gcc: fatal error: no input files
compilation terminated.
gcc: fatal error: no input files
compilation terminated.
make[3]: *** [pwin_detach_f.lo] Error 1
make[3]: *** [pwin_create_dynamic_f.lo] Error 1
gcc: error: pwin_get_info_f.c: No such file or directory
gcc: fatal error: no input files
compilation terminated.
make[3]: *** [pwin_get_info_f.lo] Error 1
make[2]: *** [all-recursive] Error 1
make[1]: *** [all-recursive] Error 1

@rhc54
Contributor

rhc54 commented Jun 2, 2015

Looks like @ggouaillardet forgot to "git add" the new files :-(
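A cheap guard against exactly this slip is to check for untracked sources before pushing. A sketch, assuming a plain git checkout (the repo path and file name below are illustrative, not part of any ompi workflow):

```shell
# Flags untracked files before a push: "git status --porcelain" marks
# them with a "??" prefix, so any match is a likely forgotten "git add".
set -e
repo=/tmp/git_add_demo
rm -rf "$repo" && mkdir -p "$repo" && cd "$repo"
git init -q .

echo 'void pwin_attach_f(void);' > pwin_attach_f.c   # created, never added
untracked=$(git status --porcelain | grep '^??' || true)
if [ -n "$untracked" ]; then
    echo "possibly forgotten files:"
    echo "$untracked"
fi
```

Wired into a pre-push hook, this would have flagged the missing `pwin_*_f.c` sources before CI ever saw the commit.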

@hppritcha
Member Author

bot:retest

1 similar comment
@hppritcha
Member Author

bot:retest

@hppritcha
Member Author

2015-06-02 6:32 GMT-06:00 Mike Dubman notifications@github.com:

@hppritcha https://github.com/hppritcha - how do you set name "LANL -
Cray XC" for jenkins? :)

In the "Commit Status Context" field of the github pull request builder
for the project being used to smoke test ompi-master/ompi PRs,
fill in something that describes what you'd like to appear in the
status field for the PR.

If it helps, we could add Eugene to the admins for the openmpi
jenkins server at IU.

I accidentally had two github pull request builder projects going.
Right now I've restricted testing to a whitelist with my name only,
but I'll add members of the group today.

Of course, I didn't find out about the brokenness of the ompi-on-Cray
problem this way, since the relevant commit was checked
directly in to master. Rather, I found out while on travel in
Berlin, thanks to my android app, which is wired up to send
me a notification when someone has committed something
that breaks the build on NERSC cray systems.

For that issue, I'm setting up a second project to build after
commits and then send a mail to the author of the commit
if the build on edison fails. That would also have caught
the current problem and at least sent the developer an
email notification.

Reply to this email directly or view it on GitHub
#617 (comment).

@lanl-ompi
Contributor

Refer to this link for build results (access rights to CI server needed):
http://jenkins.open-mpi.org/job/hopper_ompi_master/76/

Build Log
last 20 lines

[...truncated 5174 lines...]
  LN_S     pwin_get_info_f.c
ln: failed to create symbolic link `pwin_attach_f.c': File exists
make[3]: *** [pwin_attach_f.c] Error 1
make[3]: *** Waiting for unfinished jobs....
ln: failed to create symbolic link `pwin_create_dynamic_f.c': File exists
make[3]: *** [pwin_create_dynamic_f.c] Error 1
ln: failed to create symbolic link `pwin_get_info_f.c': File exists
ln: failed to create symbolic link `pwin_detach_f.c': File exists
make[3]: *** [pwin_get_info_f.c] Error 1
make[3]: *** [pwin_detach_f.c] Error 1
make[3]: Leaving directory `/global/u2/h/hpp/jenkins_hopper/workspace/hopper_ompi_master/ompi/mpi/fortran/mpif-h/profile'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/global/u2/h/hpp/jenkins_hopper/workspace/hopper_ompi_master/ompi/mpi/fortran/mpif-h'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/global/u2/h/hpp/jenkins_hopper/workspace/hopper_ompi_master/ompi'
make: *** [all-recursive] Error 1
Build step 'Execute shell' marked build as failure
GCM: Sending notification to: hpp
Setting status of 05325b113ef9db34a2fc9cd83c94f4aaf0cef81f to FAILURE with url http://jenkins.open-mpi.org/job/hopper_ompi_master/76/ and message: Build finished.

Test FAILed.

@mellanox-github

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/580/

Build Log
last 50 lines

[...truncated 12035 lines...]
  CC       ptype_ub_f.lo
  CC       ptype_vector_f.lo
  CC       punpack_external_f.lo
  CC       punpack_f.lo
  CC       punpublish_name_f.lo
  CC       pwaitall_f.lo
  CC       pwaitany_f.lo
  CC       pwait_f.lo
  CC       pwaitsome_f.lo
  CC       pwtick_f.lo
  CC       pwtime_f.lo
  CC       paccumulate_f.lo
  CC       praccumulate_f.lo
  CC       pget_f.lo
  CC       prget_f.lo
  CC       pget_accumulate_f.lo
  CC       prget_accumulate_f.lo
  CC       pput_f.lo
  CC       prput_f.lo
  CC       pcompare_and_swap_f.lo
  CC       pfetch_and_op_f.lo
  CC       pwin_allocate_f.lo
  CC       pwin_allocate_shared_f.lo
  CC       pwin_attach_f.lo
  CC       pwin_call_errhandler_f.lo
  CC       pwin_complete_f.lo
  CC       pwin_create_dynamic_f.lo
  CC       pwin_create_errhandler_f.lo
gcc: pwin_attach_f.c: No such file or directory
gcc: no input files
make[3]: *** [pwin_attach_f.lo] Error 1
make[3]: *** Waiting for unfinished jobs....
gcc: pwin_create_dynamic_f.c: No such file or directory
gcc: no input files
make[3]: *** [pwin_create_dynamic_f.lo] Error 1
make[3]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/ompi/mpi/fortran/mpif-h/profile'
make[2]: *** [install-recursive] Error 1
make[2]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/ompi/mpi/fortran/mpif-h'
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/ompi'
make: *** [install-recursive] Error 1
Build step 'Execute shell' marked build as failure
[htmlpublisher] Archiving HTML reports...
[htmlpublisher] Archiving at BUILD level /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/cov_build to /var/lib/jenkins/jobs/gh-ompi-master-pr/builds/580/htmlreports/Coverity_Report
Setting commit status on GitHub for https://github.com/open-mpi/ompi/commit/65d7918eb4837e2d62366389230113cbb67b6df3
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Setting status of 05325b113ef9db34a2fc9cd83c94f4aaf0cef81f to FAILURE with url http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr/580/ and message: Build finished.

Test FAILed.

@hppritcha
Member Author

one more test to check jenkins setup.
bot:retest

@lanl-ompi
Contributor

Refer to this link for build results (access rights to CI server needed):
http://jenkins.open-mpi.org/job/hopper_ompi_master/77/

Build Log
last 20 lines

[...truncated 4783 lines...]
  CC       libmca_common_verbs_la-common_verbs_basics.lo
  CC       libmca_common_verbs_la-common_verbs_fake.lo
  CC       libmca_common_verbs_la-common_verbs_devlist.lo
  CC       libmca_common_verbs_la-common_verbs_find_max_inline.lo
  CC       libmca_common_verbs_la-common_verbs_find_ports.lo
  CC       libmca_common_verbs_la-common_verbs_mca.lo
  CC       libmca_common_verbs_la-common_verbs_port.lo
  CC       libmca_common_verbs_la-common_verbs_qp_type.lo
  LN_S     libmca_common_verbs.la
  CCLD     libmca_common_verbs.la
gcc: /usr/lib64/libosmcomp.so: No such file or directory
make[2]: *** [libmca_common_verbs.la] Error 1
make[2]: Leaving directory `/global/u2/h/hpp/jenkins_hopper/workspace/hopper_ompi_master/opal/mca/common/verbs'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/global/u2/h/hpp/jenkins_hopper/workspace/hopper_ompi_master/opal'
make: *** [all-recursive] Error 1
Build step 'Execute shell' marked build as failure
GCM: Sending notification to: hpp
Setting status of 05325b113ef9db34a2fc9cd83c94f4aaf0cef81f to FAILURE with url http://jenkins.open-mpi.org/job/hopper_ompi_master/77/ and message: Build finished.

Test FAILed.

@ggouaillardet
Contributor

Oops!
I did indeed forget to git add some files; I am very sorry about that.
Being AFK, I will fix this tomorrow. Feel free to revert my commits in the meantime.

@lanl-ompi
Contributor

Refer to this link for build results (access rights to CI server needed):
http://jenkins.open-mpi.org/job/ompi_master_pr_cle5.2up02/78/

Build Log
last 20 lines

[...truncated 5679 lines...]
/opt/cray/alps/5.2.3-2.0502.9295.14.14.ari/include/alps/libalpslli.h:21:2: warning: #ident is a GCC extension
 #ident "$Id: libalpslli.h 9100 2014-10-13 20:41:14Z jhn $"
  ^
odls_alps_module.c: In function 'orte_odls_alps_launch_local_procs':
odls_alps_module.c:756:5: error: implicit declaration of function 'pmix_server_create_shared_segment' [-Werror=implicit-function-declaration]
     attr = pmix_server_create_shared_segment(job);
     ^
odls_alps_module.c:756:10: warning: assignment makes pointer from integer without a cast
     attr = pmix_server_create_shared_segment(job);
          ^
cc1: some warnings being treated as errors
make[2]: *** [mca_odls_alps_la-odls_alps_module.lo] Error 1
make[2]: Leaving directory `/global/u2/h/hpp/jenkins_edison/workspace/ompi_master_pr_cle5.2up02/orte/mca/odls/alps'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/global/u2/h/hpp/jenkins_edison/workspace/ompi_master_pr_cle5.2up02/orte'
make: *** [all-recursive] Error 1
Build step 'Execute shell' marked build as failure
GCM: Sending notification to: hpp
Setting status of 05325b113ef9db34a2fc9cd83c94f4aaf0cef81f to FAILURE with url http://jenkins.open-mpi.org/job/ompi_master_pr_cle5.2up02/78/ and message: Build finished.

Test FAILed.

@hppritcha
Member Author

Closing this PR and retrying to make sure jenkins at IU is working like we want.

@hppritcha hppritcha closed this Jun 2, 2015
@hppritcha hppritcha deleted the topic/fix_busted_cray_build branch October 15, 2015 17:52
jsquyres added a commit to jsquyres/ompi that referenced this pull request Nov 10, 2015
…oc_complete_init

ompi_proc_complete_init: always reset u16ptr

Development

Successfully merging this pull request may close these issues.

commit ea35e47 breaks cray build
