
Conversation

@jithinjosepkl
Contributor

  • Set convertor pDesc and count in OPAL_CONVERTOR_PREPARE (including cases where count = 0)
  • Do opal_convertor_copy_and_prepare_for_send for buffered send mode as MCA_PML_CM_HVY_SEND_REQUEST_BSEND_ALLOC calls opal_convertor_pack directly
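The effect of the first bullet can be sketched as follows. This is an illustrative, self-contained model of the early-return path in a prepare routine; the struct fields and function names are simplified stand-ins, not the actual OPAL definitions:

```c
#include <assert.h>
#include <stddef.h>

/* Simplified stand-ins for opal_datatype_t / opal_convertor_t. */
typedef struct { size_t size; } my_datatype_t;
typedef struct {
    const my_datatype_t *pDesc;
    size_t count;
} my_convertor_t;

/* Before the fix: an early return on count == 0 (or empty datatype)
 * left pDesc/count unset. After the fix: record pDesc and count
 * first, then take the shortcut. */
static void my_convertor_prepare(my_convertor_t *conv,
                                 const my_datatype_t *dt, size_t count)
{
    conv->pDesc = dt;   /* always recorded, even for empty messages */
    conv->count = count;
    if (0 == count || 0 == dt->size) {
        return;         /* nothing to convert, but fields stay valid */
    }
    /* ... full preparation for non-empty data would go here ... */
}
```

The point is that callers inspecting the convertor afterwards see valid fields regardless of whether there was any data to prepare.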

@bosilca - Please review.

Set convertor pDesc and count in OPAL_CONVERTOR_PREPARE (including cases where count = 0)

Signed-off-by: Jithin Jose <jithin.jose@intel.com>
@mellanox-github

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/job/gh-ompi-master-pr/621/

Build Log
last 50 lines

[...truncated 13714 lines...]
pml_cm.h:227: error: (Each undeclared identifier is reported only once
pml_cm.h:227: error: for each function it appears in.)
In file included from pml_cm_request.c:21:
pml_cm.h: In function 'mca_pml_cm_isend_init':
pml_cm.h:227: error: implicit declaration of function 'MCA_PML_CM_HVY_SEND_REQUEST_INIT'
pml_cm.h:227: error: 'ompi_proc' undeclared (first use in this function)
pml_cm.h:227: error: (Each undeclared identifier is reported only once
pml_cm.h:227: error: for each function it appears in.)
pml_cm.h: In function 'mca_pml_cm_isend':
pml_cm.h:256: error: 'ompi_proc' undeclared (first use in this function)
pml_cm.h: In function 'mca_pml_cm_isend':
pml_cm.h:256: error: 'ompi_proc' undeclared (first use in this function)
In file included from pml_cm_sendreq.c:21:
pml_cm.h: In function 'mca_pml_cm_isend_init':
pml_cm.h:227: error: implicit declaration of function 'MCA_PML_CM_HVY_SEND_REQUEST_INIT'
pml_cm.h:227: error: 'ompi_proc' undeclared (first use in this function)
pml_cm.h:227: error: (Each undeclared identifier is reported only once
pml_cm.h:227: error: for each function it appears in.)
pml_cm.h: In function 'mca_pml_cm_isend':
pml_cm.h:256: error: 'ompi_proc' undeclared (first use in this function)
In file included from pml_cm_start.c:26:
pml_cm.h: In function 'mca_pml_cm_isend_init':
pml_cm.h:227: error: implicit declaration of function 'MCA_PML_CM_HVY_SEND_REQUEST_INIT'
pml_cm.h:227: error: 'ompi_proc' undeclared (first use in this function)
pml_cm.h:227: error: (Each undeclared identifier is reported only once
pml_cm.h:227: error: for each function it appears in.)
pml_cm.h: In function 'mca_pml_cm_isend':
pml_cm.h:256: error: 'ompi_proc' undeclared (first use in this function)
make[2]: *** [pml_cm_cancel.lo] Error 1
make[2]: *** Waiting for unfinished jobs....
make[2]: *** [pml_cm_component.lo] Error 1
make[2]: *** [pml_cm_request.lo] Error 1
make[2]: *** [pml_cm.lo] Error 1
make[2]: *** [pml_cm_start.lo] Error 1
make[2]: *** [pml_cm_recvreq.lo] Error 1
make[2]: *** [pml_cm_sendreq.lo] Error 1
make[2]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/ompi/mca/pml/cm'
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/ompi'
make: *** [install-recursive] Error 1
Build step 'Execute shell' marked build as failure
[htmlpublisher] Archiving HTML reports...
[htmlpublisher] Archiving at BUILD level /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/cov_build to /var/lib/jenkins/jobs/gh-ompi-master-pr/builds/621/htmlreports/Coverity_Report
Setting commit status on GitHub for https://github.com/open-mpi/ompi/commit/94d15be88feb99a5966f0db9ef8544d8a5f52357
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
Setting status of 8876976f83efc76cb79977b336ed0cecae4b5895 to FAILURE with url http://bgate.mellanox.com:8888/jenkins/job/gh-ompi-master-pr/621/ and message: 'Build finished.'
Using conext: Mellanox

@lanl-ompi
Contributor

Refer to this link for build results (access rights to CI server needed):
http://jenkins.open-mpi.org/job/ompi_master_pr_cle5.2up02/131/

Build Log
last 20 lines

[...truncated 9506 lines...]
pml_cm_sendreq.h:273:46: note: in definition of macro 'MCA_PML_CM_HVY_SEND_REQUEST_INIT'
                                              ompi_proc,                 \
                                              ^
pml_cm.h: In function 'mca_pml_cm_isend':
pml_cm.h:257:42: error: 'ompi_proc' undeclared (first use in this function)
                                          ompi_proc, 
                                          ^
pml_cm_sendreq.h:273:46: note: in definition of macro 'MCA_PML_CM_HVY_SEND_REQUEST_INIT'
                                              ompi_proc,                 \
                                              ^
make[2]: *** [pml_cm_component.lo] Error 1
cc1: some warnings being treated as errors
make[2]: *** [pml_cm_recvreq.lo] Error 1
make[2]: Leaving directory `/global/u2/h/hpp/jenkins_edison/workspace/ompi_master_pr_cle5.2up02/ompi/mca/pml/cm'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/global/u2/h/hpp/jenkins_edison/workspace/ompi_master_pr_cle5.2up02/ompi'
make: *** [all-recursive] Error 1
Build step 'Execute shell' marked build as failure
Setting status of 8876976f83efc76cb79977b336ed0cecae4b5895 to FAILURE with url http://jenkins.open-mpi.org/job/ompi_master_pr_cle5.2up02/131/ and message: Build finished.

Test FAILed.

@lanl-ompi
Contributor

Refer to this link for build results (access rights to CI server needed):
http://jenkins.open-mpi.org/job/ompi_master_pr_cle5.2up02/132/
Test PASSed.

@mellanox-github

Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/job/gh-ompi-master-pr/622/

@hppritcha
Member

bot:retest

@lanl-ompi
Contributor

Refer to this link for build results (access rights to CI server needed):
http://jenkins.open-mpi.org/job/ompi_master_pr_distcheck/41/
Test PASSed.

@lanl-ompi
Contributor

Refer to this link for build results (access rights to CI server needed):
http://jenkins.open-mpi.org/job/ompi_master_pr_cle5.2up02/134/
Test PASSed.

@jithinjosepkl
Contributor Author

@bosilca - the build error was due to a last-minute macro name change. Looks like the sanity tests are fine now.
Can you please take a look?

@bosilca
Member

bosilca commented Jun 11, 2015

Overall it looks OK. However, I wonder why you need the changes in the convertor, as if the local size is zero the convertor should not be used in any case.

@jithinjosepkl
Contributor Author

@bosilca - Thanks.

In ompi_mtl_datatype_pack, we check these two fields (datatype and count) when invoking opal_datatype_is_contiguous_memory_layout. Earlier, these fields were not set in the OPAL_CONVERTOR_PREPARE macro if (0 == count) || (0 == datatype->size).
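A rough, self-contained sketch of why those fields matter downstream. The names mirror the MTL pack path but are simplified stand-ins, not the real OMPI signatures; the check reads pDesc and count, so both must be valid even when count == 0:

```c
#include <assert.h>
#include <stddef.h>

/* Simplified stand-ins for the OPAL datatype/convertor structs. */
typedef struct { size_t size; int contiguous; } my_datatype_t;
typedef struct { const my_datatype_t *pDesc; size_t count; } my_convertor_t;

/* Illustrative version of a contiguity test on a prepared convertor.
 * It dereferences pDesc, so the prepare path must have set it even
 * for zero-count messages. */
static int my_is_contiguous(const my_convertor_t *conv)
{
    return NULL != conv->pDesc
        && (0 == conv->count || conv->pDesc->contiguous);
}
```

With the old behavior, pDesc could be garbage for count == 0, and a check like this would read an uninitialized pointer.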

@hppritcha
Member

If you want this merged into master before the fork, it should be checked in today or tomorrow at the latest.

@jithinjosepkl
Contributor Author

@hppritcha - Thanks for the reminder. Yes I would like to.
@bosilca - Any further comments?

@bosilca
Member

bosilca commented Jun 15, 2015

A convertor without anything to do is marked with the special flag COMPLETED. You might want to use that as a marker for the fact that the MTL has nothing to do. Otherwise I have no other comments.
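The suggested use of the COMPLETED flag could look roughly like this. The flag value and struct here are illustrative stand-ins (the real convertor keeps a flags bitfield with its own constants):

```c
#include <assert.h>
#include <stdint.h>

#define MY_CONVERTOR_COMPLETED 0x1u  /* stand-in for the real flag value */

/* Simplified stand-in for opal_convertor_t's flags field. */
typedef struct { uint32_t flags; } my_convertor_t;

/* The MTL could skip packing entirely when the convertor was marked
 * COMPLETED at prepare time, i.e. there is nothing to do for an
 * empty message. */
static int my_mtl_has_work(const my_convertor_t *conv)
{
    return 0 == (conv->flags & MY_CONVERTOR_COMPLETED);
}
```

This avoids probing datatype/count at all in the MTL and leans on the state the convertor already tracks.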

Do opal_convertor_copy_and_prepare_for_send for buffered send mode as MCA_PML_CM_HVY_SEND_REQUEST_BSEND_ALLOC calls opal_convertor_pack directly.

Signed-off-by: Jithin Jose <jithin.jose@intel.com>
@jithinjosepkl
Contributor Author

bot:retest

@rhc54
Contributor

rhc54 commented Jun 16, 2015

bot:retest

rhc54 pushed a commit that referenced this pull request Jun 16, 2015
@rhc54 rhc54 merged commit 9a8bda0 into open-mpi:master Jun 16, 2015
jsquyres added a commit to jsquyres/ompi that referenced this pull request Nov 10, 2015