Conversation

@bosilca (Member) commented Feb 23, 2016

Add support for packing into external32 format.

The datatype must satisfy the same constraints as for the
corresponding communication function (send for pack and
recv for unpack). Add a pack function allowing to
provide send conversion (needed on little-endian machines in
order to pack in the external32 format).
@lanl-ompi (Contributor)

Test FAILed.

@ggouaillardet (Contributor)

@bosilca i will check that tomorrow.
note: make check fails, did you forget to push something?

about make check: you only test the effect of pack/unpack.
since external32 is a portable format, shouldn't you hardcode the value expected to be packed
(e.g. check it is written in big endian) and/or unpack some hard-coded big-endian bytes?

about MPI_Pack on a heterogeneous cluster: should it behave like MPI_Pack_external,
so the result can be MPI_Send(..., MPI_PACKED, ...)'d to any arch?
that could help optimize a collective algorithm we discussed years ago.

Use htonl and htons to check that the packed data is
indeed the correct data.
@lanl-ompi (Contributor)

Test FAILed.

@ggouaillardet (Contributor)

@bosilca test/datatype/external32 is an MPI application, so it cannot be used in make check unless you previously did a make install (and you might have to configure with --enable-mpirun-prefix-by-default or have the installed Open MPI in your LD_LIBRARY_PATH too).

one option is to only build this test; another is to make it a non-MPI application (such as unpack_ooo).

i'll give this a try when i get some time.

@bosilca (Member, Author) commented Feb 24, 2016

@ggouaillardet the Travis failure was unrelated to this.

I added a validation of the packed data using hton[sl], and this seems to work.

MPI_Pack is different from MPI_Pack_external in the sense that MPI_Pack is allowed to prepend the buffer.

@ggouaillardet (Contributor)

@bosilca ok for MPI_Pack vs MPI_Pack_external

i disagree about the travis issue. i added some logging; here is the output:

--------------------------------------------------------------------------
Sorry!  You were supposed to get help about:
    opal_init:startup:internal-failure
But I couldn't open the help file:
    /home/travis/bogus/share/openmpi/help-opal-runtime.txt: No such file or directory.  Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry!  You were supposed to get help about:
    orte_init:startup:internal-failure
But I couldn't open the help file:
    /home/travis/bogus/share/openmpi/help-orte-runtime: No such file or directory.  Sorry!
--------------------------------------------------------------------------
--------------------------------------------------------------------------
Sorry!  You were supposed to get help about:
    mpi_init:startup:internal-failure
But I couldn't open the help file:
    /home/travis/bogus/share/openmpi/help-mpi-runtime.txt: No such file or directory.  Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[testing-gce-7f419fa5-ad7b-4144-80a5-e48b29711dc1:28217] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
1

that might not occur if you make install and have your LD_LIBRARY_PATH set, and/or you configure'd with --disable-dlopen.
i made #1399 to double check that.
my changes are in ggouaillardet@4bdf917

@ggouaillardet (Contributor)

@bosilca this is the updated and working commit ggouaillardet@f87f75a

travis url is at https://travis-ci.org/open-mpi/ompi/builds/111402637

feel free to close #1399 at any time

@bosilca (Member, Author) commented Mar 29, 2016

@ggouaillardet I cherry-picked your 2 commits ggouaillardet/ompi@f87f75a and ggouaillardet/ompi@7279472. We might want to close the PR #1399 and focus here instead.

@lanl-ompi (Contributor)

Test FAILed.


@ggouaillardet (Contributor)

i previously removed the data check (i guess i was too lazy to make it work in a non-MPI environment),
so make check currently does not work.

this is fixed in ggouaillardet/ompi@a1235b6bb454e61e4d9134cda0c0288cb713acda

test/datatype/external32 needs a lot of duplication since MPI_xxx cannot be invoked unless MPI_Init was invoked. i revamped that part to avoid code duplication in ggouaillardet/ompi@69bd838d3b1caa59363c8dba450ebdffd7c22ffa

i will close #1399 for now.

note that even with these two commits, the branch has conflicts and cannot be merged.
i fixed that in #1509

@bosilca closed this Mar 30, 2016
@bosilca deleted the topic/external_support branch March 30, 2016 19:50