Fix to issue #696 (#710)

Merged: 11 commits into nest:master on Apr 25, 2017
5 participants

Conversation
Contributor

kappeld commented Apr 12, 2017

This is a fix to issue #696. The connection function of the ConnectionManager has been updated to correctly handle MUSIC proxy nodes. A regression test has been added.

Contributor

mhoff commented Apr 13, 2017

The newly added test case apparently fails when run with make test. However, on my local machine all MPI-related tests fail, as I obviously have not properly configured mpirun in .nestrc. Thus I cannot really debug what is going on with my test case.

@heplesser @jougs Could you elaborate on how to properly define mpirun in .nestrc for mpirun (Open MPI) 1.6.5? Simply uncommenting the existing lines did not work:

/mpirun
[/integertype /stringtype]
[/numproc     /slifile]
{
 () [
  (mpirun -np ) numproc cvs ( ) statusdict/prefix :: (/bin/nest )  slifile
 ] {join} Fold
} Function def

When running make test this yields:

Running  /home/hoff/.local/bin/sli /home/hoff/.local/share/doc/nest/mpi_selftests/fail/crash_distributed_rank_invariant_collect_assert_or_die.sli
SLI v2.12.0 (C) 2004 The NEST Initiative
N_MPI: 1
Apr 12 17:46:43 mpirun [Error]: ArgumentType
    The type of the first parameter did not match the argument(s) of this function.
    Candidates for +mpirun+ are:
    integertype    stringtype     calls <proceduretype>
<end of output>
Test time =   0.12 sec

It might also be useful to make this more transparent for all users (?).

Contributor

heplesser commented Apr 18, 2017

@mhoff The call signature of the SLI mpirun command has changed. If you have an old .nestrc file in your home directory, it will not be overwritten or updated automatically; you either need to fix it by hand or delete the old .nestrc file and then run NEST to install a new one. The mpirun section should look like this (three instead of two arguments):

% The command mpirun tells NEST how to assemble a string, which
% represents a valid command for calling an MPI-enabled executable in a
% distributed fashion. The details of this string depend on your MPI
% library and the desired communication interface. Please modify the
% command according to your needs.

/mpirun
[/integertype /stringtype /stringtype]
[/numproc     /executable /scriptfile]
{
 () [
  (mpirun -np ) numproc cvs ( ) executable ( ) scriptfile
 ] {join} Fold
} Function def
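
For reference, a quick sketch of how this three-argument version is invoked from SLI and the command string it assembles; the paths and file names below are placeholders, not taken from this thread:

2 (/home/user/.local/bin/nest) (my_test.sli) mpirun
% leaves (mpirun -np 2 /home/user/.local/bin/nest my_test.sli) on the stack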

heplesser requested review from stinebuu and jougs Apr 18, 2017

Contributor

mhoff commented Apr 18, 2017

I think I see the problem now. I used the pymusic interface to write the counterpart of the NEST node, but the test suite does not support Python-based nodes in the MUSIC tests.

We could change this by supporting Python-based nodes, e.g. with the following change to do_tests.sh.in:

diff --git a/testsuite/do_tests.sh.in b/testsuite/do_tests.sh.in
index 5c0e2b5..6b8c9f4 100755
--- a/testsuite/do_tests.sh.in
+++ b/testsuite/do_tests.sh.in
@@ -594,6 +594,10 @@ if test "x$(sli -c 'statusdict/have_music :: =')" = xtrue ; then
         sli_files=$(grep '\.sli' ${music_file} | sed -e "s#args=#${TESTDIR}#g")
         sli_files=$(for f in ${sli_files}; do if test -f ${f}; then echo ${f}; fi; done)
 
+        # Collect the list of python files from the .music file.
+        py_files=$(grep '\.py' ${music_file} | sed -e "s#args=#${TESTDIR}#g")
+        py_files=$(for f in ${py_files}; do if test -f ${f}; then echo ${f}; fi; done)
+
         # Check if there is an accompanying shell script for the test.
         sh_file=${TESTDIR}/$(basename ${music_file} .music).sh
         if test ! -f ${sh_file}; then unset sh_file; fi
@@ -608,7 +612,7 @@ if test "x$(sli -c 'statusdict/have_music :: =')" = xtrue ; then
         printf '%s' "  Running test '${test_name}' with $np $proc_txt... "
 
         # Copy everything to the tmpdir.
-        cp ${music_file} ${sh_file} ${sli_files} ${tmpdir}
+        cp ${music_file} ${sh_file} ${sli_files} ${py_files} ${tmpdir}
         cd ${tmpdir}
 
         # Create the runner script

Alternatively, we could rewrite this test to not use Python-based nodes.

Which solution is preferred?
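
To make the intent of the grep above concrete, here is a rough, purely illustrative sketch of a MUSIC configuration in which a Python-based receiver would show up in an args= line; the application names, file names, and port names are made up and not taken from this PR:

[nest_sender]
  binary=nest
  args=test_sender.sli
  np=1
[py_receiver]
  binary=python
  args=test_receiver.py
  np=1
nest_sender.spikes_out -> py_receiver.spikes_in [1]

With the diff above, the args=test_receiver.py line would be picked up and the script copied into the test's tmpdir alongside the .sli files.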

Contributor

jougs commented Apr 18, 2017

@mhoff: I would prefer if the test was written in pure SLI as this guarantees that the tests also run in situations in which Python is not available. Feel free to ask if you require assistance with this.

I'm currently out of office, so please don't expect an immediate response. I'll do my best to still answer quickly, though.

Contributor

mhoff commented Apr 19, 2017

I understand.

The test is functional now. I reduced the receiver to a very simple SLI-based node. The functionality is now reduced to setting up the connection and sending some spikes; the only assertion is that the program does not crash. This should suffice as a primitive regression test for the mapping problem.
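
For illustration, a minimal sketch of what such a simple SLI-based receiver could look like; this is not the actual test file added in this PR, and the port name, channel, and simulation time are assumptions:

% minimal receiver sketch: open a MUSIC event input port and simulate;
% the test passes as long as this does not crash
/music_event_in_proxy Create /mip Set
mip << /port_name (spikes_in) /music_channel 0 >> SetStatus
100.0 Simulate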

@stinebuu

This looks good to me; I just have a request for a model change.

@stinebuu

Excellent, thanks!

Contributor

mhoff commented Apr 21, 2017

@jougs Are you satisfied with the changes?

jougs approved these changes Apr 25, 2017

Yes, this looks very good! Many thanks again.

Contributor

jougs commented Apr 25, 2017

@heplesser: can you please pull master into your subnet-removal branch and remove subnet according to current standards once this is merged? I'll pull from there then. Thanks!

jougs merged commit b8ad1a5 into nest:master Apr 25, 2017

1 check passed

continuous-integration/travis-ci/pr The Travis CI build passed