Precision setting improvement #446

Merged
merged 8 commits into nest:master from sepehrmn:precision_setting_improvement on Sep 28, 2016

Conversation

4 participants
@sepehrmn
Contributor

sepehrmn commented Aug 3, 2016

If precise models exist and the user sets "precise_times" to true, everything works as expected. But if precise models exist and "precise_times" is not set, there are a few problems:

1st problem: The precision value set by the user is ignored and precision is forced to 15. NEST does print the warning message below, including the value it has set, but it does not explain why the user-specified "precision" was ignored or how to set it to something other than 15. This pull request ensures that a user-specified precision is never ignored.

Precise neuron models exist: the property precise_times of the
spike_detector with gid 2 has been set to true, precision has been set to
15.

2nd problem: The default precision of recording_device is 3, but if precise models exist and "precise_times" is not set, spike_detector deviates from this and sets it to 15. As pointed out in #431, it makes sense to use a higher precision for precise models, so this pull request ensures that, unless the user specifies a precision, it is set to 3 when no precise models exist and to 15 when they do. Currently, precision can remain at 3 even when precise models exist, which is inconsistent.
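A minimal sketch of the intended precedence (illustrative only; the function and parameter names are not taken from the actual NEST sources):

// Sketch only: a user-specified precision always wins; otherwise the
// default depends on whether precise models exist in the network.
long
resolve_precision( bool precise_models_exist,
  bool precision_user_set,
  long user_precision )
{
  if ( precision_user_set )
  {
    return user_precision; // never ignore a user-specified precision
  }
  return precise_models_exist ? 15 : 3; // 15 with precise models, 3 otherwise
}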

@sepehrmn


Contributor

sepehrmn commented Aug 3, 2016

I know why it fails and will push a fix, probably by changing what "user_set_precise_times" stores. Please comment if you have read the changes and agree or disagree.

models/spike_detector.h
@@ -183,6 +183,7 @@ class spike_detector : public Node
Buffers_ B_;
bool user_set_precise_times_;
+ long user_set_precision_value;


@heplesser

heplesser Aug 3, 2016

Contributor

user_set_precision_value is a private member. Therefore, the name should end in a _.

models/spike_detector.cpp
+
+ device_.set_precise( true, precision );
+
+ if ( user_set_precise_times_ != true )


@heplesser

heplesser Aug 3, 2016

Contributor

One should not compare booleans to true or false; just write if ( not user_set_precise_times_ ).

@heplesser


Contributor

heplesser commented Aug 3, 2016

@sepehrmn I understand your goal now. Your code is mostly fine, but I would still structure it somewhat differently, so that almost all responsibility is placed in the RecordingDevice class, which handles precision.

First, I would add in RecordingDevice a constant member default_precision_ with value 3 and use that to initialize the precision_ member. Then, I would add to RecordingDevice a method

bool uses_default_precision() const { return P_.precision == default_precision_; }

In spike_detector, you can then do something like

if ( device_.uses_default_precision() )
{
  device_.set_precision( true, 15 );
}

This keeps responsibilities clearer between classes.
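A minimal sketch of the RecordingDevice side of this suggestion, with simplified, illustrative class and member names rather than the actual NEST sources:

// Sketch: the device owns a constant default precision and can report
// whether the current precision still equals that default.
class RecordingDeviceSketch
{
public:
  RecordingDeviceSketch()
    : precision_( default_precision_ ) // precision starts at the default of 3
  {
  }

  bool uses_default_precision() const
  {
    return precision_ == default_precision_;
  }

  void set_precision( long p )
  {
    precision_ = p;
  }

private:
  static const long default_precision_ = 3; // default number of digits written
  long precision_;                          // current number of digits written
};

As the next comment points out, comparing against the default cannot distinguish an explicit user-set 3 from the untouched default of 3, which is why the later version queries a separate user-set flag instead.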

@sepehrmn


Contributor

sepehrmn commented Aug 4, 2016

Thanks @heplesser !

There is a small problem with using

bool uses_default_precision() const { return P_.precision == default_precision_; }

If "precise_times" is not set by the user but precision is explicitly set to 3 (which is also the default value), then without adding another variable to the spike_detector it would be impossible, in this special case, to tell that the user specified the value (3) and to keep it. Adding another variable would defeat the purpose of moving the precision handling into the recording_device.

I have a new solution. It works, but I will double-check it and run clang-format on it later.
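A minimal sketch of the flag-based alternative implied here, assuming a hypothetical precision_user_set flag that would be raised wherever the user sets "precision" (names are illustrative only):

// Sketch: record explicitly whether the user touched "precision", so that
// an explicit 3 can be distinguished from the untouched default of 3.
struct PrecisionStateSketch
{
  long precision;          // current precision
  bool precision_user_set; // true once the user sets "precision" explicitly

  PrecisionStateSketch()
    : precision( 3 )
    , precision_user_set( false )
  {
  }

  void set_precision( long p )
  {
    precision = p;
    precision_user_set = true; // remember that this value came from the user
  }

  bool is_precision_user_set() const
  {
    return precision_user_set;
  }
};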

models/spike_detector.cpp
+
+ else
+ {
+ device_.set_precise(true, device_.get_precision());


@heplesser

heplesser Aug 5, 2016

Contributor

Does this operation do anything? It just fetches the precision from the device and sets it again.


@sepehrmn

sepehrmn Aug 5, 2016

Contributor

Yes, this is a bit confusing. In a later version I want to replace the set_precise method with two separate methods for setting precision and precise_times. If I remove this line and precise models exist but the user has not set "precise_times", precise_times would not be set to true.
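A minimal sketch of the planned split into two independent setters (hypothetical names and signatures; the PR at this point still uses the combined set_precise( bool, long )):

// Sketch: separate setters so precise_times and precision can be changed
// independently of each other.
class DeviceSettingsSketch
{
public:
  DeviceSettingsSketch()
    : precise_times_( false )
    , precision_( 3 )
  {
  }

  void set_precise_times( bool use_precise )
  {
    precise_times_ = use_precise; // does not touch precision_
  }

  void set_precision( long precision )
  {
    precision_ = precision; // does not touch precise_times_
  }

private:
  bool precise_times_;
  long precision_;
};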

models/spike_detector.cpp
+ get_name(),
+ get_gid() ) );
+ }
+


@heplesser

heplesser Aug 5, 2016

Contributor

Remove this empty line.

@heplesser


Contributor

heplesser commented Aug 5, 2016

@sepehrmn Thanks for noticing my oversight; I had not considered the case that the user explicitly sets the precision to the default value. The separation of responsibilities between the classes is now fine.

But I think the case distinctions in spike_detector::calibrate() are now too complicated, in particular the final else if clause. If the network contains precise-spiking neurons, i.e., kernel().event_delivery_manager.get_off_grid_communication() returns true, then we must record precise times (this is to protect the user from throwing away spike time information by accident). Therefore, the only question is which precision to use, so

if ( kernel().event_delivery_manager.get_off_grid_communication()
     and not device_.is_precise_times_user_set() )
{
  const long precision = device_.is_precision_user_set() ? device_.get_precision() : 15;
  device_.set_precision( true, precision );

  std::string msg = String::compose(
    "Precise neuron models exist: the property precise_times "
    "of the %1 with gid %2 has been set to true",
    get_name(),
    get_gid() );
  if ( device_.is_precision_user_set() )
  {
    msg += ".";
  }
  else
  {
    msg += String::compose( ", precision has been set to %1", precision );
  }
  LOG( M_INFO, "spike_detector::calibrate", msg );
}

should suffice; it also avoids some code duplication.

@heplesser


Contributor

heplesser commented Aug 10, 2016

@sepehrmn It seems the checks have failed due to code-formatting problems. Could you fix those?

@heplesser


Contributor

heplesser commented Aug 11, 2016

@sepehrmn Good work! Now we just need a test, which should also illustrate the different cases you want to cover (users setting or not setting precise_times / precision, before or after simulating). Ideally, the test should be written in SLI, but since it involves checking what is written to spike-detector output files, it might be easier to write it in Python.

@sepehrmn


Contributor

sepehrmn commented Aug 13, 2016

@heplesser Many thanks! I added more tests to the existing SLI script that check the value of precision, as this implicitly tells us the number of decimal digits written to the output file. Please let me know if this suffices.

@jougs


Contributor

jougs commented Aug 14, 2016

Tests are failing due to travis-ci/packer-templates#221. Please don't merge until that is resolved.

I will check how to incorporate these changes into the upcoming nestio framework.

@heplesser


Contributor

heplesser commented Aug 15, 2016

@sepehrmn The test looks fine. 👍 subject to all tests passing (once the Travis MPI problems are solved).

@heplesser


Contributor

heplesser commented Aug 25, 2016

@jougs Could you be the second reviewer for this PR, since you are looking at it anyway in connection with NEST IO?

@gtrensch


Contributor

gtrensch commented Sep 27, 2016

Two minor notes:

  • I tested the changes in a MUSIC environment. This caused the unit test "test_neurons_handle_multiplicity.sli" to fail. This was not detected by the Travis CI build because the branch does not yet include the MUSIC test suite. The failing unit test is not related to this pull request and the problem has already been fixed on master, but to avoid unforeseen effects, merging with master is recommended.
  • If I am not mistaken, a double value can be meaningfully represented with about 15 significant digits, so digits beyond that requested via std::setprecision carry no additional readable information. The unit test uses 17. This is fine for checking that the value is stored correctly, but technically it is not meaningful and could be misinterpreted (see the example after this comment).

Despite my notes, it also looks fine to me. 👍
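A minimal standalone example of the point about significant digits (not part of the PR; the output comments show typical results):

#include <iomanip>
#include <iostream>

int
main()
{
  const double t = 1.0 / 3.0;
  // About 15 significant decimal digits of a double are meaningful to a
  // reader; 17 digits are what is needed to round-trip the value exactly,
  // but the trailing digits add no readable information.
  std::cout << std::setprecision( 15 ) << t << "\n"; // 0.333333333333333
  std::cout << std::setprecision( 17 ) << t << "\n"; // 0.33333333333333331
  return 0;
}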

@sepehrmn


Contributor

sepehrmn commented Sep 28, 2016

Thanks! Done.

@heplesser heplesser merged commit 4b0f360 into nest:master Sep 28, 2016

1 check passed

continuous-integration/travis-ci/pr The Travis CI build passed

sepehrmn added a commit to sepehrmn/nest-simulator that referenced this pull request Sep 28, 2016

sepehrmn added a commit to sepehrmn/nest-simulator that referenced this pull request Sep 28, 2016

sepehrmn added a commit to sepehrmn/nest-simulator that referenced this pull request Sep 28, 2016

sepehrmn added a commit to sepehrmn/nest-simulator that referenced this pull request Sep 29, 2016

sepehrmn added a commit to sepehrmn/nest-simulator that referenced this pull request Sep 29, 2016

@sepehrmn sepehrmn deleted the sepehrmn:precision_setting_improvement branch Nov 16, 2016

@sepehrmn sepehrmn referenced this pull request Mar 31, 2017

Closed

Spin_detector problems #698
