NEST encounters bad_alloc exception when changing tic length #644
The following causes a bad_alloc exception:
Interestingly, a number of examples and tests, e.g.
See also #643.
@heplesser before I create a pull request, can you please have a look at the text below, in particular my remarks at the end, and let me know what you think. Perhaps we can also discuss this in Oslo during the hackathon.
@apeyser many thanks for your hints and the debug session!
The problem is the following:
This will set the following max limits (LIM_MAX):
During the instantiation of the DelayCheckers, the tics value of the min_delay_ object is preset to positive infinity (which is not intuitive) by calling Time::pos_inf(). This means:
DelayChecker::Calibrate destroys this infinity value when TimeConverter is called. The problem is not visible for TICS_PER_STEP >= 1000: a boundary check hides the issue when a new Time object is instantiated. See the nested macro below: the absolute infinity value contained in t.t won't be smaller than LIM_MAX.ms.
For TICS_PER_STEP < 1000 this condition no longer holds. In the example above, it becomes:
... causing a new tics value of 115292150460684704 after calibration. This is no longer an infinity value, and the if statement below in ConnectionManager::update_delay_extrema_, which checks against infinity, will fail!
Based on the invalid tics value, an inappropriate steps value is calculated, which in turn is used to resize the moduli_ array at simulation start, causing the bad_alloc.
Some general remarks:
In my opinion, the TimeConverter belongs to the Time class; I would merge them.
The Time class itself is very hard to read and debug. For example: nested macros in initializers would be better as functions; what looks like constants is sometimes overwritten; and some of the names are ambiguous and not descriptive (min_delay, for example, is used for different things).
That is to say, I suggest a small, conservative refactoring.
I think the issue is that there exist a large number of time units: steps (int), ms (float), tics (int), delay (naked int)... There should be a universal unit, and all inputs should be converted into it immediately. It should be a very simple class without all the object-oriented complexity.
It needs to be transparent and fast; the previous iteration was eating 50% of simulation time constructing and handling time objects.
Unfortunately, this is a lot of work.
@gtrensch @apeyser Thank you very much for your detective work! I agree that the best fix for now is to modify the
The reason for initializing with positive/negative infinity is that it makes comparisons easy; otherwise, one would always have to check against invalid values.
@gtrensch Would you create a PR?