
ROSS doesn't clearly indicate when g_tw_ts_end has been hit #8

Closed
JohnPJenkins opened this issue Oct 29, 2014 · 8 comments

Comments

@JohnPJenkins

The g_tw_ts_end variable sets a cap on the global timestamp of the simulation, at which point the simulation will stop running.

There is no clear indication in the output, however, that this has happened (or I am overlooking it if it is there). This causes confusion if the user expects the simulation to end normally by completing all pending events, but it instead hits g_tw_ts_end.

@JohnPJenkins
Author

As an interim solution in CODES, we've defined our own codes_event_new() wrapper that asserts that the absolute time of the event is < g_tw_ts_end. This isn't appropriate for ROSS since many models deliberately queue events after g_tw_ts_end; we would probably want configurable behavior there (either do nothing, print a warning, or assert).

@carns
Contributor

carns commented Oct 29, 2014

Alternatively, model writers who expect a simulation to run to g_tw_ts_end might be confused if ROSS stops because it ran out of events to process.

The minimal solution might be to have ROSS display which stopping condition it used as part of its normal output. If it were also exposed as an output argument to tw_run() or tw_end(), model writers could add asserts in main() if they wanted to be certain it ended one way or the other.

@gonsie
Member

gonsie commented Oct 30, 2014

ROSS jumps to g_tw_ts_end when there are no more events to process, due to an optimization (ROSS GVTs jump to the global minimum time, thus eliminating small, uneventful time windows). If there are no more events in the system... there can never be any more events in the system.

Would adding clearer documentation surrounding g_tw_ts_end be helpful here?

@JohnPJenkins
Author

Clearer documentation is always helpful :).

Is it possible for ROSS to determine whether or not events exist past g_tw_ts_end? This knowledge is sufficient for our needs: if an event exceeds g_tw_ts_end, then we know the stopping condition of the simulation, and likewise if no events exceed g_tw_ts_end.

@gonsie
Member

gonsie commented Oct 30, 2014

ROSS does not schedule events with timestamps beyond g_tw_ts_end (optimization!). Would a warning at this point be useful? It will not show up when the simulation ends; rather, it appears when the event with the bad timestamp is sent.

@JohnPJenkins
Author

I think a warning is not the right way to think about it, as we could have steady-state simulations that we want to cut off after a certain time (the protocol simulations Shane has been working on are an example of this). Rather, we simply want to know whether or not an event passed the g_tw_ts_end threshold. Preferably this would be printed along with the other summary info ROSS prints out.

@gonsie
Member

gonsie commented Jan 23, 2015

I am working on this now.

Would the statistic "total events scheduled past end time" be useful? When running in optimistic mode, this number may become somewhat inflated (due to rolled back events sending events past the end time).

@JohnPJenkins
Author

For my purposes, a binary yes/no would be sufficient, but I understand that the concept may be loose due to rollbacks, so a count is fine.

@gonsie gonsie closed this as completed in 108b6ad Jan 23, 2015
@gonsie gonsie reopened this Jan 23, 2015
@gonsie gonsie closed this as completed Jan 23, 2015
laprej added a commit that referenced this issue Aug 31, 2015
laprej added a commit that referenced this issue Sep 1, 2015
laprej added a commit that referenced this issue Sep 1, 2015