Add noise scaling tests. #232

Merged
merged 5 commits into master from noise_scaling_test on Jul 28, 2020

Conversation

@TorkelE (Member) commented Jul 23, 2020

At the risk of overdoing the tests...

This basically tests the noise scaling functionality recently added in MTK, but "in action": it simulates a network at different levels of noise scaling and checks that the spread of the resulting distributions scales accordingly.
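
To make the idea concrete, here is a minimal sketch of such a scaling check, written against a plain scalar SDE with StochasticDiffEq rather than the Catalyst-generated network the PR actually uses; the model, η values, and tolerances are invented for illustration. For this Ornstein–Uhlenbeck process the stationary standard deviation is proportional to the noise amplitude η, so the measured spreads should scale linearly with it:

```julia
using StochasticDiffEq, Statistics

f(u, p, t) = 10.0 - u   # drift toward the fixed point u = 10
g(u, p, t) = p[1]       # additive noise with amplitude η = p[1]

# Simulate the same model at three noise-scaling levels and record the
# standard deviation of the (approximately) stationary part of each path.
etas = [0.1, 1.0, 10.0]
stds = map(etas) do η
    prob = SDEProblem(f, g, 10.0, (0.0, 1000.0), [η])
    sol = solve(prob, EM(); dt = 0.01, saveat = 1.0)
    std(sol.u[500:end])  # discard the transient before estimating the spread
end

# The stationary std of this process is η / sqrt(2), so the ratios of the
# measured stds should match the ratios of the η values.
@assert isapprox(stds[2] / stds[1], etas[2] / etas[1]; rtol = 0.2)
@assert isapprox(stds[3] / stds[2], etas[3] / etas[2]; rtol = 0.2)
```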

@TorkelE (Member, Author) commented Jul 23, 2020

On a side note, I have been trying to figure out why there are occasional crashes in the tests. I have been rerunning the tests in a long for loop until Julia suddenly just stops.

My hypothesis has been that there is some obscure case in the SDE tests where the solution goes negative and causes an error (this should have been accounted for when the tests were constructed). However, when I run the SDE tests a large number of times, eventually get a crash, and then investigate the parameters that yielded the crash, they turn out to be fine. When I got a crash in runtests.jl it also seemed rather random. I am still not sure what is happening.

@ChrisRackauckas (Member)

https://travis-ci.com/github/SciML/Catalyst.jl/builds/176849991#L526

Is that the "crash" you're talking about? Error reports should be more specific, because just saying "crash" could mean anything. If it's this, then yeah, something went negative, and the next MTK release should handle that.

@TorkelE (Member, Author) commented Jul 23, 2020

OK, that Travis report is more helpful; yes, that is what I guessed went wrong.

The reason I used the very ambiguous "crash" was that when I wrapped the runtests file, or the solve_SDE test file, in a for loop, Juno would after a while simply exit mid-run. There was no error message or anything similar to hint at what had happened, but obviously something had gone wrong.

@isaacsas (Member)

It might be good to pare down the sheer number of networks we test somewhat, though. Tests currently take a long time, with some test files allocating > 50 GB of memory.

@ChrisRackauckas (Member)

We can split tests too. But yes, using one network to test many features is usually a good idea unless we're testing network generation itself.

@isaacsas (Member)

Or maybe there is a way to set up a standard vs. full set of tests, where the standard set is what runs normally and is more compact, but we can occasionally trigger the full set to run more networks.

@ChrisRackauckas (Member)

If we split test groups, then we could do it like OrdinaryDiffEq, where the full tests are only run on CI and a subset is run locally.
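
For reference, a minimal sketch of that layout, assuming the usual SciML convention of a GROUP environment variable in runtests.jl (the group names and make_model.jl are placeholders; solve_SDE.jl is the test file mentioned above):

```julia
# runtests.jl: CI sets ENV["GROUP"] per job ("All", "Core", "SDE", ...),
# while a plain local `] test` falls through to the compact default group.
using SafeTestsets

const GROUP = get(ENV, "GROUP", "Core")

if GROUP == "All" || GROUP == "Core"
    @safetestset "Model construction" begin include("make_model.jl") end
end

if GROUP == "All" || GROUP == "SDE"
    @safetestset "SDE solving" begin include("solve_SDE.jl") end
end
```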

@TorkelE (Member, Author) commented Jul 23, 2020

Trying to reduce the rather excessive number of tests would be good. They were quite useful when written, since they caught a few things, but at this stage they mostly extend the test time. It would be a shame to remove them, though; they might come in useful.

@TorkelE (Member, Author) commented Jul 23, 2020

On closer look, it turned out that this error was actually in the new tests I added, and that these had a small chance of going negative. I have modified them to ensure that the chance of this happening is practically zero.
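
As an aside, one alternative guard against such rare negative excursions, sketched here only for illustration (this is not what the PR does, and it assumes the common-interface isoutofdomain option applies to adaptive SDE solvers):

```julia
using StochasticDiffEq

f(u, p, t) = 10.0 - u
g(u, p, t) = 0.5 * sqrt(abs(u))  # state-dependent noise can push u below 0

prob = SDEProblem(f, g, 10.0, (0.0, 100.0))
# Reject any proposed step that leaves the positive domain; this needs an
# adaptive solver such as SOSRI so that rejected steps can be retried.
sol = solve(prob, SOSRI(); isoutofdomain = (u, p, t) -> u < 0)
```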

@TorkelE closed this Jul 25, 2020
@TorkelE reopened this Jul 25, 2020
@TorkelE closed this Jul 25, 2020
@TorkelE reopened this Jul 25, 2020
@TorkelE (Member, Author) commented Jul 28, 2020

OK, I think I might finally be picking up a trail on what is going on here. Hopefully I will have it nailed down soon.

@TorkelE (Member, Author) commented Jul 28, 2020

OK, I think I have narrowed it down to something odd going on with certain SDE solvers: SciML/StochasticDiffEq.jl#344

@isaacsas (Member)

@TorkelE Looks like tests have now passed. OK to merge?

@ChrisRackauckas (Member)

Yeah, it should be good.

@ChrisRackauckas merged commit d2683ce into master on Jul 28, 2020
@ChrisRackauckas deleted the noise_scaling_test branch on Jul 28, 2020 at 23:19