Use a nudge factor to remove repeats #672
Conversation
Sometimes simple is better. Instead of using a derivative estimate, we now simply check at t + dt * repeat_nudge to avoid repeated zeros, with a default of 1//100, which seems to be a nice balance between numerical stability and keeping most of the interval an eventful zone. Fixes SciML/DifferentialEquations.jl#758
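The idea can be sketched outside the actual package. Below is a minimal, hypothetical Python event detector (the names `find_event` and `detect` are illustrative, not the OrdinaryDiffEq API): after an event lands exactly at a step endpoint, a naive detector that re-checks the condition at t would report the same zero again; starting the sign check at t + dt * repeat_nudge skips the just-handled zero while leaving most of the step eventful.

```python
import math

def find_event(cond, t_lo, t_hi, tol=1e-12):
    """Bisection search for a zero of `cond` on [t_lo, t_hi]; None if no sign change."""
    f_lo, f_hi = cond(t_lo), cond(t_hi)
    if f_lo == 0.0:
        return t_lo          # the condition is already zero at the interval start
    if f_lo * f_hi > 0:
        return None          # no sign change: no event in this step
    while t_hi - t_lo > tol:
        mid = 0.5 * (t_lo + t_hi)
        if f_lo * cond(mid) <= 0:
            t_hi = mid
        else:
            t_lo, f_lo = mid, cond(mid)
    return 0.5 * (t_lo + t_hi)

def detect(cond, t0, t_end, dt, repeat_nudge=0.0):
    """Step through [t0, t_end]; within each step, start the sign check at
    t + dt * repeat_nudge so a zero found at the previous step's right
    endpoint is not reported twice."""
    events, t = [], t0
    while t < t_end:
        t_next = min(t + dt, t_end)
        root = find_event(cond, t + dt * repeat_nudge, t_next)
        if root is not None:
            events.append(root)
        t = t_next
    return events

# The condition crosses zero exactly at t = 1.0, a step endpoint for dt = 0.5.
cond = lambda t: t - 1.0
naive  = detect(cond, 0.0, 2.0, 0.5, repeat_nudge=0.0)     # reports t = 1.0 twice
nudged = detect(cond, 0.0, 2.0, 0.5, repeat_nudge=1 / 100)  # reports it once
print(len(naive), len(nudged))  # → 2 1
```

The nudge trades a sliver of the interval (1% of the step, by default) for robustness: any genuine new zero inside (t + dt/100, t + dt] is still found.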
YingboMa left a comment:
It's impressive that this works so well.
Co-authored-by: David Widmann <devmotion@users.noreply.github.com>
BTW the GPU timeouts are caused by CUDA 3.3 but fixed on master: https://julialang.slack.com/archives/C689Y34LE/p1623939645188500 AFAIK a new release will be available soon (possibly today).
Awesome, those were confusing me, but I at least figured out they were unrelated to this PR (those tests don't exercise callbacks).
Fixes SciML/DifferentialEquations.jl#647
Fixes SciML/DifferentialEquations.jl#724
Fixes SciML/DifferentialEquations.jl#601
Fixes SciML/DifferentialEquations.jl#642
Fixes #599