[BUG] - Update standard for compute n_samples for simulations #250
Conversation
I tested this out, and this fixes the number of samples like you described. However, it seems to increase cycle edge artifacts. Running this pre/post this PR:
import numpy as np
from neurodsp.sim import sim_oscillation
from neurodsp.plts import plot_time_series

# Simulate a 1 second, 30 Hz oscillation, sampled at 1000 Hz
n_seconds = 1
fs = 1000
freq = 30
sig = sim_oscillation(n_seconds, fs, freq)

# Plot the first 100 ms, to inspect the cycle edges
times = np.arange(0, len(sig)/fs, 1/fs)
plot_time_series(times, sig, xlim=(0, .1))
Ahhh. Good catch! These damn simulations - there is always something. It turns out, the bugs here were introduced back the first time I updated the whole sampling thing - in trying to reduce and consolidate, I removed some important stuff (oops).

In order to avoid going around in circles any more on these kinds of concatenation issues, I've added some tests that should (in at least some cases) detect concatenation issues. This PR is now updated accordingly.

@ryanhammonds - can you stress test this again? Run a series of various oscillation sims and make sure there are no remaining concatenation issues and/or length discrepancies. Please check different cycle types too (I've mostly done quick checks with sine waves).
Awesome, thanks for the review and for comparing these things here @ryanhammonds! I made a small tweak to the sim settings to coordinate a little with #255, but it's super straightforward, so I'm going to merge this in now!
This is an update to the convention for calculating the expected number of samples for a simulation, changing it to round up rather than down.
This, in part, reverts a prior update that tried to consolidate on rounding down:
1439d48#diff-0c31582faa78b9253161cdc956a8ac5ba1da385c1411116af3c7861315e56b7c
The original reason for the update was that the simulated signal and the simulated time vector could sometimes end up different lengths (since we weren't super consistent about computing the expected number of cycles). When consolidating, I set it to round down. However, that had an unexpected effect (uncaught by the prior tests): some simulated signals could be shorter than expected.
For example, on main right now, a simulation with these settings would be expected to create an array of 1000 samples. However, since the number of samples per cycle is not an integer for this frequency, rounding down and concatenating 7 cycles leads to a signal length of 994 samples, which is shorter than expected (which creates issues for concatenation, etc.).
We used to generally round up, so this is basically just stepping back to that - and to being consistent about simulating the same number of samples every time.
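The shortfall described above can be checked with some quick arithmetic. This is a sketch of the rounding issue, not the library's code; the frequency of 7 Hz is an assumption, inferred from the description of 7 cycles over a 1 second, 1000 Hz signal:

```python
import math

# Settings assumed from the example above: 7 cycles in 1 s at 1000 Hz,
# so fs / freq is not an integer number of samples per cycle
n_seconds, fs, freq = 1, 1000, 7

n_expected = int(n_seconds * fs)            # 1000 samples expected

# Rounding the per-cycle length down loses samples when concatenating:
n_cycle_floor = int(math.floor(fs / freq))  # 142 samples per cycle
n_floor = freq * n_cycle_floor              # 7 * 142 = 994 < 1000

# Rounding up over-generates instead, so the concatenated signal can be
# trimmed back to the expected length rather than coming up short:
n_cycle_ceil = int(math.ceil(fs / freq))    # 143 samples per cycle
n_ceil = freq * n_cycle_ceil                # 7 * 143 = 1001 >= 1000

print(n_expected, n_floor, n_ceil)
```

Rounding up never undershoots, which is why trimming down to the expected length is safe while padding a too-short signal is not.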
This PR also now updates and extends our testing, to make sure simulations proceed as expected, and fixes a bug in create_times that led to the wrong time definition when using the start_val input.
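As a rough illustration of the start_val behavior, here is a hypothetical helper (for illustration only - not neurodsp's actual create_times implementation): the key property is that the offset should shift the time values without changing the number of samples.

```python
import numpy as np

def make_times(n_seconds, fs, start_val=0.):
    """Sketch of building a time vector with an offset start value.

    Hypothetical helper, not neurodsp's create_times. start_val shifts
    the time values but leaves the number of samples unchanged.
    """
    # Round up, matching the sample-count convention in this PR
    n_samples = int(np.ceil(n_seconds * fs))
    return start_val + np.arange(n_samples) / fs

times = make_times(1, 1000, start_val=0.5)
print(len(times), times[0])
```

Computing the sample count first and then adding the offset avoids the failure mode where a nonzero start shrinks the vector (as it would if the offset were passed as the start of a fixed-end range).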