Description
The running time of STAMP varies hugely depending on the size of the data. For some data sizes it even appears to get stuck, without ever showing the progress bar.
For example, computing the distance profile between a dataset and a query of length 34, with a window size of 13, I observe the following:
an average of 1 it/s for a dataset of length 620579
an average of 0.5 it/s for a dataset of length 620578
an average of 0.3 it/s for a dataset of length 620577
an average of 2.5 it/s for a dataset of length 620576
an average of 0.04 it/s for a dataset of length 620574
For some dataset sizes, stamp gets stuck without an error; more probably, it takes so long to reach the point of showing the progress bar that I do not have the patience to wait and want to interrupt it. I must then hard-kill RStudio to stop the evaluation.
In other words, changing the size of the dataset by just one unit has a large, nonlinear, and unintuitive effect on the execution time of stamp. For some data sizes the function gets stuck without an error.
Working examples
I could reproduce this behavior with random data.
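A minimal reproduction sketch along these lines, assuming the tsmp package and random input (passing the query as a second series and the window_size argument follow my reading of the stamp() interface; the specific lengths are the ones from the timings above):

```r
library(tsmp)

set.seed(1)
query <- rnorm(34)  # query of length 34, as in the report

# Dataset lengths that differ by one unit but behave very differently
for (n in c(620579, 620578, 620577, 620576, 620574)) {
  data <- rnorm(n)
  cat("dataset length:", n, "\n")
  print(system.time(stamp(data, query, window_size = 13)))
}
```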
The bottleneck is stats::fft(); this may have something to do with the FFT algorithm. Padding the data with zeroes (by the right amount) might solve this. Need to look into it.
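If the FFT is indeed the culprit, a quick way to check it, and a sketch of the padding idea, is to time stats::fft() directly on one of the slow lengths and on the next "fast" length returned by stats::nextn(). The fix is only a hypothesis at this point:

```r
# 620574 = 2 * 3 * 293 * 353: the large prime factors make a
# mixed-radix FFT slow, which would explain the one-unit jumps.
n <- 620574
x <- rnorm(n)
system.time(fft(x))

n2 <- nextn(n, factors = c(2, 3, 5))  # next length with only factors 2, 3, 5
x2 <- c(x, rep(0, n2 - n))            # pad with the right amount of zeroes
system.time(fft(x2))                  # should be fast despite being longer
```

Note that if padding is applied inside the distance-profile computation, the entries corresponding to the padding would have to be discarded afterwards.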
Expected behavior
I would expect the running time to get shorter as the data decreases in size.