Pulse processing upgrades #154

Merged: 13 commits, May 3, 2019

Conversation

JelleAalbers (Member)

  • Add zero_outside_bounds, which sets record waveforms to zero where they do not represent real data. For example, for a 100-sample pulse stored in a 110-sample record, it zeroes the final ten samples. Not doing this (and instead relying on the DAQ + strax.baseline to do it) may have contributed to some of the problems with the data reduction algorithms we saw during the latest DAQ test. (A minimal sketch of the idea follows this list.)
  • Add filter_records, which allows linear filtering of waveforms, e.g. to improve the signal/noise ratio or sharpen the pulses for better data reduction. It accounts for the fact that pulses are split over multiple records by applying the convolution several times with different filter shifts, for the parts of the record where this is needed. The convolution is applied outside of numba, using scipy.ndimage.convolve1d, since I was too lazy to code up my own convolution loop + tests. Note that at least one old version of scipy has a bug where this segfaults at extreme shifts of the filter (it works fine in v1.2.1, but I had to upgrade). (See the second sketch below.)
  • Replace cut_outside_hits with a somewhat faster version. Unfortunately, the function remains without tests for the moment. (See the third sketch below.)
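
A minimal sketch of what zero_outside_bounds does, not strax's actual implementation: any sample at or beyond a record's valid length does not represent real data and is set to zero. The field names ('data', 'length') follow the strax record dtype; the function name here is illustrative.

```python
import numpy as np

def zero_outside_bounds_sketch(records):
    """Zero waveform samples beyond each record's valid length.

    For a 100-sample pulse stored in a 110-sample record, r['length'] is 100,
    so the final ten samples of r['data'] are set to zero.
    """
    for r in records:
        r['data'][r['length']:] = 0
```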
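A hedged sketch of the convolution step in filter_records, using scipy.ndimage.convolve1d as the PR describes. The real function also re-applies the convolution with shifted filter origins near record boundaries, to handle pulses split over multiple records; that bookkeeping is omitted here, and the helper name is ours.

```python
import numpy as np
from scipy.ndimage import convolve1d

def filter_waveforms_sketch(waveforms, impulse_response, origin=0):
    """Convolve each row of a (n_records, n_samples) float array with the
    filter. `origin` shifts the filter; convolve1d only accepts shifts of
    roughly half the filter length, and some old scipy versions segfault
    at extreme shifts (v1.2.1 works fine).
    """
    return convolve1d(waveforms, impulse_response,
                      axis=1, mode='constant', origin=origin)

# Example: a 5-tap smoothing filter applied to two 10-sample waveforms.
wv = np.random.randn(2, 10)
ir = np.ones(5) / 5
smoothed = filter_waveforms_sketch(wv, ir)
shifted = filter_waveforms_sketch(wv, ir, origin=1)  # shifted variant
```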
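Finally, a simplified single-waveform sketch of the behavior of cut_outside_hits (the strax version operates on record arrays and takes hits found elsewhere; the parameter names here are illustrative): every sample not inside a hit window, widened by the left/right extensions, is zeroed.

```python
import numpy as np

def cut_outside_hits_sketch(waveform, hit_intervals,
                            left_extension=2, right_extension=2):
    """Zero every sample outside the hit windows, where each hit
    interval (start, end) is widened by the left/right extensions."""
    keep = np.zeros(len(waveform), dtype=bool)
    for start, end in hit_intervals:
        keep[max(0, start - left_extension):end + right_extension] = True
    out = waveform.copy()
    out[~keep] = 0
    return out
```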
