This PR adds `zero_outside_bounds`, which sets record waveforms to zero where they do not represent real data. For example, for a 100-sample pulse stored in a 110-sample record, it zeroes the final ten samples. Not doing this (and instead relying on the DAQ + `strax.baseline`
to do it) may have contributed to some problems with the data reduction algorithms we saw during the latest DAQ test.

It also adds `filter_records`
, which allows linear filtering of waveforms, e.g. to improve signal/noise or sharpen the pulses for better data reduction. It accounts for the fact that pulses are split over multiple records by applying the convolution multiple times with different filter shifts (for the parts of the record where this is needed). The convolution is applied outside of numba, using `scipy.ndimage.convolve1d`
, since I was too lazy to code up my own convolution loop + tests. Note that at least one old version of scipy has a bug where this segfaults at extreme shifts of the filter (it works fine in v1.2.1, but I had to upgrade).

Finally, this PR replaces `cut_outside_hits` with a somewhat faster version. That function remains without tests for the moment, unfortunately.
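To make the record-boundary issue concrete, here is a small sketch (not the actual strax implementation; the record layout is invented for illustration) of why filtering each record independently goes wrong at the edges, and how `scipy.ndimage.convolve1d`'s `origin` argument can slide the filter to help recompute those boundary samples:

```python
import numpy as np
from scipy.ndimage import convolve1d

# Toy stand-in for pulse data: one pulse split over two fixed-length
# records (this layout is invented for the sketch, not strax's actual
# record dtype).
samples_per_record = 8
pulse = np.zeros(2 * samples_per_record)
pulse[6:10] = 1.0  # square pulse straddling the record boundary

kernel = np.array([0.25, 0.5, 0.25])  # simple 3-tap smoothing filter

# Filtering the un-split waveform gives the "ground truth":
full = convolve1d(pulse, kernel, mode="constant")

# Filtering each record independently is correct in the interior but
# wrong at the record edges, where the kernel needs samples from the
# neighbouring record:
records = pulse.reshape(2, samples_per_record)
per_record = convolve1d(records, kernel, axis=1, mode="constant")
bad = np.flatnonzero(~np.isclose(per_record.ravel(), full))
print(bad)  # → [7 8]: only the samples at the record boundary differ

# `origin` slides the kernel relative to the data; applying the
# convolution again with suitable shifts is one way to redo just those
# boundary samples against the neighbouring record's data:
shifted = convolve1d(pulse, kernel, mode="constant", origin=1)
```

The shifted pass produces the same filtered waveform translated by one sample, which is the building block for patching up the edge region.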