[ENH] PSD of Ragged Epochs #12315
Conversation
To me, ideally we would handle different-length epochs. See also: …
Coming back to the idea of starting from …: if this approach does potentially appeal to you, could you try first using #12311 with …?
Hmmm, I read #3533 and see the plethora of issues and maintenance burden that accompany supporting variable-duration epochs. I can only really think of one use case where variable epochs are comparable: some kind of steady-state task, like cuing the participant to do a repetitive grasping motion where they actually move for a variable period of time, analyzed with an FFT-based analysis. For that analysis, you could limit your data to the shortest period and take a smaller slice out of all the longer-duration trials, but this seems very wasteful of data since, as I understand it, FFT methods should be slightly noisier for shorter-duration trials but generally comparable across trials of different duration (this should be checked in an analysis, clearly).

That seems like a clear use case, whereas time-frequency, ERP, and everything else I can think of seems like it would fit within the existing epoch framework (i.e. you'd want to crop to equal lengths or interpolate before comparing anyway, so that the dimensions match up). So to me this seems like much more of a one-off kind of thing. I drafted the PR instead of continuing to work on it because it seemed so inelegant and I thought it would be nice to get feedback, so thanks for that. I'm -0.5 on more general ragged-epoch support though; I think this could be kept small in scope. Does that sound reasonable, or is it important to support more general cases?
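A toy check of the claim above (not code from this PR, and the data are made up): Welch PSD estimates of the same stationary process are comparable across trial durations when `nperseg` is held fixed, but shorter trials give noisier estimates because fewer segments are averaged.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs, nperseg = 100.0, 256
short_trial = rng.standard_normal(int(5 * fs))   # 5 s "trial"
long_trial = rng.standard_normal(int(30 * fs))   # 30 s "trial"

f, psd_short = signal.welch(short_trial, fs=fs, nperseg=nperseg)
f, psd_long = signal.welch(long_trial, fs=fs, nperseg=nperseg)

# both estimate the same flat spectrum (~2/fs for unit-variance noise) ...
assert np.isclose(psd_short.mean(), psd_long.mean(), rtol=0.5)
# ... but the short trial's estimate fluctuates more around that level
assert psd_short.std() > psd_long.std()
```

Fixing `nperseg` keeps the frequency grid identical across trials, so the ragged durations only change how many Welch segments get averaged per trial.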
There are API/implementation issues with ragged epochs. Here you'll have at least some of those... plus a reimplementation of epoching itself, which will bring its own set of issues. So I'm not sure in the end whether or not the current smaller-scope idea will have a more favorable pain-of-maintenance-to-added-functionality ratio :)
That's a good point. I agree that re-epoching is a huge pain. Perhaps all of it can be kept at a very high level and each event can be temporarily pulled from the …
Basically you will end up needing to reimplement the logic and operations of …
I think if we're going to support any analyses of variable-length trials/epochs in module code, then we should do so in a way that addresses the general case.
Anecdotally: #3533 describes a use case that I've personally needed, and off the top of my head I can recall at least two separate questions about it on the forum (plus @jona-sassenhagen used to do such analyses IIRC). Namely, stimuli are (variable-length) spoken sentences, and the analysis target is a keyword that occurs (at varying latency) somewhere other than at sentence onset, but the baseline period is pre-sentence-onset. It's not an unusual design in the language-processing world. Even if the approach in #3533 doesn't get you all the way to doing statistics on that data (the temporal alignment won't be right), it lets you do epoching/baselining and apply the inverse, so you can visualize your data much more easily for exploration / sanity checks.
Ok, that all seems very convincing for general variable-length epoch support. I can give @larsoner's suggestion about masked arrays a go; it doesn't seem too daunting.
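A minimal sketch of the masked-array idea (hypothetical data, not the eventual MNE API): pad ragged epochs out to a rectangle and mask the padding, so NumPy's mask-aware reductions ignore it automatically.

```python
import numpy as np

# three "epochs" of unequal length (stand-ins for single-channel data)
epochs_list = [np.arange(n, dtype=float) for n in (5, 3, 4)]
n_times = max(len(e) for e in epochs_list)

data = np.zeros((len(epochs_list), n_times))
mask = np.ones_like(data, dtype=bool)      # True = "no sample here"
for i, e in enumerate(epochs_list):
    data[i, : len(e)] = e
    mask[i, : len(e)] = False

ragged = np.ma.MaskedArray(data, mask=mask)
# per-epoch means ignore the padding despite the rectangular storage
assert np.allclose(ragged.mean(axis=1).compressed(), [2.0, 1.0, 1.5])
```

The appeal is that the data stay a single rectangular array (so shape-based bookkeeping keeps working) while downstream code that respects the mask gets per-epoch lengths for free.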
I'm going to wait until #12311 is merged to attempt this, because I think the natural constructor would be to use the annotations with …
Hmmm, so it looks like the upstream libraries (e.g. …) … So I think these are the things that will have to be checked to work:
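A quick probe of the kind of upstream behavior that needs checking (made-up data): NumPy's own masked routines honor the mask, but SciPy coerces its input with `np.asarray`, which silently drops the mask and operates on whatever padding lies underneath.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
data = rng.standard_normal(1000)
data[800:] = 1e6                           # garbage past the "epoch" end
epoch = np.ma.array(data, mask=np.arange(1000) >= 800)

# numpy's masked reduction excludes the garbage ...
assert abs(epoch.mean()) < 1.0

# ... but scipy.signal.welch drops the mask and leaks it into the PSD
f, psd_bad = signal.welch(epoch, fs=100.0, nperseg=256)
f, psd_ok = signal.welch(data[:800], fs=100.0, nperseg=256)
assert psd_bad.mean() > 1000 * psd_ok.mean()
```

So any masked-array-backed ragged epochs would need to unmask (e.g. slice out the valid portion) before handing data to SciPy, rather than trusting the mask to propagate.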
Yes, it works numerically, but I would question how much it works statistically. The noise level then becomes time-dependent. Your `nave` becomes a function of time, so the N= you see in plots is misleading. Also see how `nave` is used to scale the noise covariance in the inverse problem. Basically, I would suggest thinking beyond an implementation issue here.
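A toy illustration of this point (made-up numbers, not MNE code): with ragged epochs the number of trials contributing to each time sample varies, so `nave` is a function of time and the noise level of the across-trial average is too.

```python
import numpy as np

rng = np.random.default_rng(42)
n_epochs, n_times = 100, 300
lengths = rng.integers(150, 300, size=n_epochs)  # each trial ends early
data = rng.standard_normal((n_epochs, n_times))
for i, n in enumerate(lengths):
    data[i, n:] = np.nan                         # past the trial's end

nave = np.sum(~np.isnan(data), axis=0)  # trials contributing per sample
assert nave[0] == n_epochs              # every trial covers sample 0
assert nave[200] < n_epochs             # far fewer cover sample 200
# the std of the average scales like 1/sqrt(nave(t)), so a single N=
# in a plot (or one nave scaling the noise covariance) is misleading
```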
Hmm, good point; that's probably better suited to a discussion than a reply thread.
It would be nice to be able to compute a power spectral density across epochs of different lengths using the `duration` in the `annotations` attribute of a `raw` object. Here is one solution trying to reuse as much code as possible. However, this solution seems a bit inelegant (I didn't handle rejecting epochs by annotation properly with the `drop_log`, and I took the whole raw data array when it would be more memory-efficient to take just the slice of interest if `tmin` and/or `tmax` are provided, again because the solution was inelegant and that would have made it even clunkier). Thoughts? Is this the best way to accomplish this, such that I should just rework it a bit more, or would a separate function (e.g. `raw.compute_ragged_epochs_spectrum`) be better?
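The idea above can be sketched with plain NumPy/SciPy (this is a hypothetical illustration, not the PR's implementation, and the annotation values are invented): slice variable-duration spans out of a continuous recording and compute each span's Welch PSD on a common frequency grid by fixing `nperseg`, so longer annotations just average more segments.

```python
import numpy as np
from scipy import signal

fs, nperseg = 200.0, 256
rng = np.random.default_rng(0)
raw_data = rng.standard_normal(int(30 * fs))  # fake single-channel "raw"

# hypothetical annotations: (onset_sec, duration_sec), unequal durations
annotations = [(1.0, 4.0), (8.0, 7.5), (20.0, 5.25)]

psds = []
for onset, duration in annotations:
    start, stop = int(onset * fs), int((onset + duration) * fs)
    f, psd = signal.welch(raw_data[start:stop], fs=fs, nperseg=nperseg)
    psds.append(psd)

psds = np.array(psds)  # same shape per epoch despite ragged durations
assert psds.shape == (len(annotations), nperseg // 2 + 1)
```

Because every per-annotation PSD lands on the same frequency grid, the results stack into an ordinary rectangular array and the raggedness never reaches downstream code.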