
MRG, ENH: Add NIRSport support #9348

Merged: 34 commits merged into mne-tools:main from addNIRSport on May 10, 2021

Conversation

@rob-luke (Member) commented Apr 24, 2021

I am trying to determine how to use the GitHub magic to do some work on #7936. I tried pushing to uge-lescot#1 but made a mess of it. Now I am trying this.

OK, this hasn't added to the existing PR; instead it has opened a new one. I think this is a bad idea, because I would like the original authors to get the internet points associated with merging the code.

How can I work on Swy7ch's PR #7936? I just want to get all the tests green for them.

David JULIEN and others added 8 commits December 9, 2020 15:29
When the probes get saturated (i.e. when the value they return is "too high"), NIRStar replaces the value with 'NaN' in the standard .wlX file. The true value is kept in a .nosatflags_wlX file, which is a copy of the .wlX file with the true probed values.

It is unclear how NIRStar decides that a probe got saturated, so we added a flag to `read_raw_nirx()` and `RawNIRX.__init__` to let the user choose which file to use. Providing two *.wl1 or two *.wl2 files still raises an error, as we check for the existence and uniqueness of the corresponding *.nosatflags_wlX file.

Note that the directory structure check doesn't yet discriminate between *.wlX and *.nosatflags_wlX (as it checks for any file ending with 'wlX'). Changing the check could be done in another PR, to make sure the user did not provide only *.nosatflags_wlX files and maybe warn them.

Some devices register 'NaN' values when the probes saturate. Since 'NaN' values can cause unexpected behaviour, we added the ability to annotate them so that they can be filtered out during processing.
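(For readers following along, a minimal usage sketch of the behaviour described above; the argument name `saturated` and its accepted values, the call signature of `annotate_nan`, and the file path are assumptions based on this discussion, not a definitive API reference.)

# Minimal sketch, assuming the flag is called `saturated` and that
# mne.preprocessing.annotate_nan returns an Annotations object; the
# recording path is a placeholder.
import mne

raw = mne.io.read_raw_nirx(
    "nirx_15_3_recording_w_occasional_saturation",  # placeholder path
    saturated="nan",  # assumed value: keep the NaNs from the standard .wlX files
    preload=True,
)

# Turn stretches of NaN samples into annotations so they can be excluded
# downstream instead of propagating through filtering, etc.
nan_annotations = mne.preprocessing.annotate_nan(raw)
raw.set_annotations(raw.annotations + nan_annotations)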
@rob-luke (Member, Author) commented Apr 24, 2021

@rderollepot thanks for your support with this PR (and I apologise for jumping between branches; hopefully someone better at git can merge this back into your PR). I'm going to try to explain my understanding below and see if we can debug my issue. I will explain things you are already aware of, so that others (@larsoner) can also follow along.

I have added some tests for the files which you contributed and diligently documented in mne-tools/mne-testing-data#78

However, my tests are failing. When I read the data, it appears not to contain any NaNs in the channel combinations specified in the montage. I have confirmed that your code does read the correct wave file based on the saturation argument 👍, and I have confirmed that NaNs exist in the wl1 file 👍 and not in the nosatflag file 👍.

The data on disk (specifically the files nirx_15_3_recording_w_full_saturation and nirx_15_3_recording_w_occasional_saturation) contains the measurements between all source and detector pairs. However, only certain pairs are valid, and these are specified in your montage. When I load the raw wl1 data file with all combinations of channels it has NaNs 👍, but when the code selects/subsets just the channels (optode pairings of interest) in your montage, all those channels are NaN-free 👎. Based on your great notes in mne-tools/mne-testing-data#78 I expected that at least the data for detector 2 would contain NaNs. Is this your understanding too?

Is it possible that in the test data the channels in your montage never saturated? I don't know the software/hardware you used, so I'm not sure if a big red light flashes or if you just have to guess. I'd really like to ensure there is a test that returns NaNs so we can ensure the saturation flag works. To make things move quicker, would you like to email me another file (it doesn't have to be a test file like before, just anything where you are certain it saturates) so I can confirm the behaviour in this PR? I'm totally open to other suggestions too. Or let me know if I misunderstand the data; I haven't used this device myself.
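(For reference, the kind of check described above can be sketched roughly as follows; the file name and the montage column indices are illustrative, and the .wlX files are assumed to be plain whitespace-separated numeric text.)

# Illustrative only: which columns contain NaNs, and do any of them belong
# to the montage channels? The column indices here are made up.
import numpy as np

wl1 = np.loadtxt("recording.wl1")   # samples x all source-detector combinations
montage_cols = [0, 3, 7]            # hypothetical columns for the valid S-D pairs

print("NaNs anywhere in the file:", np.isnan(wl1).any())
print("NaNs within montage channels:", np.isnan(wl1[:, montage_cols]).any())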

(Review threads on mne/preprocessing/annotate_nan.py and mne/io/nirx/nirx.py; outdated, resolved.)
@larsoner (Member) commented:

@rob-luke it looks like we're close (today or tomorrow maybe) to pushing out 0.23. Do you think we should wait for this, or merge this early in the 0.24 cycle?

@rob-luke (Member, Author) commented:

Your suggestions are small, so that wouldn't take long. But what I am unsure about is how to test all these options for dealing with saturated sensors. Currently the test data doesn't let us exercise the different options, which makes me nervous.

However, I just got off a phone call with NIRx and they said that the sensors very rarely saturate; they might go a year without seeing saturation. But is that your experience @rderollepot? Or do you see this more often? I will email NIRx to ask about this specifically.

I'd love to get this in for the release, but unless we find some way to test it in the coming days, I guess we need to push this to 0.24. I'm not sure 😰

@rob-luke (Member, Author) commented:

> However, I just got off a phone call with NIRx and they said that the sensors very rarely saturate; they might go a year without seeing saturation. But is that your experience @rderollepot? Or do you see this more often? I will email NIRx

They just responded. Apparently saturation was more common on the NIRSport 1, so we should make sure we handle it correctly.

@rderollepot (Contributor) commented:

First of all, thanks a lot @rob-luke and @larsoner for your work on this PR!

> However, my tests are failing. When I read the data, it appears not to contain any NaNs in the channel combinations specified in the montage. I have confirmed that your code does read the correct wave file based on the saturation argument 👍, and I have confirmed that NaNs exist in the wl1 file 👍 and not in the nosatflag file 👍.
>
> The data on disk (specifically the files nirx_15_3_recording_w_full_saturation and nirx_15_3_recording_w_occasional_saturation) contains the measurements between all source and detector pairs. However, only certain pairs are valid, and these are specified in your montage. When I load the raw wl1 data file with all combinations of channels it has NaNs 👍, but when the code selects/subsets just the channels (optode pairings of interest) in your montage, all those channels are NaN-free 👎. Based on your great notes in mne-tools/mne-testing-data#78 I expected that at least the data for detector 2 would contain NaNs. Is this your understanding too?

Yes, that is what I would have expected too 👍

> Is it possible that in the test data the channels in your montage never saturated?

I thought it did, but while reading your very clear analysis I realized there was a mistake in the process I followed to record those test datasets. To verify that I could trigger some saturation (which I was unsure of), I just checked for the presence of NaNs in the .wl1/.wl2 files as you did, but I didn't verify that those channels were indeed in my montage, as you pointed out, so I did not ensure that those saturations were actually problematic. Sorry about that; I'll plan to record new test datasets if those cannot be used.

> To make things move quicker, would you like to email me another file (it doesn't have to be a test file like before, just anything where you are certain it saturates) so I can confirm the behaviour in this PR?

I definitely have experimental recordings with saturated channels; I'll look for the cleanest one I can find and email it to you quickly.

> However, I just got off a phone call with NIRx and they said that the sensors very rarely saturate; they might go a year without seeing saturation. But is that your experience @rderollepot? Or do you see this more often? I will email NIRx to ask about this specifically.

We have two NIRx fNIRS devices in the lab: the original NIRSport and a NIRScout. We rarely end up with saturation (if ever) using the NIRScout, but we often have some using the NIRSport v1, which confirms the answer you just got from NIRx.

@rob-luke (Member, Author) commented:

> Yes, that is what I would have expected too

Great, I'm glad we all have the same understanding.

> I'll plan to record new test datasets if those cannot be used

If possible, this would be great. But I am aware that getting time in the lab can be tough, so let me know if it is not possible. Perhaps I'm being too pedantic with tests?

We would only need to replace one measurement. Then we would have: 1) a recording with no saturation (already have); 2) a recording with saturation, but not on montage channels (already have); 3) a recording with saturation in a channel of interest (new measurement; it could just overwrite one that you previously committed). Recording 3 would allow us to test the new code features.

Thoughts?

@larsoner (Member) commented:

Instead of recording new data, could you modify the existing montage definition (it could be done on the fly, in memory, with monkeypatch for example) to include a saturated source-detector pair, so that the NaNs get read from the wl1/wl2 files?
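(A rough sketch of that testing idea follows; the patched helper `_read_montage_pairs` is hypothetical and only stands in for wherever the NIRX reader builds its source-detector pairs, and the `saturated` value and file path are assumptions.)

# Hypothetical pytest sketch: monkeypatch the montage so it includes a
# saturated S-D pair, then check that the saturation shows up as annotations.
# `_read_montage_pairs` does not exist in MNE; it is a stand-in name.
import mne
import mne.io.nirx.nirx as nirx_mod

def test_saturated_pair_is_flagged(monkeypatch):
    real_pairs = nirx_mod._read_montage_pairs  # hypothetical helper

    def fake_pairs(*args, **kwargs):
        pairs = real_pairs(*args, **kwargs)
        return list(pairs) + [(5, 2)]  # pretend S5-D2 (known saturated) is in the montage

    monkeypatch.setattr(nirx_mod, "_read_montage_pairs", fake_pairs)
    raw = mne.io.read_raw_nirx(
        "nirx_15_3_recording_w_occasional_saturation",  # placeholder path
        saturated="annotate",  # assumed flag value
    )
    assert any(desc.startswith("BAD") for desc in raw.annotations.description)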

@rob-luke (Member, Author) commented:

Both @rderollepot and NIRx have sent me data to test with locally. If these larger files operate correctly locally then I'd be more confident with a testing hack as you describe. I'll report back here in a week on my progress.

@larsoner (Member) commented May 3, 2021

I took a stab at it @rob-luke, see if it makes sense to you

dur -= on
onset.extend(on)
duration.extend(dur)
description.extend(['BAD_NAN'] * len(on))
@rob-luke (Member, Author) commented on this diff:

Would this be more specific? It might be helpful if there are other causes of NaN in other data types (M/EEG, etc.).

Suggested change:
- description.extend(['BAD_NAN'] * len(on))
+ description.extend(['BAD_SATURATION'] * len(on))
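(For context, the snippet under review above is the tail of a loop that turns runs of NaN samples into annotations. A simplified, self-contained sketch of that idea, not the code in mne/preprocessing/annotate_nan.py, would be:)

# Simplified sketch: mark every contiguous run of NaNs in a 1-D signal.
import numpy as np
from mne import Annotations

def annotate_nan_sketch(data, sfreq, description="BAD_NAN"):
    nan_mask = np.isnan(data)
    edges = np.diff(nan_mask.astype(int), prepend=0, append=0)
    onset = np.where(edges == 1)[0] / sfreq      # run starts, in seconds
    offset = np.where(edges == -1)[0] / sfreq    # run ends, in seconds
    duration = offset - onset
    return Annotations(onset=onset, duration=duration,
                       description=[description] * len(onset))

# Example: a 10 Hz signal whose samples 3-5 are saturated (NaN).
sig = np.ones(20)
sig[3:6] = np.nan
print(annotate_nan_sketch(sig, sfreq=10.0))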

@rob-luke (Member, Author) commented May 3, 2021

Code looks good. I am just testing the behaviour locally against some additional private files that rderollepot kindly shared.

@rob-luke (Member, Author) commented May 3, 2021

Behaviour looks good to me locally. Here is an example

[screenshot omitted]

Thanks @larsoner for doing the heavy lifting.

@rob-luke (Member, Author) commented May 3, 2021

I would approve, but you can't approve your own pull requests 😄

@larsoner (Member) commented May 4, 2021

I also realized this is a perfect use case for channel-specific annotations. I'll add that next, I think.

@larsoner (Member) commented May 4, 2021

Okay, I made them channel-specific. If we know that channels are always saturated in source-detector pairs like this (or that we should always annotate both wavelengths as saturated if one of them is):

(Pdb) p raw.annotations.ch_names
array([('S5_D2 760',), ('S5_D2 850',), ('S5_D2 760',), ('S5_D2 850',), (),
       ('S5_D2 760',), ('S5_D2 850',), ()], dtype=object)

Then I could simplify the code a bit to produce half the number of channel-specific annotations. Currently it makes one annotation for S5_D2 760, and a separate annotation for S5_D2 850, but if these are always paired, we could make a single annotation for the time window that lists S5_D2 760 and S5_D2 850 together.
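(A minimal sketch of that merging idea, assuming the Annotations object holds only the saturation annotations and that paired 760/850 entries share identical onsets and durations; this is illustrative, not the code that was merged.)

# Collapse per-wavelength annotations with the same onset/duration into one
# annotation listing both channels of the S-D pair. Illustrative only.
from collections import defaultdict
from mne import Annotations

def merge_paired_annotations(annot, description="BAD_SATURATED"):
    grouped = defaultdict(set)
    for onset, duration, ch_names in zip(annot.onset, annot.duration, annot.ch_names):
        grouped[(float(onset), float(duration))].update(ch_names)
    keys = sorted(grouped)
    return Annotations(onset=[k[0] for k in keys],
                       duration=[k[1] for k in keys],
                       description=[description] * len(keys),
                       ch_names=[sorted(grouped[k]) for k in keys])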

@rderollepot (Contributor) commented:

Great!

I do not know the 'hardware-based' answer to that question, but I made a simple script to find out where the NaNs were located in both the wl1 and wl2 matrices, and it turns out that, in the 47 datasets I have that saturated, wl1 and wl2 always share the exact same 'NaN mask'. That is not enough to guarantee that they are always saturated in source-detector pairs, though.
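(Such a check could look roughly like this; the file names are placeholders and the .wlX files are assumed to be whitespace-separated numeric text, so this is not @rderollepot's actual script.)

# Do the .wl1 and .wl2 matrices share exactly the same NaN mask?
import numpy as np

wl1 = np.loadtxt("recording.wl1")
wl2 = np.loadtxt("recording.wl2")
print("same NaN mask:", np.array_equal(np.isnan(wl1), np.isnan(wl2)))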

@larsoner (Member) commented May 4, 2021

My vote is to make it so that if either is saturated, they both get annotated as saturated. It seems to be the case in 47/47 of your datasets, so in practice it might always hold anyway, and even if it isn't the case for some files, we will probably end up ignoring that S-D pair downstream anyway. WDYT?

@rob-luke (Member, Author) commented May 4, 2021

> My vote is to make it so that if either is saturated, they both get annotated as saturated.

Agreed.

> we probably will end up ignoring that S-D pair downstream anyway

Correct. The current behaviour is to mirror the behaviour within S-D pairs. So if one channel is marked as bad, then so is the other. See...

def _fnirs_spread_bads(raw):
"""Spread bad labeling across fnirs channels."""
# For an optode if any component (light frequency or chroma) is marked
# as bad, then they all should be. This function will find any pairs marked
# as bad and spread the bad marking to all components of the optode pair.
picks = _picks_to_idx(raw.info, 'fnirs', exclude=[])
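(The helper above is truncated; a simplified sketch of the spreading idea, not the actual body of _fnirs_spread_bads, is shown below. Channel names of the form "S5_D2 760" are taken from the earlier pdb output.)

# Simplified sketch: if any channel of an S-D pair is bad, mark all channels
# sharing that pair as bad. Not MNE's implementation.
def spread_bads_sketch(raw):
    bads = set(raw.info["bads"])
    for ch_name in raw.ch_names:
        pair = ch_name.split(" ")[0]  # "S5_D2 760" -> "S5_D2"
        if any(bad.startswith(pair + " ") for bad in bads):
            bads.add(ch_name)
    raw.info["bads"] = sorted(bads)
    return raw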

@rob-luke (Member, Author) commented May 4, 2021

Actually I need to add more internal calls to those functions. I'll do that in another PR.

@rderollepot (Contributor) commented:

> My vote is to make it so that if either is saturated, they both get annotated as saturated.

I also agree with that 👍

@larsoner changed the title from "WIP: Add NIRSport support" to "MRG, ENH: Add NIRSport support" on May 6, 2021
Comment on lines +393 to +395
.. _David Julien: https://github.com/Swy7ch

.. _Romain Derollepot: https://github.com/rderollepot
A member commented:

@swy7ch @rderollepot let me know if you want different URLs here

A contributor replied:

That is great for me, thanks 👍

I also want to take this opportunity to say that I am really grateful for your help, @larsoner and @rob-luke, in seeing that the great contribution @swy7ch started almost a year ago can be brought to completion, improving it with new features along the way. We are still very new to MNE in the lab (it started with David's internship), and I am not sure I would have been able to finish it myself (or it would have taken me ages 😅).

* upstream/main:
  MRG, ENH: Make _get_hpi_info public (mne-tools#9369)
  ENH: Add a playback button to the notebook 3d backend (mne-tools#8741)
  better docs for permutation_cluster_test (mne-tools#9365)
  MRG: Add fNIRS to html output (mne-tools#9367)
  When plotting GFP comparison in Report, don't show sensor layout by default (mne-tools#9366)
  DOC: Update Mayavi troubleshooting section (mne-tools#9362)
  more tutorial tweaks (mne-tools#9359)
@larsoner merged commit 2d1e3df into mne-tools:main on May 10, 2021
@rob-luke (Member, Author) commented:

🥳

larsoner added a commit to agramfort/mne-python that referenced this pull request May 10, 2021
* upstream/main:
  FIX: make epoch cropping idempotent (mne-tools#9378)
  MRG, ENH: Add NIRSport support (mne-tools#9348)
  MRG, ENH: Make _get_hpi_info public (mne-tools#9369)
  ENH: Add a playback button to the notebook 3d backend (mne-tools#8741)
  better docs for permutation_cluster_test (mne-tools#9365)
  MRG: Add fNIRS to html output (mne-tools#9367)
  When plotting GFP comparison in Report, don't show sensor layout by default (mne-tools#9366)
  DOC: Update Mayavi troubleshooting section (mne-tools#9362)
  more tutorial tweaks (mne-tools#9359)
  MRG, MAINT: Use native GitHub Actions skip (mne-tools#9361)
  MAINT: Clean up crufty code [circle front] (mne-tools#9358)
  API: Complete deprecations (mne-tools#9356)
  Add qdarkstyle, darkdetect to environment.yml [circle full] (mne-tools#9357)
  FIX: Fix
  FIX: Add
@rob-luke deleted the addNIRSport branch on April 8, 2022 05:26