How to handle datasets with invalid info[meas_id][secs]? #7803
The FIF format in particular has a limit on how large a span of dates it can write, because it writes out seconds in […]. As to how to fix it: you can set it to zero and things will work (unless you have saved separate annotations you want to add), but be careful if you ever want to do anything involving dates across multiple subjects or runs. Typically during anonymization you shift all subjects and runs by some fixed amount so that their relative timings stay fixed. Wiping out the […]
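The limit mentioned above can be sketched as a simple range check. This assumes (as a hedge, not a statement of the FIF spec) that timestamps are written as seconds in a signed 32-bit integer; the helper name is hypothetical:

```python
# Assumption: FIF writes timestamp seconds as a signed 32-bit integer,
# so POSIX seconds outside [-2**31, 2**31 - 1] cannot be stored.
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def fits_in_fif(secs: int) -> bool:
    """Return True if `secs` is representable as a signed 32-bit int."""
    return INT32_MIN <= secs <= INT32_MAX

print(fits_in_fif(5364633480))  # the value from this issue -> False
print(fits_in_fif(0))           # zeroing the date works -> True
```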
For the record, the file here comes from a non-BIDS-valid dataset; we made sure that dates for BIDS MEG are compatible with FIF.
I would check what the date is. 5364633480 seconds is about 170 years, so my guess is that this data has been anonymized using some method that makes that value not meaningful.
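The "about 170 years" estimate checks out with a quick back-of-the-envelope calculation (using a Julian year of 365.25 days):

```python
# Convert the suspicious timestamp from seconds to years.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year in seconds
secs = 5364633480
years = secs / SECONDS_PER_YEAR
print(round(years, 2))  # -> 169.99, i.e. roughly 170 years
```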
If you want to be extra cautious, preserving as much information as you can in case it is relevant, you could use `raw.anonymize()`, which should time-shift everything so that `meas_date` is in range while preserving the timedeltas between `meas_date` and the other dates in the file.
https://mne.tools/stable/generated/mne.io.Raw.html#mne.io.Raw.anonymize
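The key property of this kind of anonymizing time shift can be illustrated with a toy example (plain `datetime` arithmetic, not MNE code; the dates and the size of the shift are made up): every date moves by the same fixed offset, so relative timings are preserved.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical recording dates: the out-of-range meas_date from this
# issue, plus a second recording a week later.
meas_date = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=5364633480)
followup = meas_date + timedelta(days=7)

# Apply one fixed shift to all dates (the value here is arbitrary).
shift = timedelta(days=45000)
shifted_meas = meas_date - shift
shifted_followup = followup - shift

# The timedelta between the recordings is unchanged.
print(shifted_followup - shifted_meas == followup - meas_date)  # -> True
```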
It does pass validation with the BIDS validator though. We should probably file a bug report.
The BIDS validator cannot read MEG files, only the file names, so it cannot detect these issues.
Wait, so you're saying there's BIDS-relevant metadata stored in a file format that the BIDS validator cannot read? Shouldn't this be stored in a sidecar file, like the events?
This raises an interesting question: what is the expectation if BIDS sidecar information differs from what is stored in the underlying imaging data headers?
I believe the sidecar-based values always take precedence.
> I believe the sidecar-based values always take precedence.

+1
Same issue here with the Temple University TUAR dataset. Ended up just dropping the meas_date.
I'm working with the ds000246 OpenNeuro dataset. Reading the data works as expected, but writing throws an exception.

How should I best deal with data like this? Can I simply set `info['meas_id']['secs']` to an arbitrary (valid) value? It also seems a little odd that I can create (and work with) the data by reading it, but then cannot write it back to disk…