incorrect task parameter value in session metadata for TRAINING_0 and TRAINING_1 #2717

Open
matchings opened this issue Aug 19, 2023 · 1 comment
matchings commented Aug 19, 2023

Describe the bug
The blank_duration_sec value in the task_parameters attribute of the BehaviorSession objects appears to be incorrect for the TRAINING_0 and TRAINING_1 sessions.

The blank_duration_sec is listed as [0.5, 0.5]; however, there are no blanks in either TRAINING_0 or TRAINING_1.

In addition, the value of n_stimulus_frames appears to be wildly off for the TRAINING_1 session shown below.

To Reproduce

[screenshots: task_parameters output for a TRAINING_0 and a TRAINING_1 session]

Typical n_stimulus_frames values for different session types:

[screenshot: table of typical n_stimulus_frames values by session type]
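
In place of the screenshots, here is a minimal sketch of the reproduction steps, assuming the Visual Behavior Ophys S3 project cache; the cache_dir path is a placeholder, and 872953842 is the TRAINING_1 session discussed below.

```python
# Minimal sketch, assuming the Visual Behavior Ophys S3 cache; the cache_dir
# is a placeholder and 872953842 is the TRAINING_1 session discussed below.
from allensdk.brain_observatory.behavior.behavior_project_cache import (
    VisualBehaviorOphysProjectCache,
)

cache = VisualBehaviorOphysProjectCache.from_s3_cache(cache_dir="/path/to/cache")
session = cache.get_behavior_session(behavior_session_id=872953842)

params = session.task_parameters
print(params["blank_duration_sec"])  # reported as [0.5, 0.5]
print(params["n_stimulus_frames"])   # reported as 216,128 for this session
```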

Expected behavior
The value of blank_duration_sec should be set to NaN for these sessions to indicate that the stimuli are presented continuously with no blank gray screen in between.

The value of n_stimulus_frames should be around 70,000 frames for all sessions except for TRAINING_0, which is only 15 minutes long.

Actual Behavior
The blank_duration_sec is listed as [0.5, 0.5]; however, there are no blanks in either TRAINING_0 or TRAINING_1.

The value of n_stimulus_frames for behavior_session_id = 872953842 is 216,128. This must be incorrect.

Environment (please complete the following information):

  • AllenSDK version 2.15.2

Additional context
This would need to be fixed in all TRAINING_0 and TRAINING_1 behavior sessions and ophys / ecephys sessions for both VBO and VBN.

Given the imminent re-release of both datasets, we should discuss whether this is of sufficient priority to incorporate. @corbennett what are your thoughts?

From my perspective, the blank_duration_sec error is something we can explain in documentation. However, I am not sure whether the n_stimulus_frames error is indicative of a deeper problem or just a one-off issue.

matchings added the bug label Aug 19, 2023
morriscb (Contributor) commented

Hey @matchings, apologies for not getting to this issue sooner.

Having looked at the data/code that produces n_stimulus_frames, I'm not sure the value of 216,128 (and the similarly large values for all TRAINING_1 sessions) is incorrect. As you mention above, TRAINING_0 and TRAINING_1 stimuli are presented continuously. The data used to calculate n_stimulus_frames appears to be a boolean array indicating, frame by frame, whether the stimulus is on or off. Because these sessions are continuous with no gaps, as you describe, these arrays are mostly full of 1s, so the sum is basically equal to the length of the stimulus block. Indeed, if you look at all TRAINING_0 data, you'll see that n_stimulus_frames is always around 50k frames, since the "draw_log" for those sessions is mostly full of 1s (i.e., a continuously drawn stimulus). As TRAINING_1 is also continuous and is roughly 216k frames long in total, the value you are seeing says that a gratings stimulus is being drawn for almost the entire duration of the stimulus block. Hopefully this makes sense.
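
As a rough paraphrase of that calculation (not the exact AllenSDK code; the pickle layout and file path below are assumptions), n_stimulus_frames boils down to summing the boolean draw_log of each stimulus:

```python
import pickle

# Paraphrase of the calculation described above (not the exact AllenSDK code).
# The pickle layout (items -> behavior -> stimuli -> <name> -> draw_log) is an
# assumption here, and the file path is a placeholder.
with open("training_1_stimulus.pkl", "rb") as f:
    data = pickle.load(f, encoding="latin1")

stimuli = data["items"]["behavior"]["stimuli"]
n_stimulus_frames = sum(sum(stim.get("draw_log", [])) for stim in stimuli.values())

# For a continuously drawn TRAINING_1 gratings stimulus the draw_log is almost
# all 1s, so this sum approaches the total frame count (~216k); the reported
# value is large rather than wrong.
print(n_stimulus_frames)
```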

As for "blank_duration_sec", if you are fine with addressing it in documentation, then there doesn't seem to be a need to regenerate the NWBs? For reference, this value is calculated here:

blank_duration_sec = [float(x) for x in doc['blank_duration_range']]
I have confirmed that 0.5 is indeed the value stored in the stimulus file for an example TRAINING_0 and an example TRAINING_1 session, so the stimulus file data appears not to have been recorded properly for these two session types.
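
For anyone who wants to double-check a raw stimulus file directly, here is a sketch mirroring the line above; the config layout (items → behavior → config → DoC) is an assumption about the foraging2 pickle, and the file path is a placeholder.

```python
import pickle

# Quick check of the raw stimulus file, mirroring the blank_duration_sec line
# above. The config layout (items -> behavior -> config -> DoC) is an assumed
# foraging2 pickle structure; the file path is a placeholder.
with open("training_0_stimulus.pkl", "rb") as f:
    data = pickle.load(f, encoding="latin1")

doc = data["items"]["behavior"]["config"]["DoC"]
blank_duration_sec = [float(x) for x in doc["blank_duration_range"]]
print(blank_duration_sec)  # prints [0.5, 0.5] even though no blanks are shown
```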

Happy to discuss this further if needed.
