Simplify and Better Document Getting Own Datasets into Braindecode #544
Labels: documentation (Improvements or additions to documentation), enhancement (New feature or request), intermediate (Intermediate Difficulty)
I think we should make it easy to understand how to get your own data into a Braindecode dataset (also in cases where you don't want to use skorch, but would still like to use a Braindecode dataset).
I see two basic scenarios:
Atm we have https://braindecode.org/dev/auto_examples/datasets_io/plot_custom_dataset_example.html#sphx-glr-auto-examples-datasets-io-plot-custom-dataset-example-py This is already not so bad. I suggest renaming the example to mention X and y numpy arrays in the title, to make it clearer what it is about. The example could also be simplified: instead of loading some MNE dataset and extracting X, y etc. from it, just create fake X and y. That would make it much shorter and easier to understand, especially since we are not doing any training in that example anyway.
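A minimal sketch of the fake-X/y idea. The array shapes follow the usual (n_trials, n_channels, n_times) convention; the `create_from_X_y` call is shown commented out because its exact import path and signature may differ across Braindecode versions:

```python
import numpy as np

# Fake data instead of loading an MNE dataset and extracting X, y:
# 40 trials, 8 EEG channels, 500 time samples per trial
n_trials, n_channels, n_times = 40, 8, 500
rng = np.random.default_rng(seed=0)
X = rng.standard_normal((n_trials, n_channels, n_times)).astype("float32")
y = rng.integers(0, 2, size=n_trials)  # fake binary labels

# Hypothetical Braindecode call -- name/signature may vary per version:
# from braindecode.datautil import create_from_X_y
# windows_dataset = create_from_X_y(
#     X, y, drop_last_window=False, sfreq=250.0,
#     ch_names=[f"ch{i}" for i in range(n_channels)],
# )
```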
We may also want to allow y there to have a temporal dimension for segmentation-like tasks. For this, one would need to make code changes, I guess.
Additionally, we may need to distinguish between the case where you have a raw (continuous) recording and the case where you already have pre-cut X and y, e.g. see #148
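To illustrate the raw-vs-precut distinction: with a continuous recording you still need a windowing step before you have trial-like X. A pure-Python sketch of fixed-size windowing (Braindecode's own windowers are more general; `cut_windows` here is just an illustrative helper, not an existing function):

```python
def cut_windows(recording, window_size, stride):
    """Cut a continuous recording of shape (n_channels, n_times),
    given as a list of per-channel sample lists, into fixed-size windows."""
    n_times = len(recording[0])
    windows = []
    start = 0
    while start + window_size <= n_times:
        windows.append([ch[start:start + window_size] for ch in recording])
        start += stride
    return windows

# Two channels, ten samples each
raw = [list(range(10)), list(range(10, 20))]
wins = cut_windows(raw, window_size=4, stride=2)
# Window starts at 0, 2, 4, 6 -> four overlapping windows
```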
Here atm we have https://braindecode.org/dev/auto_examples/datasets_io/plot_mne_dataset_example.html#sphx-glr-auto-examples-datasets-io-plot-mne-dataset-example-py
Maybe this one could also be renamed to signal that this is Braindecode's general-purpose way of getting any data into a Braindecode dataset.
Here, regarding the types of datasets, we should ensure we have a simple API and show how to use it for:
And we should probably cover both cases: either one is willing to load all the data into memory, or one already has (or is willing to create) MNE objects for other files on disk (so `preload=True` or `False`). I also had some colab showing how to get data into MNE, but it might be better to link to some appropriate MNE doc, if it exists?
https://colab.research.google.com/drive/1B-5K7dNyfyu-UIVFp3A1BvgQGkZmLWrg#scrollTo=YbFKRInJCGYw