Adding Wiki Meta Information to the documentation (#317)
* Updating the doc to include the metainfo

* Updating the poetry hyperlink to install;

* Updating the p300 meta information;

* Updating the SSVEP meta information;

* complete whats new

* trigger doc generation for all PR

* add benchmark and set_log_level in API

* correct typo

* update poetry link

* add contributing to docs

* add global meta-information table for dataset

* change bloc name

Co-authored-by: Sylvain Chevallier <sylvain.chevallier@universite-paris-saclay.fr>
Co-authored-by: Sylvain Chevallier <sylain.chevallier@universite-paris-saclay.fr>
3 people committed Jan 3, 2023
1 parent 22cdcbb commit 6fb4795
Showing 29 changed files with 466 additions and 6 deletions.
2 changes: 0 additions & 2 deletions .github/workflows/docs.yml
@@ -5,8 +5,6 @@ on:
branches: [master, develop]
pull_request:
branches: [master, develop]
paths:
- "docs/**"

jobs:
build_docs:
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -57,7 +57,7 @@ pull request to the master branch referencing the specific issue you addressed.
## Setup development environment

1. install `poetry` (only once per machine):\
`curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -`\
`curl -sSL https://install.python-poetry.org | python3 -`\
or [checkout installation instruction](https://python-poetry.org/docs/#installation) or
use [conda forge version](https://anaconda.org/conda-forge/poetry)
1. (Optional, skip if not sure) Disable automatic environment creation:\
4 changes: 2 additions & 2 deletions README.md
@@ -31,7 +31,7 @@ one of the sections below, or just scroll down to find out more.
- [Installation](#installation)
- [Running](#running)
- [Supported datasets](#supported-datasets)
- [Who are we? n](#who-are-we)
- [Who are we?](#who-are-we)
- [Get in touch](#contact-us)
- [Documentation][link_moabb_docs]
- [Architecture and main concepts](#architecture-and-main-concepts)
@@ -87,7 +87,7 @@ See [Troubleshooting](#Troubleshooting) section if you have a problem.
You could fork or clone the repository and go to the downloaded directory, then run:

1. install `poetry` (only once per machine):\
`curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -`\
`curl -sSL https://install.python-poetry.org | python3 -`\
or [checkout installation instruction](https://python-poetry.org/docs/#installation) or
use [conda forge version](https://anaconda.org/conda-forge/poetry)
1. (Optional, skip if not sure) Disable automatic environment creation:\
102 changes: 102 additions & 0 deletions docs/source/CONTRIBUTING.md
@@ -0,0 +1,102 @@
# Contributing

Contributions are always welcome, no matter how small.

The following is a small set of guidelines for contributing to the project.

## Where to start

### Code of Conduct

This project adheres to the Contributor Covenant [Code of Conduct](CODE_OF_CONDUCT.md). By
participating, you are expected to uphold this code. Please report unacceptable
behavior to [hi@pushtheworld.us](mailto:hi@pushtheworld.us).

### Contributing on Github

If you're new to Git and want to learn how to fork this repo, make your own additions, and
include those additions in the master version of this project, check out this
[great tutorial](http://blog.davidecoppola.com/2016/11/howto-contribute-to-open-source-project-on-github/).

### Community

This project is maintained by the [NeuroTechX](https://www.neurotechx.com) community. Join the
[Gitter](https://gitter.im/moabb_dev/community), where discussions about MOABB take
place.

## How can I contribute?

If there's a feature you'd like to build, or you find a bug or have a suggestion for
improving the project, go ahead! Let us know on the
[Gitter](https://gitter.im/moabb_dev/community) or [open an issue](../../issues) so others
can follow along, and we'll support you as much as we can. When you're finished, submit a
pull request to the master branch referencing the specific issue you addressed.

### Steps to Contribute

1. Look for open issues or open one
1. Discuss the problem and/or propose a solution
1. Fork it! (and clone fork locally)
1. Branch from `develop`: `git checkout --track origin/develop`
1. [Setup development environment](#setup-development-environment)
1. Create your feature branch: `git checkout -b my-new-feature`
1. Make changes
1. Commit your changes: `git commit -m 'Add some feature'`
1. Don't forget to fix issues reported by the `pre-commit` pipeline (either stage the changes made by the hooks
   or fix them manually in the case of `flake8`)
1. Push to the branch: `git push origin my-new-feature`
1. Submit a pull request. Make sure it is based on the `develop` branch when submitting!
:D
1. Don't forget to update the
[what's new](http://moabb.neurotechx.com/docs/whats_new.html) and
[documentation](http://moabb.neurotechx.com/docs/index.html) pages if needed

## Setup development environment

1. install `poetry` (only once per machine):\
`curl -sSL https://install.python-poetry.org | python3 -`\
or [checkout installation instruction](https://python-poetry.org/docs/#installation) or
use [conda forge version](https://anaconda.org/conda-forge/poetry)
1. (Optional, skip if not sure) Disable automatic environment creation:\
`poetry config virtualenvs.create false`
1. install all dependencies in one command (has to be run in the project directory):\
`poetry install`
1. install `pre-commit` hooks to git repo:\
`pre-commit install`
1. you are ready to code!
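
Once `poetry install` succeeds, a quick way to confirm the environment is usable is to import the package. This is a minimal, illustrative check only, assuming `moabb` exposes a `__version__` attribute:

```python
# Minimal sanity check for the development environment.
# Run it inside the poetry environment, e.g. with `poetry run python`.
import moabb

print(moabb.__version__)
```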

_Note 1:_\
Your first commit will trigger `pre-commit` to download [Code Quality tools](#tools-used).
That's OK and it is intended behavior. This will be done once per machine automatically.

_Note 2:_\
By default `poetry` creates a separate Python virtual environment for every project ([more details in the documentation](https://python-poetry.org/docs/managing-environments/)).
If you use `conda` or manage environments by hand in any other way, you need to disable
`poetry` environment creation. In that case, also make sure the Python version in your
environment satisfies the requirements stated in `pyproject.toml`; once you disable
`poetry`'s environment management, keeping this consistent is up to you.

### Tools used

MOABB uses [poetry](https://python-poetry.org/) for dependency management. This tool
provides a reproducible environment on all popular operating systems (Linux, macOS and
Windows) and an easy publishing pipeline.

Another tool that makes development more stable is [pre-commit](https://pre-commit.com/).
It automatically runs a variety of Code Quality tools against the code you produce.

For Code Quality verification, we use:

- [black](https://github.com/psf/black) - Python code formatting
- [isort](https://github.com/timothycrosley/isort) - imports sorting and grouping
- [flake8](https://gitlab.com/pycqa/flake8) - code style checking
- [prettier](https://github.com/prettier/prettier) - `.yml` and `.md` files formatting

### Generate the documentation

To generate a local version of the documentation:

```
cd docs
make html
```
2 changes: 1 addition & 1 deletion docs/source/README.md
@@ -85,7 +85,7 @@ See [Troubleshooting](#Troubleshooting) section if you have a problem.
You could fork or clone the repository and go to the downloaded directory, then run:

1. install `poetry` (only once per machine):\
`curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -`\
`curl -sSL https://install.python-poetry.org | python3 -`\
or [checkout installation instruction](https://python-poetry.org/docs/#installation) or
use [conda forge version](https://anaconda.org/conda-forge/poetry)
1. (Optional, skip if not sure) Disable automatic environment creation:\
1 change: 1 addition & 0 deletions docs/source/api.rst
@@ -3,3 +3,4 @@
.. include:: paradigms.rst
.. include:: pipelines.rst
.. include:: analysis.rst
.. include:: utils.rst
1 change: 1 addition & 0 deletions docs/source/conf.py
@@ -232,6 +232,7 @@ def linkcode_resolve(domain, info): # noqa: C901
# an arbitrary url.
"navbar_links": [
("What's new", "whats_new"),
("Datasets", "dataset_summary"),
("API", "api"),
("Gallery", "auto_examples/index"),
("Tutorials", "auto_tutorials/index"),
56 changes: 56 additions & 0 deletions docs/source/dataset_summary.rst
@@ -0,0 +1,56 @@
MOABB gathers many datasets; the tables below summarize their most important characteristics.
Most of the datasets are listed here, but the list is not yet complete, so check the API for
the full documentation.

Do not hesitate to help us complete this list. It is also possible to add new datasets:
there is a tutorial explaining how to do so, and we warmly welcome any new contribution!

See also https://github.com/NeuroTechX/moabb/wiki/Datasets-Support for supplementary
details on datasets (class name, size, licence, etc.).


================= ======= ======= ========== ================= ============ =============== ===========
Motor Imagery #Subj #Chan #Classes #Trials / class Trials len Sampling rate #Sessions
================= ======= ======= ========== ================= ============ =============== ===========
AlexMI 8 16 3 20 3s 512Hz 1
BNCI2014001 10 22 4 144 4s 250Hz 2
BNCI2014002 15 15 2 80 5s 512Hz 1
BNCI2014004 10 3 2 360 4.5s 250Hz 5
BNCI2015001 13 13 2 200 5s 512Hz 2
BNCI2015004 10 30 5 80 7s 256Hz 2
Cho2017 53 64 2 100 3s 512Hz 1
Lee2019_MI 55 62 2 100 4s 1000Hz 2
MunichMI 10 128 2 150 7s 500Hz 1
Schirrmeister2017 14 128 4 120 4s 500Hz 1
Ofner2017 15 61 7 60 3s 512Hz 1
PhysionetMI 109 64 4 23 3s 160Hz 1
Shin2017A 29 30 2 30 10s 200Hz 3
Shin2017B 29 30 2 30 10s 200Hz 3
Weibo2014 10 60 7 80 4s 200Hz 1
Zhou2016 4 14 3 160 5s 250Hz 3
================= ======= ======= ========== ================= ============ =============== ===========


=========== ======= ======= ================= =============== =============== ===========
P300 #Subj #Chan #Trials / class Trials length Sampling rate #Sessions
=========== ======= ======= ================= =============== =============== ===========
BNCI2014008 8 8 3500 NT / 700 T 1s 256Hz 1
BNCI2014009 10 16 1440 NT / 288 T 0.8s 256Hz 3
BNCI2015003 10 8 1500 NT / 300 T 0.8s 256Hz 1
bi2013a 24 16 3200 NT / 640 T 1s 512Hz 8
EPFLP300 8 32 2753 NT / 551 T 1s 2048Hz 4
Lee2019_ERP 54 62 6900 NT / 1380 T 1s 1000Hz 2
=========== ======= ======= ================= =============== =============== ===========


============= ======= ======= ========== ================= =============== =============== ===========
SSVEP #Subj #Chan #Classes #Trials / class Trials length Sampling rate #Sessions
============= ======= ======= ========== ================= =============== =============== ===========
Lee2019_SSVEP 24 16 4 25 1s 1000Hz 1
SSVEPExo 12 8 4 16 2s 256Hz 1
MAMEM1 10 256 5 12-15 3s 250Hz 1
MAMEM2 10 256 5 20-30 3s 250Hz 1
MAMEM3 10 14 4 20-30 3s 128Hz 1
Nakanishi2015 9 8 12 15 4.15s 256Hz 1
Wang2016 32 62 40 6 5s 250Hz 1
============= ======= ======= ========== ================= =============== =============== ===========
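
The same kind of meta-information is also available programmatically on each dataset class. The sketch below is illustrative only; the attribute names used (`code`, `subject_list`, `n_sessions`, `event_id`, `interval`) are assumed to come from `BaseDataset` and may not cover every detail listed in the tables:

```python
from moabb.datasets import BNCI2014001

# Instantiate a dataset and inspect the meta-information it carries.
dataset = BNCI2014001()
print(dataset.code)               # short identifier used in the tables above
print(len(dataset.subject_list))  # number of subjects
print(dataset.n_sessions)         # sessions recorded per subject
print(dataset.event_id)           # mapping from class names to event codes
print(dataset.interval)           # trial window in seconds, relative to the cue
```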
1 change: 1 addition & 0 deletions docs/source/index.rst
@@ -1,4 +1,5 @@
.. mdinclude:: README.md
.. mdinclude:: CONTRIBUTING.md

What's new
==========
27 changes: 27 additions & 0 deletions docs/source/utils.rst
@@ -0,0 +1,27 @@
=====
Utils
=====

.. automodule:: moabb

.. currentmodule:: moabb

---------
Benchmark
---------

.. autosummary::
:toctree: generated/
:template: function.rst

benchmark

-----
Utils
-----

.. autosummary::
:toctree: generated/
:template: function.rst

set_log_level
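
To illustrate the two functions now exposed in the API, here is a hedged usage sketch. The keyword arguments passed to `benchmark` (a `pipelines` folder, an `evaluations`/`paradigms` selection, a `results` folder and an `overwrite` flag) are assumptions based on the benchmarking feature added in this PR; check the generated API page for the actual signature:

```python
from moabb import benchmark, set_log_level

# Lower the logging verbosity before running a long benchmark.
set_log_level("info")

# Run every pipeline defined in ./pipelines/ for a within-session evaluation
# of a motor-imagery paradigm, caching results in ./results/.
results = benchmark(
    pipelines="./pipelines/",
    evaluations=["WithinSession"],
    paradigms=["LeftRightImagery"],
    results="./results/",
    overwrite=False,
)
print(results)  # assuming the collected results come back as a pandas DataFrame
```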
4 changes: 4 additions & 0 deletions docs/source/whats_new.rst
@@ -18,7 +18,9 @@ Develop branch
Enhancements
~~~~~~~~~~~~

- Switch to python-3.8, update dependencies, fix code link in doc, add `code coverage <https://app.codecov.io/gh/NeuroTechX/moabb>`__ (:gh:`315` by `Sylvain Chevallier`_)
- Adding a comprehensive benchmarking function (:gh:`264` by `Divyesh Narayanan`_ and `Sylvain Chevallier`_)
- Add meta-information for datasets in documentation (:gh:`317` by `Bruno Aristimunha`_)

Bugs
~~~~
@@ -27,6 +29,7 @@ Bugs
- Preload Schirrmeister2017 raw files (:gh:`290` by `Pierre Guetschel`_)
- Incorrect event assignation for Lee2019 in MNE >= 1.0.0 (:gh:`298` by `Sylvain Chevallier`_)
- Correct usage of name simplification function in analyze (:gh:`306` by `Divyesh Narayanan`_)
- Fix downloading path issue for Weibo2014 and Zhou2016, numpy error in DemonsP300 (:gh:`315` by `Sylvain Chevallier`_)

API changes
~~~~~~~~~~~
@@ -270,6 +273,7 @@ API changes



.. _Bruno Aristimunha: https://github.com/bruAristimunha
.. _Alexandre Barachant: https://github.com/alexandrebarachant
.. _Quentin Barthelemy: https://github.com/qbarthelemy
.. _Erik Bjäreholt: https://github.com/ErikBjare
27 changes: 27 additions & 0 deletions moabb/datasets/Lee2019.py
@@ -229,6 +229,15 @@ def data_path(
class Lee2019_MI(Lee2019):
"""BMI/OpenBMI dataset for MI.
.. admonition:: Dataset summary
========== ======= ======= ========== ================= ============ =============== ===========
Name #Subj #Chan #Classes #Trials / class Trials len Sampling rate #Sessions
========== ======= ======= ========== ================= ============ =============== ===========
Lee2019_MI 55 62 2 100 4s 1000Hz 2
========== ======= ======= ========== ================= ============ =============== ===========
Dataset from Lee et al 2019 [1]_.
**Dataset Description**
@@ -290,6 +299,15 @@ class Lee2019_MI(Lee2019):
class Lee2019_ERP(Lee2019):
"""BMI/OpenBMI dataset for P300.
.. admonition:: Dataset summary
=========== ======= ======= ================= =============== =============== ===========
Name #Subj #Chan #Trials / class Trials length Sampling rate #Sessions
=========== ======= ======= ================= =============== =============== ===========
Lee2019_ERP 54 62 6900 NT / 1380 T 1s 1000Hz 2
=========== ======= ======= ================= =============== =============== ===========
Dataset from Lee et al 2019 [1]_.
**Dataset Description**
@@ -371,6 +389,15 @@ class Lee2019_ERP(Lee2019):
class Lee2019_SSVEP(Lee2019):
"""BMI/OpenBMI dataset for SSVEP.
.. admonition:: Dataset summary
============= ======= ======= ========== ================= =============== =============== ===========
Name #Subj #Chan #Classes #Trials / class Trials length Sampling rate #Sessions
============= ======= ======= ========== ================= =============== =============== ===========
Lee2019_SSVEP 24 16 4 25 1s 1000Hz 1
============= ======= ======= ========== ================= =============== =============== ===========
Dataset from Lee et al 2019 [1]_.
**Dataset Description**
9 changes: 9 additions & 0 deletions moabb/datasets/Weibo2014.py
@@ -64,6 +64,15 @@ def get_subjects(sub_inds, sub_names, ind):
class Weibo2014(BaseDataset):
"""Motor Imagery dataset from Weibo et al 2014.
.. admonition:: Dataset summary
========= ======= ======= ========== ================= ============ =============== ===========
Name #Subj #Chan #Classes #Trials / class Trials len Sampling rate #Sessions
========= ======= ======= ========== ================= ============ =============== ===========
Weibo2014 10 60 7 80 4s 200Hz 1
========= ======= ======= ========== ================= ============ =============== ===========
Dataset from the article *Evaluation of EEG oscillatory patterns and
cognitive process during simple and compound limb motor imagery* [1]_.
9 changes: 9 additions & 0 deletions moabb/datasets/Zhou2016.py
@@ -50,6 +50,15 @@ def local_data_path(base_path, subject):
class Zhou2016(BaseDataset):
"""Motor Imagery dataset from Zhou et al 2016.
.. admonition:: Dataset summary
======== ======= ======= ========== ================= ============ =============== ===========
Name #Subj #Chan #Classes #Trials / class Trials len Sampling rate #Sessions
======== ======= ======= ========== ================= ============ =============== ===========
Zhou2016 4 14 3 160 5s 250Hz 3
======== ======= ======= ========== ================= ============ =============== ===========
Dataset from the article *A Fully Automated Trial Selection Method for
Optimization of Motor Imagery Based Brain-Computer Interface* [1]_.
This dataset contains data recorded on 4 subjects performing 3 type of
9 changes: 9 additions & 0 deletions moabb/datasets/alex_mi.py
@@ -14,6 +14,15 @@
class AlexMI(BaseDataset):
"""Alex Motor Imagery dataset.
.. admonition:: Dataset summary
====== ======= ======= ========== ================= ============ =============== ===========
Name #Subj #Chan #Classes #Trials / class Trials len Sampling rate #Sessions
====== ======= ======= ========== ================= ============ =============== ===========
AlexMI 8 16 3 20 3s 512Hz 1
====== ======= ======= ========== ================= ============ =============== ===========
Motor imagery dataset from the PhD dissertation of A. Barachant [1]_.
This Dataset contains EEG recordings from 8 subjects, performing 2 task of
