
Commit

Release candidate for v1.0.0 (#502)
* enh: update to RC

* fix: add versionadded

* fix: update poetry dependencies

* enh: update deployment tools

* enh: update dependencies

* fix: correct error in doc building

* fix: error with tensorflow 3.13

* fix: add classes_ to SSVEP_CCA to comply with new sklearn version

* fix: precommit CI

* fix: SSVEP_CCA classes_ and onehot vector

* Try fix subject int instead of str

* update requirements.txt

* fix: remove requirement, pip compatible with pyproject.toml

* fix: avoid downloading dataset in CI

* fix: do not build VR examples to avoid zenodo timeout

* fix: revert to urllib3 <= 2 to fix timeout in CI

* revert: put back VR-PC example

* fix: revert to h5py 3.8

* fix: do not build examples with error after dependency update

* Improve msg

* Improve msg

* fixing crop

---------

Co-authored-by: Sylvain Chevallier <sylain.chevallier@universite-paris-saclay.fr>
Co-authored-by: PierreGtch <25532709+PierreGtch@users.noreply.github.com>
Co-authored-by: qbarthelemy <q.barthelemy@gmail.com>
Co-authored-by: bruAristimunha <a.bruno@aluno.ufabc.edu.br>
5 people committed Oct 19, 2023
1 parent d8d4b7f commit 0f8b864
Showing 25 changed files with 1,250 additions and 1,656 deletions.
14 changes: 7 additions & 7 deletions .github/workflows/docs.yml
@@ -16,7 +16,7 @@ jobs:
        python-version: ["3.9"]

      steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4

      - name: Create local data folder
        run: |
@@ -44,10 +44,10 @@ jobs:
      - name: Install dependencies
        if: steps.cached-dataset-docs.outputs.cache-hit != 'true'
-        run: poetry install --no-interaction --no-root --with docs,deeplearning
+        run: poetry install --no-interaction --no-root --with docs --extras deeplearning

      - name: Install library
-        run: poetry install --no-interaction --with docs,deeplearning
+        run: poetry install --no-interaction --with docs --extras deeplearning

      - name: Build docs
        run: |
@@ -69,7 +69,7 @@ jobs:
        os: [ubuntu-latest]

      steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4

      - name: Create local data folder
        run: |
@@ -85,7 +85,7 @@ jobs:
            docs/build
      - name: Checkout moabb.github.io
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
        with:
          repository: "NeuroTechX/moabb.github.io"
          path: moabb-ghio
@@ -101,7 +101,7 @@ jobs:
        os: [ubuntu-latest]

      steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4

      - name: Create local data folder
        run: |
@@ -117,7 +117,7 @@ jobs:
            docs/build
      - name: Checkout gh pages
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
        with:
          ref: gh-pages
          path: moabb-ghpages
6 changes: 3 additions & 3 deletions .github/workflows/test-devel.yml
@@ -19,7 +19,7 @@ jobs:
      run:
        shell: bash
      steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4

      - name: Setup Python
        uses: actions/setup-python@v4
@@ -46,11 +46,11 @@ jobs:
        if: |
          (runner.os != 'Windows') &&
          (steps.cached-poetry-dependencies.outputs.cache-hit != 'true')
-        run: poetry install --no-interaction --no-root --with deeplearning
+        run: poetry install --no-interaction --no-root --extras deeplearning

      - name: Install library (Linux/OSX)
        if: ${{ runner.os != 'Windows' }}
-        run: poetry install --no-interaction --with deeplearning
+        run: poetry install --no-interaction --extras deeplearning

      - name: Install library (Windows)
        if: ${{ runner.os == 'Windows' }}
6 changes: 3 additions & 3 deletions .github/workflows/test.yml
@@ -19,7 +19,7 @@ jobs:
      run:
        shell: bash
      steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4

      - name: Setup Python
        uses: actions/setup-python@v4
@@ -46,11 +46,11 @@ jobs:
        if: |
          (runner.os != 'Windows') &&
          (steps.cached-poetry-dependencies.outputs.cache-hit != 'true')
-        run: poetry install --no-interaction --no-root --with deeplearning
+        run: poetry install --no-interaction --no-root --extras deeplearning

      - name: Install library (Linux/OSX)
        if: ${{ runner.os != 'Windows' }}
-        run: poetry install --no-interaction --with deeplearning
+        run: poetry install --no-interaction --extras deeplearning

      - name: Install library (Windows)
        if: ${{ runner.os == 'Windows' }}
2 changes: 1 addition & 1 deletion bash/meta_requirements.txt
@@ -1,5 +1,5 @@
mne
-#if you want used a specific branch instead of branch devolop
+#if you want used a specific branch instead of branch develop
#changing the branch name git+REPOSITORY_LINK@BRANCH#egg=MOABB
git+https://github.com/NeuroTechX/moabb#egg=moabb
gdown
27 changes: 23 additions & 4 deletions docs/source/whats_new.rst
@@ -18,6 +18,24 @@ Develop branch

Enhancements
~~~~~~~~~~~~

+- None
+
+Bugs
+~~~~
+
+- None
+
+API changes
+~~~~~~~~~~~
+
+- None
+
+
+Version - 1.0.0 (Stable - PyPi)
+---------------------------------
+
+Enhancements
+~~~~~~~~~~~~
+
- Adding extra thank you section in the documentation (:gh:`390` by `Bruno Aristimunha`_)
- Adding new script to get the meta information of the datasets (:gh:`389` by `Bruno Aristimunha`_)
@@ -59,7 +77,7 @@ Bugs

- Restore 3 subject from Cho2017 (:gh:`392` by `Igor Carrara`_ and `Sylvain Chevallier`_)
- Correct downloading with VirtualReality BrainInvaders dataset (:gh:`393` by `Gregoire Cattan`_)
-- Rename event `substraction` to `subtraction` in :func:`moabb.datasets.Shin2017B` (:gh:`397` by `Pierre Guetschel`_)
+- Rename event `subtraction` in :func:`moabb.datasets.Shin2017B` (:gh:`397` by `Pierre Guetschel`_)
- Save parameters of :func:`moabb.datasets.PhysionetMI` (:gh:`403` by `Pierre Guetschel`_)
- Fixing issue with parallel evaluation (:gh:`401` by `Bruno Aristimunha`_ and `Igor Carrara`_)
- Fixing SSLError from BCI competition IV (:gh:`404` by `Bruno Aristimunha`_)
@@ -81,15 +99,16 @@ Bugs

- Removing joblib Parallel (:gh:`488` by `Igor Carrara`_)
- Fix case when events specified via ``raw.annotations`` but no events (:gh:`491` by `Pierre Guetschel`_)
- Fix bug in downloading Shin2017A dataset (:gh:`493` by `Igor Carrara`_)
+- Fix the cropped option in the dataset preprocessing (:gh:`502` by `Bruno Aristimunha`_)

API changes
~~~~~~~~~~~

- None


-Version - 0.5.0 (Stable - PyPi)
----------------------------------
+Version - 0.5.0
+---------------

Enhancements
~~~~~~~~~~~~
@@ -316,7 +335,7 @@ Bugs

- Use stim_channels or check annotation when loading files in Paradigm (:gh:`72` by `Jan Sosulski`_)
- Correct MNE issues (:gh:`76` by `Sylvain Chevallier`_)
- Fix capitalization in channel names of cho dataset (:gh:`90` by `Jan Sosulski`_)
-- Correct failling CI tests (:gh:`100` by `Sylvain Chevallier`_)
+- Correct failing CI tests (:gh:`100` by `Sylvain Chevallier`_)
- Fix EPFL dataset flat signal sections and wrong scaling (:gh:`104` and :gh:`96` by `Jan Sosulski`_)
- Fix schirrmeister dataset for Python3.8 (:gh:`105` by `Robin Schirrmeister`_)
- Correct event detection problem and duplicate event error (:gh:`106` by `Sylvain Chevallier`_)
File renamed without changes.
File renamed without changes.
4 changes: 2 additions & 2 deletions moabb/__init__.py
@@ -1,5 +1,5 @@
# flake8: noqa
-__version__ = "0.5.0"
+__version__ = "1.0.0"

from .benchmark import benchmark
-from .utils import set_download_dir, set_log_level, setup_seed
+from .utils import make_process_pipelines, set_download_dir, set_log_level, setup_seed
4 changes: 2 additions & 2 deletions moabb/analysis/results.py
@@ -120,9 +120,9 @@ def add(self, results, pipelines, process_pipeline):  # noqa: C901
        """Add results."""

        def to_list(res):
-            if type(res) is dict:
+            if isinstance(res, dict):
                return [res]
-            elif type(res) is not list:
+            elif not isinstance(res, list):
                raise ValueError(
                    "Results are given as neither dict nor"
                    "list but {}".format(type(res).__name__)
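The switch to `isinstance` above is more than style: unlike `type(res) is dict`, it also accepts subclasses such as `collections.OrderedDict`. A standalone sketch of the fixed helper (not MOABB's actual module):

```python
from collections import OrderedDict

def to_list(res):
    # isinstance accepts dict subclasses; `type(res) is dict` would not
    if isinstance(res, dict):
        return [res]
    elif not isinstance(res, list):
        raise ValueError(
            "Results are given as neither dict nor "
            "list but {}".format(type(res).__name__)
        )
    return res

res = OrderedDict(score=0.9)
print(type(res) is dict)      # False -- the old check would wrongly reject this
print(isinstance(res, dict))  # True
print(to_list(res) == [res])  # True
```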
8 changes: 7 additions & 1 deletion moabb/datasets/base.py
@@ -44,6 +44,12 @@ class CacheConfig:
        will be automatically downloaded to the specified folder.
    verbose:
        Verbosity level. See mne.verbose.
+
+    Notes
+    -----
+
+    .. versionadded:: 1.0.0
+
    """

    save_raw: bool = False
@@ -272,7 +278,7 @@ def get_data(
        process_pipeline=None,
    ):
        """
-        Return the data correspoonding to a list of subjects.
+        Return the data corresponding to a list of subjects.

        The returned data is a dictionary with the following structure::
8 changes: 7 additions & 1 deletion moabb/datasets/bids_interface.py
@@ -75,6 +75,12 @@ class BIDSInterfaceBase(abc.ABC):
        The processing pipeline used to convert the data.
    verbose : str
        The verbosity level.
+
+    Notes
+    -----
+
+    .. versionadded:: 1.0.0
+
    """

    dataset: "BaseDataset"
dataset: "BaseDataset"
@@ -300,7 +306,7 @@ def _write_file(self, bids_path, raw):
        if raw.info.get("subject_info", None) is None:
            # specify subject info as required by BIDS
            raw.info["subject_info"] = {
-                "his_id": self.subject,
+                "his_id": subject_moabb_to_bids(self.subject),
            }
        if raw.info.get("device_info", None) is None:
            # specify device info as required by BIDS
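This matches the "Try fix subject int instead of str" commit: MOABB indexes subjects with integers, while BIDS metadata fields like `his_id` expect string identifiers. A rough, hypothetical sketch of the conversion idea only — the real `subject_moabb_to_bids` helper in MOABB may differ:

```python
def subject_moabb_to_bids(subject):
    # Hypothetical sketch: turn MOABB's integer subject index into the
    # string identifier that BIDS metadata expects.
    return str(subject)

print(subject_moabb_to_bids(1))  # '1'
```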
2 changes: 1 addition & 1 deletion moabb/datasets/bnci.py
@@ -1229,7 +1229,7 @@ class BNCI2015_004(MNEBNCI):
    ground were placed at the left and right mastoid, respectively. The g.tec
    GAMMAsys system with g.LADYbird active electrodes and two g.USBamp
    biosignal
-    amplifiers (Guger Technolgies, Graz, Austria) was used for recording. EEG
+    amplifiers (Guger Technologies, Graz, Austria) was used for recording. EEG
    was band pass filtered 0.5-100 Hz (notch filter at 50 Hz) and sampled at a
    rate of 256 Hz.
2 changes: 1 addition & 1 deletion moabb/datasets/phmd_ml.py
@@ -52,7 +52,7 @@ class Cattan2019_PHMD(BaseDataset):
    Notes
    -----

-    .. versionadded:: 0.6.0
+    .. versionadded:: 1.0.0

    References
    ----------
6 changes: 3 additions & 3 deletions moabb/datasets/preprocessing.py
@@ -125,8 +125,8 @@ def transform(self, raw, y=None):
        if (
            "Target" in event_id
            and "NonTarget" in event_id
-            and type(event_id["Target"]) is list
-            and type(event_id["NonTarget"]) is list
+            and isinstance(event_id["Target"], list)
+            and isinstance(event_id["NonTarget"], list)
        ):
            event_id_new = dict(Target=1, NonTarget=0)
            events = mne.merge_events(events, event_id["Target"], 1)
@@ -254,7 +254,7 @@ def get_filter_pipeline(fmin, fmax):

def get_crop_pipeline(tmin, tmax):
    return FunctionTransformer(
-        methodcaller("crop", tmin=tmax, tmax=tmin, verbose=False),
+        methodcaller("crop", tmin=tmin, tmax=tmax, verbose=False),
    )
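This is the "fixing crop" bug from the commit message: the two bounds were swapped when building the crop transformer. A self-contained sketch of the `methodcaller` mechanics, using a toy `FakeRaw` class (illustrative, not MOABB code) in place of `mne.io.Raw`:

```python
from operator import methodcaller

class FakeRaw:
    """Toy stand-in for an mne.io.Raw object."""
    def __init__(self):
        self.tmin, self.tmax = 0.0, 10.0

    def crop(self, tmin, tmax, verbose=False):
        if tmin > tmax:
            raise ValueError("tmin must be <= tmax")
        self.tmin, self.tmax = tmin, tmax
        return self

# The fixed call: methodcaller("crop", tmin=..., tmax=...) invokes
# raw.crop(tmin=..., tmax=...) on whatever object it is given.
crop = methodcaller("crop", tmin=1.0, tmax=4.0, verbose=False)
raw = crop(FakeRaw())
print(raw.tmin, raw.tmax)  # 1.0 4.0

# The pre-fix version effectively called crop(tmin=4.0, tmax=1.0),
# which raises here (and fails similarly on a real Raw object).
```

Wrapped in `sklearn.preprocessing.FunctionTransformer`, as in the fixed `get_crop_pipeline`, the same callable becomes a pipeline step whose `transform` crops each raw recording.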
2 changes: 1 addition & 1 deletion moabb/evaluations/evaluations.py
@@ -107,7 +107,7 @@ def __init__(
            raise ValueError(
                "When passing data_size, please also indicate number of permutations"
            )
-        if type(n_perms) is int:
+        if isinstance(n_perms, int):
            self.n_perms = np.full_like(self.data_size["value"], n_perms, dtype=int)
        elif len(self.n_perms) != len(self.data_size["value"]):
            raise ValueError(
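A minimal sketch of the broadcasting logic in that `__init__` (the `data_size` values here are illustrative): a single integer `n_perms` is expanded to one permutation count per data-size step, while a sequence must match the steps one-to-one.

```python
import numpy as np

data_size = {"policy": "per_class", "value": np.array([5, 10, 20])}
n_perms = 25  # one int: same number of permutations at every data size

if isinstance(n_perms, int):
    # np.full_like copies the shape of data_size["value"] and fills with 25
    n_perms = np.full_like(data_size["value"], n_perms, dtype=int)
elif len(n_perms) != len(data_size["value"]):
    raise ValueError("data_size and n_perms must have the same length")

print(n_perms.tolist())  # [25, 25, 25]
```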
6 changes: 6 additions & 0 deletions moabb/paradigms/cvep.py
@@ -61,6 +61,12 @@ class BaseCVEP(BaseParadigm):
    resample: float | None (default None)
        If not None, resample the eeg data with the sampling rate provided.
+
+    Notes
+    -----
+
+    .. versionadded:: 1.0.0
+
    """

    def __init__(
6 changes: 6 additions & 0 deletions moabb/paradigms/fixed_interval_windows.py
@@ -45,6 +45,12 @@ class BaseFixedIntervalWindowsProcessing(BaseProcessing):
    marker: int (default -1)
        Marker to use for the events created.
+
+    Notes
+    -----
+
+    .. versionadded:: 1.0.0
+
    """

    def __init__(
10 changes: 9 additions & 1 deletion moabb/pipelines/__init__.py
@@ -5,6 +5,7 @@
"""

# flake8: noqa
+from mne.utils import warn

from .classification import SSVEP_CCA, SSVEP_TRCA, SSVEP_MsetCCA
from .features import FM, AugmentedDataset, ExtendedSSVEPSignal, LogVariance
@@ -22,7 +23,14 @@
    )
    from .utils_deep_model import EEGNet, TCN_block
except ModuleNotFoundError as err:
-    print("Tensorflow not install, you could not use those pipelines")
+    warn(
+        "Tensorflow is not installed. "
+        "You won't be able to use these MOABB pipelines if you attempt to do "
+        "so.",
+        category=ModuleNotFoundError,
+        module="moabb.pipelines",
+    )


try:
    from .utils_pytorch import (
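A generic version of the optional-dependency pattern above, sketched with the standard library's `warnings` (MOABB itself uses `mne.utils.warn`): a warning keeps the package importable for users who never touch the deep-learning pipelines, while the old bare `print` was easy to miss in logs and a hard `raise` would block everyone else.

```python
import warnings

try:
    import tensorflow  # optional deep-learning dependency
except ModuleNotFoundError:
    tensorflow = None
    warnings.warn(
        "Tensorflow is not installed. "
        "You won't be able to use the deep-learning pipelines.",
        category=UserWarning,  # shown by default, unlike a bare print
    )

# Downstream code can guard on availability instead of crashing at import:
deep_learning_available = tensorflow is not None
print(deep_learning_available)
```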
2 changes: 2 additions & 0 deletions moabb/pipelines/classification.py
@@ -51,8 +51,10 @@ def __init__(self, interval, freqs, n_harmonics=3):
        self.slen = interval[1] - interval[0]
        self.freqs = freqs
        self.n_harmonics = n_harmonics
+        self.classes_ = []
        self.one_hot = {}
        for i, k in enumerate(freqs.keys()):
+            self.classes_.append(i)
            self.one_hot[k] = i

    def fit(self, X, y, sample_weight=None):
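Context for the `classes_` addition: recent scikit-learn versions expect fitted classifiers to expose a `classes_` attribute so metrics and meta-estimators can map prediction indices back to labels. A standalone sketch of the bookkeeping added above (the `freqs` values are illustrative stimulation frequencies, not from a real dataset):

```python
freqs = {"13": 13.0, "17": 17.0, "21": 21.0}  # SSVEP stimulation frequencies (Hz)

classes_ = []
one_hot = {}
for i, k in enumerate(freqs.keys()):
    classes_.append(i)  # integer label for class i, exposed to sklearn
    one_hot[k] = i      # frequency name -> integer label

print(classes_)  # [0, 1, 2]
print(one_hot)   # {'13': 0, '17': 1, '21': 2}
```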
13 changes: 7 additions & 6 deletions moabb/tests/datasets.py
@@ -308,12 +308,13 @@ def test_warning_if_parameters_false(self):
        with self.assertWarns(UserWarning):
            Cattan2019_VR(virtual_reality=False, screen_display=False)

-    def test_data_path(self):
-        ds = Cattan2019_VR(virtual_reality=True, screen_display=True)
-        data_path = ds.data_path(1)
-        assert len(data_path) == 2
-        assert "subject_01_VR.mat" in data_path[0]
-        assert "subject_01_PC.mat" in data_path[1]
+    # Access to Zenodo could fail on CI
+    # def test_data_path(self):
+    #     ds = Cattan2019_VR(virtual_reality=True, screen_display=True)
+    #     data_path = ds.data_path(1)
+    #     assert len(data_path) == 2
+    #     assert "subject_01_VR.mat" in data_path[0]
+    #     assert "subject_01_PC.mat" in data_path[1]

    def test_get_block_repetition(self):
        ds = FakeVirtualRealityDataset()
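An alternative to commenting the test out entirely (a sketch, not the committed change): keep it and skip only on CI, so the download path still gets exercised locally where Zenodo access is reliable. Most CI providers, including GitHub Actions, set `CI=true` in the environment.

```python
import os
import unittest

class TestCattan2019VR(unittest.TestCase):
    @unittest.skipIf(os.environ.get("CI") == "true",
                     "Zenodo access can time out on CI")
    def test_data_path(self):
        # body as in the original test; needs network access to Zenodo
        self.skipTest("requires the Cattan2019_VR dataset download")
```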
