Integrate TypeConfig #1829

Merged: 51 commits, May 22, 2024

Commits
52cb3c5
integration
mavaylon1 Jan 18, 2024
18911f2
path
mavaylon1 Jan 19, 2024
0908d65
clean up
mavaylon1 Jan 19, 2024
37939f3
update
mavaylon1 Jan 25, 2024
97487be
Delete out.txt
mavaylon1 Jan 25, 2024
e50d951
update
mavaylon1 Jan 25, 2024
0caa44a
need to clean
mavaylon1 Jan 31, 2024
5bd2a25
partial clean up
mavaylon1 Jan 31, 2024
9fdd1c3
Merge branch 'dev' into tmc
mavaylon1 Jan 31, 2024
ac8d839
checkpoint
mavaylon1 Feb 21, 2024
b3fcbf8
checkpoint
mavaylon1 Mar 3, 2024
66747de
check
mavaylon1 Mar 3, 2024
db64e6b
Merge branch 'dev' into tmc
mavaylon1 Apr 8, 2024
9f54ac6
Delete docs/source/sg_execution_times.rst
mavaylon1 Apr 8, 2024
c02993b
Update ecephys.py
mavaylon1 Apr 8, 2024
4572954
checkpoint
mavaylon1 Apr 8, 2024
49215a5
test
mavaylon1 Apr 8, 2024
372dfcf
test
mavaylon1 Apr 8, 2024
14fef28
update
mavaylon1 Apr 10, 2024
2dcc091
checkpoint passing
mavaylon1 Apr 10, 2024
ba7e851
Merge branch 'dev' into tmc
mavaylon1 May 20, 2024
f39aae5
clean up
mavaylon1 May 20, 2024
7adf428
tutorial draft
mavaylon1 May 20, 2024
cfd1561
docs
mavaylon1 May 20, 2024
d1bc20f
checkpoint
mavaylon1 May 20, 2024
7613f36
subject
mavaylon1 May 21, 2024
4f5854a
gallery
mavaylon1 May 21, 2024
d0f1b87
gallery
mavaylon1 May 21, 2024
c75db7d
Update requirements-min.txt
mavaylon1 May 21, 2024
8a875fa
Update requirements.txt
mavaylon1 May 21, 2024
9f8e856
rebase
mavaylon1 May 21, 2024
72bb2a9
test
mavaylon1 May 21, 2024
209a07d
test
mavaylon1 May 21, 2024
091631f
coverage
mavaylon1 May 21, 2024
453c56d
Create requirements-opt.txt
mavaylon1 May 21, 2024
ffdb838
Update check_sphinx_links.yml
mavaylon1 May 21, 2024
f63d59d
link
mavaylon1 May 21, 2024
078149f
Update tox.ini
mavaylon1 May 21, 2024
f3e1c1c
Update run_coverage.yml
mavaylon1 May 21, 2024
e2436ac
Update tox.ini
mavaylon1 May 21, 2024
bc1fc67
Update run_coverage.yml
rly May 21, 2024
6fbd681
Update docs/gallery/general/plot_configurator.py
mavaylon1 May 21, 2024
4dd4ca1
Update docs/gallery/general/plot_configurator.py
mavaylon1 May 21, 2024
355046a
feedback
mavaylon1 May 22, 2024
d1d7fea
Update CHANGELOG.md
mavaylon1 May 22, 2024
1b6e20d
Update plot_configurator.py
rly May 22, 2024
b972603
Update plot_configurator.py
mavaylon1 May 22, 2024
c8e81e3
Update docs/gallery/general/nwb_gallery_config.yaml
mavaylon1 May 22, 2024
8f2a1f1
Update src/pynwb/config/nwb_config.yaml
mavaylon1 May 22, 2024
981a7b6
Merge branch 'dev' into tmc
rly May 22, 2024
4fe39f0
Merge branch 'dev' into tmc
mavaylon1 May 22, 2024
2 changes: 1 addition & 1 deletion .github/workflows/check_sphinx_links.yml
@@ -28,7 +28,7 @@ jobs:
- name: Install Sphinx dependencies and package
run: |
python -m pip install --upgrade pip
python -m pip install -r requirements-doc.txt
python -m pip install -r requirements-doc.txt -r requirements-opt.txt
python -m pip install .

- name: Check Sphinx internal and external links
7 changes: 5 additions & 2 deletions .github/workflows/run_all_tests.yml
@@ -26,20 +26,23 @@ jobs:
- { name: linux-python3.9 , test-tox-env: py39 , build-tox-env: build-py39 , python-ver: "3.9" , os: ubuntu-latest }
- { name: linux-python3.10 , test-tox-env: py310 , build-tox-env: build-py310 , python-ver: "3.10", os: ubuntu-latest }
- { name: linux-python3.11 , test-tox-env: py311 , build-tox-env: build-py311 , python-ver: "3.11", os: ubuntu-latest }
- { name: linux-python3.11-opt , test-tox-env: py311-optional , build-tox-env: build-py311 , python-ver: "3.11", os: ubuntu-latest }
- { name: linux-python3.12 , test-tox-env: py312 , build-tox-env: build-py312 , python-ver: "3.12", os: ubuntu-latest }
- { name: linux-python3.12-upgraded , test-tox-env: py312-upgraded , build-tox-env: build-py312-upgraded , python-ver: "3.12", os: ubuntu-latest }
- { name: linux-python3.12-prerelease , test-tox-env: py312-prerelease, build-tox-env: build-py312-prerelease, python-ver: "3.12", os: ubuntu-latest }
- { name: windows-python3.8-minimum , test-tox-env: py38-minimum , build-tox-env: build-py38-minimum , python-ver: "3.8" , os: windows-latest }
- { name: windows-python3.9 , test-tox-env: py39 , build-tox-env: build-py39 , python-ver: "3.9" , os: windows-latest }
- { name: windows-python3.10 , test-tox-env: py310 , build-tox-env: build-py310 , python-ver: "3.10", os: windows-latest }
- { name: windows-python3.11 , test-tox-env: py311 , build-tox-env: build-py311 , python-ver: "3.11", os: windows-latest }
- { name: windows-python3.11-opt , test-tox-env: py311-optional , build-tox-env: build-py311 , python-ver: "3.11", os: windows-latest }
- { name: windows-python3.12 , test-tox-env: py312 , build-tox-env: build-py312 , python-ver: "3.12", os: windows-latest }
- { name: windows-python3.12-upgraded , test-tox-env: py312-upgraded , build-tox-env: build-py312-upgraded , python-ver: "3.12", os: windows-latest }
- { name: windows-python3.12-prerelease, test-tox-env: py312-prerelease, build-tox-env: build-py312-prerelease, python-ver: "3.11", os: windows-latest }
- { name: macos-python3.8-minimum , test-tox-env: py38-minimum , build-tox-env: build-py38-minimum , python-ver: "3.8" , os: macos-13 }
- { name: macos-python3.9 , test-tox-env: py39 , build-tox-env: build-py39 , python-ver: "3.9" , os: macos-13 }
- { name: macos-python3.10 , test-tox-env: py310 , build-tox-env: build-py310 , python-ver: "3.10", os: macos-latest }
- { name: macos-python3.11 , test-tox-env: py311 , build-tox-env: build-py311 , python-ver: "3.11", os: macos-latest }
- { name: macos-python3.11-opt , test-tox-env: py311-optional , build-tox-env: build-py311 , python-ver: "3.11", os: macos-latest }
- { name: macos-python3.12 , test-tox-env: py312 , build-tox-env: build-py312 , python-ver: "3.12", os: macos-latest }
- { name: macos-python3.12-upgraded , test-tox-env: py312-upgraded , build-tox-env: build-py312-upgraded , python-ver: "3.12", os: macos-latest }
- { name: macos-python3.12-prerelease , test-tox-env: py312-prerelease, build-tox-env: build-py312-prerelease, python-ver: "3.12", os: macos-latest }
@@ -198,7 +201,7 @@ jobs:
include:
- { name: conda-linux-python3.12-ros3 , python-ver: "3.12", os: ubuntu-latest }
- { name: conda-windows-python3.12-ros3, python-ver: "3.12", os: windows-latest }
- { name: conda-macos-python3.12-ros3 , python-ver: "3.12", os: macos-13 } # This is due to DANDI not supporting osx-arm64. Will support macos-latest when this changes.
- { name: conda-macos-python3.12-ros3 , python-ver: "3.12", os: macos-13 } # This is due to DANDI not supporting osx-arm64. Will support macos-latest when this changes.
steps:
- name: Cancel non-latest runs
uses: styfle/cancel-workflow-action@0.11.0
@@ -245,7 +248,7 @@ jobs:
include:
- { name: conda-linux-gallery-python3.12-ros3 , python-ver: "3.12", os: ubuntu-latest }
- { name: conda-windows-gallery-python3.12-ros3, python-ver: "3.12", os: windows-latest }
- { name: conda-macos-gallery-python3.12-ros3 , python-ver: "3.12", os: macos-13 } # This is due to DANDI not supporting osx-arm64. Will support macos-latest when this changes.
- { name: conda-macos-gallery-python3.12-ros3 , python-ver: "3.12", os: macos-13 } # This is due to DANDI not supporting osx-arm64. Will support macos-latest when this changes.
steps:
- name: Cancel non-latest runs
uses: styfle/cancel-workflow-action@0.11.0
14 changes: 9 additions & 5 deletions .github/workflows/run_coverage.yml
@@ -11,7 +11,7 @@ on:

jobs:
run-coverage:
name: ${{ matrix.os }}
name: ${{ matrix.os }}, opt reqs ${{ matrix.opt_req }}
runs-on: ${{ matrix.os }}
# TODO handle forks
# run pipeline on either a push event or a PR event on a fork
@@ -21,7 +21,11 @@ jobs:
shell: bash
strategy:
matrix:
os: [ubuntu-latest, macos-latest, windows-latest]
include:
- { os: ubuntu-latest , opt_req: true }
- { os: ubuntu-latest , opt_req: false }
- { os: windows-latest, opt_req: false }
- { os: macos-latest , opt_req: false }
env:
OS: ${{ matrix.os }}
PYTHON: '3.12'
@@ -47,9 +51,9 @@
python -m pip install --upgrade pip
python -m pip install -r requirements-dev.txt -r requirements.txt

# - name: Install optional dependencies
# if: ${{ matrix.opt_req }}
# run: python -m pip install -r requirements-opt.txt
- name: Install optional dependencies
if: ${{ matrix.opt_req }}
run: python -m pip install -r requirements-opt.txt

- name: Install package
run: |
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -4,6 +4,7 @@

### Enhancements and minor changes
- Set rate default value inside `mock_ElectricalSeries` to avoid having to set `rate=None` explicitly when passing timestamps. @h-mayorquin [#1894](https://github.com/NeurodataWithoutBorders/pynwb/pull/1894)
- Integrate validation through the `TypeConfigurator`. @mavaylon1 [#1829](https://github.com/NeurodataWithoutBorders/pynwb/pull/1829)
- Exposed `aws_region` to `NWBHDF5IO`. @rly [#1903](https://github.com/NeurodataWithoutBorders/pynwb/pull/1903)

## PyNWB 2.7.0 (May 2, 2024)
17 changes: 17 additions & 0 deletions docs/gallery/general/experimenter_termset.yaml
@@ -0,0 +1,17 @@
id: termset/experimenter_example
name: Experimenter
version: 0.0.1
prefixes:
ORC: https://orcid.org/
imports:
- linkml:types
default_range: string

enums:
Experimenters:
permissible_values:
Bilbo Baggins:
description: He who must not be named.
meaning: ORC:111


10 changes: 10 additions & 0 deletions docs/gallery/general/nwb_gallery_config.yaml
@@ -0,0 +1,10 @@
namespaces:
core:
version: 2.7.0
data_types:
Subject:
species:
termset: nwb_subject_termset.yaml
NWBFile:
experimenter:
termset: experimenter_termset.yaml
27 changes: 27 additions & 0 deletions docs/gallery/general/nwb_subject_termset.yaml
@@ -0,0 +1,27 @@
id: termset/species_example
name: Species
version: 0.0.1
prefixes:
NCBI_TAXON: https://www.ncbi.nlm.nih.gov/Taxonomy/Browser/wwwtax.cgi?mode=Info&id=
imports:
- linkml:types
default_range: string

enums:
Species:
permissible_values:
Homo sapiens:
description: the species is human
meaning: NCBI_TAXON:9606
Mus musculus:
description: the species is a house mouse
meaning: NCBI_TAXON:10090
Ursus arctos horribilis:
description: the species is a grizzly bear
meaning: NCBI_TAXON:116960
Myrmecophaga tridactyla:
description: the species is an anteater
meaning: NCBI_TAXON:71006
Ailuropoda melanoleuca:
description: the species is a panda
meaning: NCBI_TAXON:9646
113 changes: 113 additions & 0 deletions docs/gallery/general/plot_configurator.py
@@ -0,0 +1,113 @@
"""
How to Configure Term Validations
=================================

This is a user guide for how to curate and take advantage of configuration files in
order to more easily validate terms within datasets or attributes.

Introduction
-------------
Users will create a configuration YAML file that outlines the fields (within a neurodata type)
they want to be validated against a set of allowed terms.
After creating the configuration file, users will need to load the
configuration file with the :py:func:`~pynwb.load_type_config` method.
With the configuration loaded, every instance of the neurodata
types defined in the configuration file will have the respective fields wrapped with a
:py:class:`~hdmf.term_set.TermSetWrapper`.
This automatic wrapping is what provides the term validation for the field value.
For greater control over which datasets and attributes are validated
against which sets of allowed terms, use
:py:class:`~hdmf.term_set.TermSetWrapper` on individual datasets and attributes instead.
You can follow the
`TermSet tutorial in the HDMF documentation
<https://hdmf.readthedocs.io/en/stable/tutorials/plot_term_set.html#sphx-glr-tutorials-plot-term-set-py>`_
for more information.

To unload a configuration, simply call :py:func:`~pynwb.unload_type_config`.
We also provide a helper, :py:func:`~pynwb.get_loaded_type_config`, to inspect
the configuration that is currently loaded.


How to make a Configuration File
--------------------------------
To see an example of a configuration file, please refer to
`<https://github.com/NeurodataWithoutBorders/pynwb/tree/dev/src/pynwb/config/nwb_config.yaml>`_.
The configuration file uses YAML syntax: the user constructs a series of nested
mappings that hold all the necessary information.

1. The user needs to define all the relevant namespaces. Recall that each neurodata type exists within
a namespace, whether that is the core namespace in PyNWB or a namespace in an extension. Because namespaces
evolve over time, the configuration file must also record the namespace version to ensure proper functionality.
2. Within a namespace dictionary, the user will have a list of data types they want to configure.
3. Each data type will have a list of fields associated with a :py:class:`~hdmf.term_set.TermSet`.
The user can use the same or unique TermSet instances for each field.
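Concretely, the three steps above correspond to the nesting in the gallery
configuration used in this tutorial (the inline comments are editorial):

```yaml
namespaces:                # step 1: each relevant namespace
  core:
    version: 2.7.0         # the namespace version is required
    data_types:            # step 2: data types to configure
      Subject:
        species:           # step 3: field mapped to a TermSet file
          termset: nwb_subject_termset.yaml
```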
"""
try:
import linkml_runtime # noqa: F401
except ImportError as e:
raise ImportError("Please install linkml-runtime to run this example: pip install linkml-runtime") from e

from dateutil import tz
from datetime import datetime
from uuid import uuid4
import os

from pynwb import NWBFile, get_loaded_type_config, load_type_config, unload_type_config
from pynwb.file import Subject

# How to use a Configuration file
# -------------------------------
# As mentioned earlier, the first step after creating a configuration file is
# to load the file. In this configuration file, we have defined two fields
# that we always want validated: ``experimenter`` and ``species``. Each comes
# from a different neurodata type, :py:class:`~pynwb.file.NWBFile` and
# :py:class:`~pynwb.file.Subject` respectively, and each
# has its own associated :py:class:`~hdmf.term_set.TermSet`.
# It is important to remember that with the configuration loaded, the fields
# are wrapped automatically, meaning the user should create the instances
# normally, i.e., without wrapping directly. Once an instance is created,
# the value of each field is wrapped and then validated against the
# permissible values in its respective :py:class:`~hdmf.term_set.TermSet`.

dir_path = os.path.dirname(os.path.abspath("__file__"))
yaml_file = os.path.join(dir_path, 'nwb_gallery_config.yaml')
load_type_config(config_path=yaml_file)

session_start_time = datetime(2018, 4, 25, hour=2, minute=30, second=3, tzinfo=tz.gettz("US/Pacific"))

nwbfile = NWBFile(
session_description="Mouse exploring an open field", # required
identifier=str(uuid4()), # required
session_start_time=session_start_time, # required
session_id="session_1234", # optional
experimenter=[
"Bilbo Baggins",
], # optional
lab="Bag End Laboratory", # optional
institution="University of My Institution", # optional
experiment_description="I went on an adventure to reclaim vast treasures.", # optional
related_publications="DOI:10.1016/j.neuron.2016.12.011", # optional
)

subject = Subject(
subject_id="001",
age="P90D",
description="mouse 5",
species="Mus musculus",
sex="M",
)

nwbfile.subject = subject

####################################
# How to see the Configuration file
# ---------------------------------
# Call :py:func:`~pynwb.get_loaded_type_config` to get a dictionary containing the
# current configuration.
config = get_loaded_type_config()

######################################
# How to unload the Configuration file
# ------------------------------------
# Call :py:func:`~pynwb.unload_type_config` to toggle off the automatic validation.
unload_type_config()
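The automatic wrapping that the tutorial above describes can be pictured with a
pure-Python mimic; ``MiniTermSet`` and ``MiniSubject`` below are hypothetical
illustration classes, not the hdmf/pynwb API:

```python
# A pure-Python mimic of the automatic term validation that the loaded
# configuration provides. MiniTermSet and MiniSubject are hypothetical
# illustration classes, not the hdmf/pynwb API.
class MiniTermSet:
    """A fixed set of permissible terms for a field."""
    def __init__(self, terms):
        self.terms = set(terms)

    def validate(self, value):
        return value in self.terms


SPECIES_TERMS = MiniTermSet(["Homo sapiens", "Mus musculus"])


class MiniSubject:
    """On construction, the species value is checked against the term set."""
    def __init__(self, species):
        if not SPECIES_TERMS.validate(species):
            raise ValueError(f"{species!r} is not a permissible value")
        self.species = species


subject = MiniSubject(species="Mus musculus")  # passes validation
```

With the real configuration loaded, this check happens inside
:py:class:`~hdmf.term_set.TermSetWrapper` rather than in the container's
``__init__``, but the effect on the user is the same: a disallowed value fails
at construction time.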
3 changes: 3 additions & 0 deletions requirements-opt.txt
@@ -0,0 +1,3 @@
linkml-runtime==1.7.4; python_version >= "3.9"
schemasheets==0.2.1; python_version >= "3.9"
oaklib==0.5.32; python_version >= "3.9"
31 changes: 31 additions & 0 deletions src/pynwb/__init__.py
@@ -12,13 +12,44 @@
from hdmf.backends.hdf5 import HDF5IO as _HDF5IO
from hdmf.build import BuildManager, TypeMap
import hdmf.common
from hdmf.common import load_type_config as hdmf_load_type_config
from hdmf.common import get_loaded_type_config as hdmf_get_loaded_type_config
from hdmf.common import unload_type_config as hdmf_unload_type_config


CORE_NAMESPACE = 'core'

from .spec import NWBDatasetSpec, NWBGroupSpec, NWBNamespace # noqa E402
from .validate import validate # noqa: F401, E402


@docval({'name': 'config_path', 'type': str, 'doc': 'Path to the configuration file.'},
{'name': 'type_map', 'type': TypeMap, 'doc': 'The TypeMap.', 'default': None},
is_method=False)
def load_type_config(**kwargs):
"""
This method will either load the default config or the config provided by the path.
"""
config_path = kwargs['config_path']
type_map = kwargs['type_map'] or get_type_map()


hdmf_load_type_config(config_path=config_path, type_map=type_map)


@docval({'name': 'type_map', 'type': TypeMap, 'doc': 'The TypeMap.', 'default': None},
is_method=False)
def get_loaded_type_config(**kwargs):
type_map = kwargs['type_map'] or get_type_map()
return hdmf_get_loaded_type_config(type_map=type_map)


@docval({'name': 'type_map', 'type': TypeMap, 'doc': 'The TypeMap.', 'default': None},
is_method=False)
def unload_type_config(**kwargs):
"""
Remove validation.
"""
type_map = kwargs['type_map'] or get_type_map()
hdmf_unload_type_config(type_map=type_map)


def __get_resources():
try:
from importlib.resources import files
7 changes: 7 additions & 0 deletions src/pynwb/config/nwb_config.yaml
@@ -0,0 +1,7 @@
namespaces:
core:
version: 2.7.0
data_types:
Subject:
species:
termset: nwb_subject_termset.yaml
27 changes: 27 additions & 0 deletions src/pynwb/config/nwb_subject_termset.yaml
@@ -0,0 +1,27 @@
id: termset/species_example
name: Species
version: 0.0.1
prefixes:
NCBI_TAXON: https://www.ncbi.nlm.nih.gov/Taxonomy/Browser/wwwtax.cgi?mode=Info&id=
imports:
- linkml:types
default_range: string

enums:
Species:
permissible_values:
Homo sapiens:
description: the species is human
meaning: NCBI_TAXON:9606
Mus musculus:
description: the species is a house mouse
meaning: NCBI_TAXON:10090
Ursus arctos horribilis:
description: the species is a grizzly bear
meaning: NCBI_TAXON:116960
Myrmecophaga tridactyla:
description: the species is an anteater
meaning: NCBI_TAXON:71006
Ailuropoda melanoleuca:
description: the species is a panda
meaning: NCBI_TAXON:9646
13 changes: 13 additions & 0 deletions src/pynwb/core.py
@@ -10,6 +10,7 @@
from hdmf.utils import LabelledDict # noqa: F401

from . import CORE_NAMESPACE, register_class
from pynwb import get_type_map


def _not_parent(arg):
@@ -46,6 +47,18 @@ def _error_on_new_warn_on_construct(self, error_msg: str):
raise ValueError(error_msg)
warn(error_msg)

def _get_type_map(self):
return get_type_map()

@property
def data_type(self):
"""
Return the spec data type associated with this container, i.e., the neurodata_type.
"""
# we need this function here to use the correct _data_type_attr.
_type = getattr(self, self._data_type_attr)
return _type


@register_class('NWBContainer', CORE_NAMESPACE)
class NWBContainer(NWBMixin, Container):
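The new ``data_type`` property reads the type name through
``self._data_type_attr`` because the attribute that stores the name differs
between container classes. A minimal stand-alone mimic (class names here are
hypothetical, not the pynwb hierarchy):

```python
# Minimal mimic of the new data_type property: the attribute that stores the
# type name varies by class, so the property looks it up indirectly through
# the class-level _data_type_attr. Class names here are hypothetical.
class MiniMixin:
    _data_type_attr = "neurodata_type"

    @property
    def data_type(self):
        """Return the spec data type (neurodata_type) of this container."""
        return getattr(self, self._data_type_attr)


class MiniContainer(MiniMixin):
    neurodata_type = "NWBContainer"


class MiniTimeSeries(MiniMixin):
    neurodata_type = "TimeSeries"
```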
5 changes: 5 additions & 0 deletions test.py
@@ -136,6 +136,11 @@ def __run_example_tests_helper(examples_scripts):
ws.append(w)
for w in ws:
warnings.showwarning(w.message, w.category, w.filename, w.lineno, w.line)
except (ImportError, ValueError, ModuleNotFoundError) as e:
if "linkml" in str(e):
pass # this is OK because linkml is not always installed
else:
raise e
except Exception:
print(traceback.format_exc())
FAILURES += 1
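The new ``except`` clause in `test.py` filters failures by message: errors
caused by the optional ``linkml`` dependency are tolerated while everything
else propagates. A stand-alone sketch of that pattern:

```python
# Stand-alone sketch of the error-filtering pattern added to test.py:
# failures that mention the optional linkml dependency are tolerated,
# while every other error is re-raised.
def run_example(fn):
    try:
        fn()
    except (ImportError, ValueError) as e:
        if "linkml" in str(e):
            pass  # OK: linkml is optional and may not be installed
        else:
            raise
```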