Refactor bsb to bsb-core and plugins, to break cyclical dependencies (#766)

* added `get_placement_set` to sim cell

* Replaced `simulation` boot assignment with property

* Cell redesign

* linting

* added more object loading type handlers

* use improved object loader

* wip adapter

* wip, recommit later

* Morphology pipeline can store end result

* give underlying cast error

* Fixed error with `has_own_init` checking incorrect class

* Requirement handlers can check if the node data is written in shortform

* Prefer parsing swc with the in-house swc parser

* Added NameSelector shorthand

* Fixed unserializable `FileDependency` attributes

* Implemented current_clamp

* Fixed pipeline calls

* Added morphology creation from swc data

* Refactored current_clamp to v4

* Fixed args passed to iclamp

* Added stable morpho comparison and allowed index-less MorphologySet

* Removed useless print commands

* removed dead code

* Fixed failing CLI tests inside of PyCharm

* composed NestConnection nodes

* fixed `class_`

* Prepared configs for v4 nest sims

* removed trailing dots

* object/class/function type handler fixes

* current clamp import fixes

* removed some invalid `relay` occurrences

* removed cerebellum related postprocessing code

* added `cell_types` to Relay postprocessing

* tag cell model instances with id and pos

* Refactoring of devices and Targetting; Fixed bugs.

- Refactoring of current_clamp, spike_generator and voltage_recorder
- Added LabelTargetting
- Refactoring of SphericalTargetting
- Fixed some bugs in targetting.py, adapter.py and connection.py
- Added __lt__ operator in TransceiverModel class

* improved defaults and cleaned up imports of spike generator

* allowed synapse spec shortform

* Added `output` option for run simulation command

* Added `output` option for run simulation command

* updated SimulationResult dataflow

* linted NestCell

* Added connections to the NEST adapter

* NEST adapter capable of setting up sims, results missing

* fixed `chunk_iter`

* added nodecollection recorder shortcut

* implement devices

* Set up base nest devices

* component cleanup

* default static synapse

* catch missing models & connect error

* fixed None values messing with SLI

* fixed None values messing with SLI

* Use empty NodeCollections for empty datasets

* wip device

* wip device

* cleanup MPI stuff from simulation

* connect conns one to one

* tqdm the connection progress

* Predict best iteration strategy according to available memory

* Check if synapse model exists

* Store a lazy version of the synapse collection

* Added NEST install to GHA

fixed args
fixed path?
added NEST version variable
fixed `python-path`
debug paths
install NEST
install cython & cmake
install under predictable PATH and use `nest_vars` to rig PyNEST

* wip nest test

* removed stray code

* added the simulation that's under consideration to `post_prepare` hooks

* `post_prepare` hooks are functions

* `GetStatus` is deprecated

* Fixed LazySynapseCollection's magic methods and collection attr

* Created a first test to validate the NEST adapter

* Factored out treeing of values, use it for dict and list

* fix docbuild

* skip gif_pop test under mpi

* try abspath for nest cache

* set default model to `iaf_psc_alpha`

* added brunel test network

* removed some unused references

* wip targetting refactoring

* fixed nest install path on GHA

* fixed syntax error

* removed unused nonlocal simdata reference

* huh?

* try some debug stuff after source diving

* more debug

* apt failure

* fix for path validation error?

* (re)moved some code

* refactored targetting

* added basic external nest device

* removed GHA debug code

* replaced apt-get with apt

* added `placement` dict to simdata

* use placement simdata as source of PS

* refactored targetting

* Fix decorated super calls in node classes

* fixed extension based sorted

* add possibility to mark a parser error as a user error

* raise a user error when `$import` is used incorrectly

* disable simulation reporting until #719 is fixed

* fixed some targetting errors

* eager load the nest devices

* added missing `.devices` to simdata

* Added `spike_recorder` and `poisson_generator` devices to nest adapter

* use devices in brunel

* catch missing module option `profiling` error in broken installs

* Fix subtle plotting bug with cubic and swapaxes interaction (#722)

* improved .. clarity .. of incomprehensible module

* added devices dict

* lost changes

* carry over `_bsb_entry_points` to subpieces of simulator plugins

* it's a dict

* fill in simulation slots

* add Brunel network with BSB generated connectivity

* wip rewrite adapter

* more wip

* wip arbor cfg

* global import

* rewriting adapter using brunel template

* make `get_chunk_stats` public PS api

* pass along chunks arg

* add lif cells for brunel

* components can be abc now

* change the ConnectionModel relation from connection type to tag. closes #693

Will need to go over NEST/NEURON for breaking changes

* fix ReceiverCollection and Receiver

* move connection code to connection model

* fix references to gid manager

* update simulation and result code

* removed unused cell code

* change label format

* add resolution

* tidbits

* change default iterator to iterate single conns

* add connection models to arbor brunel

* add receiver collection polymorphism

* init index

* kind of devices

* Updated neuron devices to bsb v4
- current clamp, spike generator, voltage recorder, voltage clamp
- Added LabelTargetting
- Added __lt__ method in TransceiverModel

* Added tags attribute in MorphologyDependencyNode

* Bug fixes.
- Added __lt__ method in TransceiverModel
- NeuronCell instances save id and positions
- Renamed instance.model to instance.cell_model

* Synapse names are retrieved from the MechAccessor var

* If a `default` dynamic value is given, attr should not be required

* bump to bsb-hdf5 version that fixed `load_ids`

* test load_ids with individual chunk

* Added neuron unittests

* Fixed create_transmitters and _allocate_transmitters
- create_transmitters: cell ids in simdata are now mapped through simdata.transmap, and the adapter works with multiple threads.
- _allocate_transmitters: fixed gid
- Fixed a report message

* black fix

* test GHA run with dbbs-models

* fix engine references

* fix property assignment

* raise error for using arborize models with missing morphos

* pass on synapse spec weight and delay to `insert_receiver`

* fix

* fix default targetting of nest devices

* removed any reference to `default_neuron/synapse_model`

* use tag if set

* rename `neuron_model` to `model`

* removed outdated info

* adapt docstr

* declare scaffold attribute

* fix arbor mpi logic

* fix 0.9 updated function signature

* fix absolute import

* catch lif parameter errors

* fix lif parameter error

* removed unused `DeviceConnectionModel` class

* change `prepare_samples` signature

* add spike_recorder to arbor devices

* change device import trigger not to pollute module

* test arbor brunel

* type hint simdata populations attr

* store offset on populations

* factor out `sort_models`, add `lookup_offset`

* rename `gap_` to `gap_junction`

* pass the populations to `create_connections_*` for global offset

* renamed `test` to `test_nest` and fixed `neuron_model` occurrences

* switch to de facto output

* use NEST for `test_simulate`, since it is built from source in tests

* fix missing arbor dep for tests

* if not a str, assume it is a manually set object

* fix numpy deprecation

* post_prepare's are functions

* cleanup test file

* format test config

* fix voltage recorder targetting

* Auto stash before merge of "feature/simulation" and "origin/feature/simulation"

* fix py3.9 callable syntax

* fix mixin reflist resolution

* WIP adapter

* updated test configs

* update or disable NEURON devices

* switch some tests over to RandomStorageFixture

* add option to setup during class init for RandomStorageFixture

* switch to f string

* factor out SimulationData

* update and disable NEURON tests

* shield bsb-arbor with a mocked import

* add `Configuration` type hint

* made resolution a required attribute

* add spiketrains and analogsignals shortcuts

* fix prev commit

* add iaf_cond_alpha test

* added missing resolution attributes

* Use `bsb-core` as PyPI name, to install the framework `bsb` backbone

* moved `nest` to new `bsb-nest` repo

* move neuron to bsb-neuron

* moved arbor to bsb-arbor

* moved json parser to bsb-json

* moved unittest folder to bsb-test

* replaced bsb.unittest import

* fix tests

* fix remaining tests, add up(one) shortcut

* fix right-handed notation. closes #672

* b0! Fixed tests

* rem plugins.txt

* set test requirements in pyproject.toml

* fix bsb.config.parsers refs in docs

* do not install NEST for bsb-core CI

* fix tests

* more ci fixes

* oops

* fix packing warning

* swap yz

* swap chunk yz

* fix mpi tests

* fix 3.12 assertion

* fixed docs

---------

Co-authored-by: alessiomarta <81438612+alessiomarta@users.noreply.github.com>
Co-authored-by: Francesco Sheiban <frances.j.shei@gmail.com>
Co-authored-by: Lennart Landsmeer <lennart@landsmeer.email>
Co-authored-by: alessio <alessio.marta@gmail.com>
5 people committed Nov 8, 2023
1 parent d8f4fa3 commit 7eb5d69
Showing 157 changed files with 632 additions and 38,028 deletions.
10 changes: 10 additions & 0 deletions .github/devops/install-nest.sh
@@ -0,0 +1,10 @@
git clone https://github.com/nest/nest-simulator $GITHUB_WORKSPACE/nest
cd $GITHUB_WORKSPACE/nest
git checkout tags/v$1
mkdir build
cd build
pip install cython cmake
cmake .. \
-DCMAKE_INSTALL_PREFIX=$2 \
-Dwith-mpi=ON
make install
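The install script above can be wired into a workflow step roughly as follows. This is a hedged sketch: the version number and install prefix are placeholders, and the `nest_vars.sh` path assumes a standard NEST install layout (it is the script NEST ships to set `PYTHONPATH` so PyNEST is importable, as the "use `nest_vars` to rig PyNEST" commit above suggests).

```shell
# Hypothetical CI step: arguments are <nest-version> <install-prefix>
bash .github/devops/install-nest.sh 3.4 "$HOME/nest-install"
# Source the env script NEST installs under the prefix so PyNEST resolves
source "$HOME/nest-install/bin/nest_vars.sh"
python -c "import nest; print(nest.__version__)"
```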
40 changes: 14 additions & 26 deletions .github/workflows/build.yml
@@ -11,42 +11,30 @@ jobs:
python-version: ["3.9", "3.10", "3.11", "3.12"]
steps:
- uses: actions/checkout@v3.5.0

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4.3.0
with:
python-version: ${{ matrix.python-version }}

- name: Install apt dependencies
run: |
sudo apt-get update
sudo apt-get install openmpi-bin libopenmpi-dev libhdf5-dev
- name: Cache pip
uses: actions/cache@v3
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt') }}
restore-keys: |
${{ runner.os }}-pip-
${{ runner.os }}-
sudo apt update
# Install `libopenmpi` for MPI
sudo apt install openmpi-bin libopenmpi-dev
# Install `libhdf5` for `morphio`
sudo apt install libhdf5-dev
- name: Install dependencies & self
run: |
python -m pip install --upgrade pip
pip install wheel
# Install the regular requirements
pip install -r requirements.txt --prefer-binary
# Install the default plugins, without dependencies, to avoid circular dependency
# conflicts with the BSB itself
pip install -r plugins.txt --prefer-binary --no-deps
# Install the BSB from the repo without dependencies, already installed above.
pip install -e . --no-deps
# Install latest pip
pip install --upgrade pip
# Install bsb-core
pip install .[test,mpi]
- name: Run tests & coverage
run: |
coverage run -p -m unittest discover -v -s ./tests
mpiexec -n 2 coverage run -p -m unittest discover -v -s ./tests
BSB_PROFILING=TRUE coverage run -p -m unittest tests.test_env_options
bash <(curl -s https://codecov.io/bash)
- name: Test default plugins
run: |
git clone https://github.com/dbbs-lab/bsb-hdf5
pip install -e bsb-hdf5/
coverage run -p -m unittest discover bsb-hdf5/test
bash <(curl -s https://codecov.io/bash)
bash <(curl -s https://codecov.io/bash)
4 changes: 2 additions & 2 deletions .github/workflows/docs.yml
@@ -15,8 +15,8 @@ jobs:
python-version: 3.9
- name: Install MPI
run: |
sudo apt-get update
sudo apt-get install -y openmpi-bin libopenmpi-dev
sudo apt update
sudo apt install -y openmpi-bin libopenmpi-dev
- name: Install dependencies
run: |
python -m pip install --upgrade pip
14 changes: 11 additions & 3 deletions bsb/__init__.py
@@ -1,8 +1,13 @@
"""
A component framework for multiscale bottom-up neural modelling.
`bsb-core` is the backbone package containing the essential code of the BSB: a component
framework for multiscale bottom-up neural modelling.
`bsb-core` needs to be installed alongside a bundle of desired BSB plugins, some of
which are essential for `bsb-core` to function. First-time users are recommended to
install the `bsb` package instead.
"""

__version__ = "4.0.0a56"
__version__ = "4.0.0b0"

import functools

@@ -20,7 +25,10 @@ def _register(self, cls, method=None):  # pragma: nocover

functools.singledispatchmethod.register = _register

from .options import profiling as _pr
try:
from .options import profiling as _pr
except Exception:
_pr = False

if _pr:
from .profiling import activate_session
3 changes: 2 additions & 1 deletion bsb/cli/commands/_commands.py
@@ -209,7 +209,7 @@ def handler(self, context):
append += ", ".join(f"'{name}'" for name in extra_simulations.keys())
errr.wrap(type(e), e, append=append)
else:
result.write(f"{uuid4()}.nio", "ow")
result.write(getattr(context.arguments, "output", f"{uuid4()}.nio"), "ow")

def get_options(self):
return {
@@ -220,6 +220,7 @@ def get_options(self):
def add_parser_arguments(self, parser):
parser.add_argument("network")
parser.add_argument("simulation")
parser.add_argument("-o", "--output")


class CacheCommand(BaseCommand, name="cache"): # pragma: nocover
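The `--output` fallback in the diff above can be sketched with a standalone argparse stand-in (this is not the BSB CLI itself). One subtlety worth noting: when argparse defines the flag but the user does not pass it, the attribute exists with value `None`, so `getattr`'s default never applies and an explicit `or` fallback is needed in that case.

```python
import argparse
from uuid import uuid4

parser = argparse.ArgumentParser()
parser.add_argument("-o", "--output")
args = parser.parse_args([])  # flag not passed: args.output exists and is None

# getattr's default only applies when the attribute is missing entirely,
# so guard against a present-but-None attribute with `or`
path = getattr(args, "output", None) or f"{uuid4()}.nio"
print(path)
```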
13 changes: 11 additions & 2 deletions bsb/config/__init__.py
@@ -10,6 +10,7 @@
import sys
import glob
import itertools
import typing
from importlib.machinery import ModuleSpec
from shutil import copy2 as copy_file
import builtins
@@ -45,6 +46,12 @@
_path = __path__
ConfigurationAttribute.__module__ = __name__

if typing.TYPE_CHECKING:
from ._config import Configuration

# Add some static type hinting, to help tools figure out this dynamic module
Configuration: "Configuration"


# ConfigurationModule should not inherit from `ModuleType`, otherwise Sphinx doesn't
# document all the properties.
@@ -213,14 +220,16 @@ def _try_parsers(content, classes, ext=None, path=None):  # pragma: nocover
if ext is not None:

def file_has_parser_ext(kv):
return ext in getattr(kv[1], "data_extensions", ())
return ext not in getattr(kv[1], "data_extensions", ())

classes = builtins.dict(sorted(classes.items(), key=file_has_parser_ext))
exc = {}
for name, cls in classes.items():
try:
tree, meta = cls().parse(content, path=path)
except Exception as e:
if getattr(e, "_bsbparser_show_user", False):
raise e from None
exc[name] = e
else:
return (name, tree, meta)
@@ -262,7 +271,7 @@ def parser_method(self, file=None, data=None, path=None):
return _from_parsed(self, name, tree, meta, file)

parser_method.__name__ = "from_" + name
parser_method.__doc__ = _parser_method_docs(parser)
# parser_method.__doc__ = _parser_method_docs(parser)
return parser_method


19 changes: 11 additions & 8 deletions bsb/config/_attrs.py
@@ -111,7 +111,7 @@ class Example:
:param kwargs: All keyword arguments are passed to the constructor of the
:func:`attribute <.config.attr>`.
"""
if "required" not in kwargs:
if "required" not in kwargs and "default" not in kwargs:
kwargs["required"] = True
if "type" not in kwargs:
kwargs["type"] = str
@@ -508,15 +508,18 @@ def is_node_type(self):

def tree(self, instance):
val = _getattr(instance, self.attr_name)
return self.tree_of(val)

def tree_of(self, value):
# Allow subnodes and other class values to convert themselves to their tree
# representation
if hasattr(val, "__tree__"):
val = val.__tree__()
if hasattr(value, "__tree__"):
value = value.__tree__()
# Check if the type handler specifies any inversion function to convert tree
# values back to how they were found in the document.
if hasattr(self.type, "__inv__") and val is not None:
val = self.type.__inv__(val)
return val
if hasattr(self.type, "__inv__") and value is not None:
value = self.type.__inv__(value)
return value

def flag_dirty(self, instance):
instance._config_state[self.attr_name] = False
@@ -679,7 +682,7 @@ def _set_type(self, type, key=None):

def tree(self, instance):
val = _getattr(instance, self.attr_name)
return [e if not hasattr(e, "__tree__") else e.__tree__() for e in val]
return [self.tree_of(e) for e in val]


class cfgdict(builtins.dict):
Expand Down Expand Up @@ -823,7 +826,7 @@ def _set_type(self, type, key=None):

def tree(self, instance):
val = _getattr(instance, self.attr_name).items()
return {k: v if not hasattr(v, "__tree__") else v.__tree__() for k, v in val}
return {k: self.tree_of(v) for k, v in val}


class ConfigurationReferenceAttribute(ConfigurationAttribute):
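The `tree_of` extraction above centralizes the `__tree__`/`__inv__` handling so the list and dict attribute classes can reuse it per element. A standalone sketch of the contract (class names are illustrative stand-ins, not the real BSB types):

```python
class Node:
    """Stand-in for a config node that serializes itself."""

    def __tree__(self):
        return {"x": 1}


class Handler:
    """Stand-in for a type handler with an inversion function."""

    @staticmethod
    def __inv__(value):
        return dict(value, inverted=True)


def tree_of(value, type_handler):
    # Nodes that know their own tree form convert themselves first
    if hasattr(value, "__tree__"):
        value = value.__tree__()
    # The type handler may map the value back to its document form
    if hasattr(type_handler, "__inv__") and value is not None:
        value = type_handler.__inv__(value)
    return value


print(tree_of(Node(), Handler()))  # {'x': 1, 'inverted': True}
```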
33 changes: 19 additions & 14 deletions bsb/config/_make.py
@@ -37,7 +37,7 @@ def make_metaclass(cls):
# The metaclass makes it so that there are 3 overloaded constructor forms:
#
# MyNode({ <config dict values> })
# MyNode(config="dict", values="here")
# MyNode(example="attr", values="here")
# ParentNode(me=MyNode(...))
#
# The third makes it that type handling and other types of casting opt out early
@@ -132,9 +132,7 @@ def compile_class(cls):
del cls_dict["__weakref__"]
ncls = make_metaclass(cls)(cls.__name__, cls.__bases__, cls_dict)
for method in ncls.__dict__.values():
cl = getattr(method, "__closure__", None)
if cl and cl[0].cell_contents is cls:
cl[0].cell_contents = ncls
_replace_closure_cells(method, cls, ncls)

# Shitty hack, for some reason I couldn't find a way to override the first argument
# of `__init_subclass__` methods, that would otherwise work on other classmethods,
@@ -158,6 +156,15 @@ def compile_class(cls):
return ncls


def _replace_closure_cells(method, old, new):
cl = getattr(method, "__closure__", None) or []
for cell in cl:
if cell.cell_contents is old:
cell.cell_contents = new
elif inspect.isfunction(cell.cell_contents):
_replace_closure_cells(cell.cell_contents, old, new)


def compile_isc(node_cls, dynamic_config):
if not dynamic_config or not dynamic_config.auto_classmap:
return node_cls.__init_subclass__
@@ -349,7 +356,7 @@ def get_config_attributes(cls):
if hasattr(p_cls, "_config_attrs"):
attrs.update(p_cls._config_attrs)
else:
# Add mixin config attributes
# Scrape for mixin config attributes
from ._attrs import ConfigurationAttribute

attrs.update(
@@ -442,9 +449,7 @@ def _get_dynamic_class(node_cls, kwargs):
except DynamicClassError:
mapped_class_msg = _get_mapped_class_msg(loaded_cls_name, classmap)
raise UnresolvedClassCastError(
"Could not resolve '{}'{} to a class.".format(
loaded_cls_name, mapped_class_msg
)
f"Could not resolve '{loaded_cls_name}'{mapped_class_msg} to a class"
) from None
return dynamic_cls

@@ -586,12 +591,12 @@ def walk_node_attributes(node):
:returns: attribute, node, parents
:rtype: Tuple[:class:`~.config.ConfigurationAttribute`, Any, Tuple]
"""
if hasattr(node.__class__, "_config_attrs"):
attrs = node.__class__._config_attrs
elif hasattr(node, "_config_attr"):
attrs = _get_walkable_iterator(node)
else:
return
attrs = get_config_attributes(node)
if not attrs:
if hasattr(node, "_config_attr"):
attrs = _get_walkable_iterator(node)
else:
return
for attr in attrs.values():
yield node, attr
# Yield but don't follow references.
2 changes: 0 additions & 2 deletions bsb/config/parsers/__init__.py
@@ -1,6 +1,4 @@
from ._parser import Parser
from .json import JsonParser
from .yaml import YAMLParser


def get_parser_classes():
