optimization -> optimization (#255)
* update

* merger

* first incremental pull request

* basic NeuronUnit dev fork from scidash with minimalist changes to support multispiking optimization

* Removing duplicate jNeuroMLBackend import (#251)

* Update Unit Test Cases and Other Improvements (#249)

* Add test_get_files

* Update .travis.yml

* Drop Python 2 support

* Minor update

* Update dependency requirements.

* Update .travis.yml

* Change allensdk version to 0.16.3 (#1)

* Make it a setup.cfg-only project

* Make it a setup.cfg-only project 2

* Update unit tests.

* Update test cases

* fix

* Update parameter

* Update test cases

* Drop support of Python 2

* Import new test cases in __init__.py in unit_test directory

* remove PYTHON_MAJOR_VERSION constant

* Fix error

* Update dependency

* Drop Python 2 support

* Update bluepyopt

* Update unit tests

* Update unit tests

* Improved logic in url_to_path method

* Update unit tests

* Update unit tests

* Update unit tests

* Update unit tests

* Update unit tests

* Update unit tests

* forceobj = true for jit decorator

* add test_geppetto_backend

* Update unit tests

* Requiring Python version >= 3.5

* Clean up `__future__ import something`

* Import deepcopy, and improve coding style.

* Import available_backends

* Make ExternalModel inherit from RunnableModel instead of Model

* Fix warning

* Make ExternalModel call constructor of the parent class

* Improve ReducedModel, make unit test cases for it.

* get pynn and pyneuroml from Github

* Update unit tests

* Update unit tests

* Try to fix a shell command

* Update unit tests

* Update unit tests

* Update unit tests

* Delete useless code

* Update address of BBP microcircuit portal. Add test cases for bbp.py

* Ran bbp.ipynb

* Update unit tests

* Update unit tests

* Update unit tests

* Update unit tests

* Change sciunit.settings to sciunit.config_set

* update

* merger

* adding in continuous integration

* update circle

* refactor code

* continuous integration plus coverage related deletions

* Removing duplicate jNeuroMLBackend import (#251)


* resolved merge

* rebuild and squash circle ci

* test NU

* test NU squash

* squash circle

* circle squash

* squash circle

* better chance of working

* squash circle

* circle squash

* circle squash

* squash circle

* circle squash

* circle squash

* circle squash

* squash circle ci

* rebase confusion

* Added StaticBackend

* resolve rebase

* Removing duplicate jNeuroMLBackend import (#251)

* fixed "cannot be safely interpreted as integer" linspace error

* merge

* Added StaticBackend

* fixup! Added StaticBackend

* Removing duplicate jNeuroMLBackend import (#251)

* Fix mistake in Izhikevich equation markdown

* merge

* plotly added

* plotly added

* circle squash

* squash circle

* squash circle

* squash circle

* squash circle

* squash circle

* modified requirements

* circle squash

* moved allen api from BPO to NU

* refactor

* circle squash

* circle squash

* circle squash

* redirectory circle squash

* will this work

* circle squash

* circle squash

* circle squash

* circle squash

* circle ci squash

* perhaps fix minor travis annoyance on scidash

* try to make scidash travis work too

* Update README.md

* Update README.md

* Update README.md

* Update README.md

* replace some unit testing files

* adding back in important looking tests

* circle squash

* clean up

* brutal clean up but unit tests work

* brutal clean up but unit tests work

* brutal clean up but unit tests work

* easier target

* easier target

* travis squash

* travis squash

* Update README.md

* travis might work now

* setter property methods added to adexp model

* circle ci squash

* better method stacking and encapsulation, removed redundancy

* fix circle squash

* circle squash

* travis circle squash

* circle travis squash

* travis update

* circle squash

* updates before checkout

* delete silly bug

* unit test for rick to check

* unit tests built in to continuous integration for relative difference sciunit debug

* unit tests built in to continuous integration for relative difference sciunit debug

* unit tests built in to continuous integration for relative difference sciunit debug

* unit tests built in to continuous integration for relative difference sciunit debug

* graphical unit test made

* graphical unit test made

* both Relative difference and ZScore working now

* after change to BPO source code where I remove special treatment for models not ADEXP in BPO

* continuous integration updated

* before merge

* before merge branch

* perhaps branch fixed now

* merge into merge

* meaningless

* simplified best model

* now ephys properties and multispiking optimize, as well as allen examples

* better integration of unit testing

* introduced some typing to optimization management, and some code comments, reduced opt_man file size by 50%

* added back in neuroelectro api

* fixing unit tests for neuroelectro

* removed erroneous path from travis build

* removed erroneous path from travis build

* renamed get_neab neuroelectro_api added in more typing and documentation tried to fix broken import paths

* aggressive typing probably broke some methods

* aggressive typing probably broke some methods

* fixed typing issues

* run black over everything

* fix bug caused by refactor of efel_evaluation revealed in continuous integration

* fix typing bug caused by refactor revealed in continuous integration

* fix typing bug caused by refactor revealed in continuous integration

* fix typing bug caused by refactor revealed in continuous integration

* fixed small bug in constructing neuronunit tests from allen data and neuronunit static models

* updated travis script

* shortened CI tests to avoid timeouts

* made a score obs prediction reformatter in data transport container, did more typing, removed more unnecessary methods used more inheritance in BPO

* reduced cell optimization

* almost ready for pull request take two

* changes

* files changed

* fix skip decorators so that CI works again

* changed circle ci config file to point to BPO circle-ci-branch

* gentle refactor and typing

* ran black over everything

* ran black over everything

* typing accident fixup

* fix ci requirement accident

* fixing ci dependency issue

* merge circle ci

* merge circle ci

* fixed None return type

* refactor unit testing return type

* refactored unit tests

* fixed tab spaces issue

* unit test refactor

* update unit test for refactor rheobase_dtc_tests

* refactor small test

* refactor target current into method

* fixed new method missing argument

* fixed new method missing argument

* fix

* update code

* passing travis tests

* update content

* speed up travis ci unit tests

* speed up travis ci unit tests

* make opt work passable on shorter test duration ci

* reduce travis burden

* reduce travis burden

* make more unit tests pass

* clean up for passing more unit tests, especially import tests

* overall cleaned up unit tests; this commit represents greater test passing than dev branch

* overall cleaned up unit tests; this commit represents greater test passing than dev branch

* update for PR

* update for PR

* applied black to all files again

* Update README.md

* izhi optimization slowed down checking out what went wrong

* very affected by mutpb, eta, cxptb

* very affected by mutpb, eta, cxptb

* push changes to ci

* update

* update

* last commit before going backwards through reflog

* make stale branch functional again

* Jan 28th end of day

* found new problems with dtc/model param override in the dtc class; made variance-explained error possible; identified conceptually that brute force is necessary to optimize; made algorithms use a functional, time-diminishing eta

* factored out redundant rheobase seeking method

* added in some comments

* improved documentation

* refactor optimization_management documentation improvements simplify return value of functions

* removed backends deprecated/not supported

* removed backends deprecated/not supported

* refactor optimization_management documentation improvements simplify return value of functions

* refactor optimization_management documentation improvements simplify return value of functions

* fixing rheobase solving management code

* applied black updated methods called in Allen API

* applied black to unit test directory, and made it so recursively importing everything should work in theory over CI

* update code for passing CI

* ran black over all neuronunit test files

* rheobase test on CI

* rheobase test on CI

* update ci

* elitism in bpo via neuronunit flag in constructor

* ci will probably work again now

* ci fix

* update ci unit test passing

* shorter build

* turns out the right efel package is important

* updated travis ci build

* update travis build

Co-authored-by: Mark Watts <mwatts15@users.noreply.github.com>
Co-authored-by: Zhiwei <zhi.wei.liang@outlook.com>
Co-authored-by: Russell Jarvis <russelljarvis@protonmail.com>
Co-authored-by: Richard Gerkin <rgerkin@asu.edu>
5 people committed Feb 5, 2021
1 parent 701c33a commit e9d41e4
Showing 130 changed files with 8,706 additions and 78,154 deletions.
59 changes: 59 additions & 0 deletions .circleci/config.yml
@@ -0,0 +1,59 @@
defaults: &defaults
  working_directory: ~/markovmodel/PyEMMA
  docker:
    - image: continuumio/miniconda3

inst_conda_bld: &inst_conda_bld
  - run: conda config --add channels conda-forge
  - run: conda config --set always_yes true
  - run: conda config --set quiet true
  - run: conda install conda-build

version: 2

jobs:
  build:
    <<: *defaults
    parallelism: 1
    steps:
      - checkout
      - run: git fetch --unshallow || true
      - run: apt-get install -y cpp gcc
      - run: apt-get install -y libx11-6 python-dev git build-essential
      - run: apt-get install -y autoconf automake gcc g++ make gfortran
      - run: apt-get install -y python-tables
      - run: apt-get install -y libhdf5-serial-dev

      - run: conda config --add channels conda-forge
      - run: conda config --set always_yes true
      - run: conda config --set quiet true
      - run: conda install conda-build
      - run: pip install pip --upgrade;
      - run: conda install numpy;
      - run: conda install numba;
      - run: conda install dask;
      - run: pip install tables
      - run: pip install scipy==1.5.4
      - run: pip install coverage
      - run: pip install cython
      - run: pip install asciiplotlib;
      - run: pip install ipfx
      - run: pip install streamlit
      - run: pip install sklearn
      - run: pip install seaborn
      - run: pip install frozendict
      #- run: pip install plotly
      - run: pip install allensdk==0.16.3
      - run: pip install --upgrade colorama
      - run: pip install -e .
      - run: rm -rf /opt/conda/lib/python3.8/site-packages/sciunit
      - run: git clone -b neuronunit https://github.com/russelljjarvis/jit_hub.git
      - run: cd jit_hub; pip install -e .; cd ..;
      - run: git clone -b neuronunit_reduced_cells https://github.com/russelljjarvis/BluePyOpt.git
      - run: cd BluePyOpt; pip install -e .
      - run: git clone -b dev https://github.com/russelljjarvis/sciunit.git

      - run: cd sciunit; pip install -e .; cd ..;
      - run: pip install git+https://github.com/russelljjarvis/eFEL
      - run: cd neuronunit/unit_test; python rheobase_dtc_test.py
      - run: cd neuronunit/unit_test; python -m unittest scores_unit_test.py
8 changes: 7 additions & 1 deletion .travis.yml
@@ -21,13 +21,19 @@ install:
- conda info -a
- pip install -U pip
- pip install .
- pip install sklearn
- pip install seaborn
- pip install coveralls
- pip install pylmeasure # required by morphology tests
- sh build.sh

######################################################

script:
- export NC_HOME='.' # NeuroConstruct isn't used but tests need this
# variable set to pass.
- sh test.sh
- cd neuronunit/unit_test; python -m unittest scores_unit_test.py; cd -;
- cd neuronunit/unit_test; python -m unittest rheobase_dtc_test.py; cd -;
#- sh test.sh
after_success:
- coveralls
4 changes: 4 additions & 0 deletions README.md
@@ -1,3 +1,7 @@
### Circle CI russelljjarvis/optimization build:
[![Build Status](https://circleci.com/gh/russelljjarvis/neuronunit/tree/optimization.svg?style=svg)](https://app.circleci.com/pipelines/github/russelljjarvis/neuronunit/)
### Travis CI scidash/optimization build:
[![Travis](https://travis-ci.org/scidash/neuronunit.svg?branch=optimization)](https://travis-ci.org/scidash/neuronunit?branch=optimization)
| Master | Dev |
| ------------- | ------------- |
| [![Travis](https://travis-ci.org/scidash/neuronunit.svg?branch=master)](https://travis-ci.org/scidash/neuronunit) | [![Travis](https://travis-ci.org/scidash/neuronunit.svg?branch=dev)](https://travis-ci.org/scidash/neuronunit) |
31 changes: 31 additions & 0 deletions build.sh
@@ -0,0 +1,31 @@
apt-get install -y cpp gcc
apt-get install -y libx11-6 python-dev git build-essential
apt-get install -y autoconf automake gcc g++ make gfortran
apt-get install -y python-tables
apt-get install -y libhdf5-serial-dev
conda install numpy;
conda install numba;
conda install dask;
pip install pip --upgrade;
pip install tables
pip install scipy==1.5.4
pip install -e .
pip install coverage
git clone -b neuronunit https://github.com/russelljjarvis/jit_hub.git
cd jit_hub; pip install -e .; cd ..;
pip install cython
pip install asciiplotlib;
git clone -b neuronunit_reduced_cells https://github.com/russelljjarvis/BluePyOpt.git
cd BluePyOpt; pip install -e .
pip install git+https://github.com/russelljjarvis/eFEL
pip install ipfx
pip install streamlit
pip install sklearn
pip install seaborn
pip install frozendict
pip install plotly
pip install --upgrade colorama
rm -rf /opt/conda/lib/python3.8/site-packages/sciunit
git clone -b dev https://github.com/russelljjarvis/sciunit.git
cd sciunit; pip install -e .; cd ..;
pip install allensdk==0.16.3
9 changes: 9 additions & 0 deletions codecov.yml
@@ -0,0 +1,9 @@
coverage:
  range: "90...100"

  status:
    project:
      default:
        target: "90%"
        threshold: "5%"
        patch: false
16 changes: 12 additions & 4 deletions environment.yml
@@ -6,7 +6,15 @@ dependencies:
  - pip:
    - neo==0.4
    - elephant
    - scoop
    - git+http://github.com/scidash/sciunit@dev#egg=sciunit-1.5.6
    - git+http://github.com/rgerkin/AllenSDK@python3.5#egg=allensdk-0.12.4.1
    - git+http://github.com/rgerkin/pyNeuroML@master#egg=pyneuroml-0.2.3
    - dask
    - numba
    - streamlit
    - sklearn
    - seaborn
    - frozendict
    - plotly
    - asciiplotlib
    - ipfx
    - git+https://github.com/russelljjarvis/jit_jub@neuronunit
    - git+https://github.com/russelljjarvis/BluePyOpt@neuronunit_reduced_cells
    - git+https://github.com/russelljjarvis/sciunit@dev
3 changes: 3 additions & 0 deletions neuronunit/allenapi/__init__.py
@@ -0,0 +1,3 @@
"""Allen API for NeuronUnit"""

import warnings
238 changes: 238 additions & 0 deletions neuronunit/allenapi/aibs.py
@@ -0,0 +1,238 @@
"""NeuronUnit module for interaction with the Allen Brain Insitute
Cell Types database"""
# import logging
# logger = logging.getLogger(name)
# logging.info("test")
import matplotlib as mpl

try:
mpl.use("agg")
except:
pass
import matplotlib.pyplot as plt
import shelve
import requests
import numpy as np
import quantities as pq
from allensdk.api.queries.cell_types_api import CellTypesApi
from allensdk.core.cell_types_cache import CellTypesCache
from allensdk.api.queries.glif_api import GlifApi
import os
import pickle
from allensdk.api.queries.biophysical_api import BiophysicalApi
from neuronunit.optimization.data_transport_container import DataTC

# from allensdk.model.glif.glif_neuron import GlifNeuron

from allensdk.core.cell_types_cache import CellTypesCache
from allensdk.ephys.extract_cell_features import extract_cell_features
from collections import defaultdict
from allensdk.core.nwb_data_set import NwbDataSet

from neuronunit import models
from neo.core import AnalogSignal
import quantities as qt
from types import MethodType

from allensdk.ephys.extract_cell_features import extract_cell_features
from collections import defaultdict
from allensdk.core.cell_types_cache import CellTypesCache

import neo
from elephant.spike_train_generation import threshold_detection
from quantities import mV, ms
from numba import jit
import sciunit
import math
import pdb
from allensdk.ephys.extract_cell_features import extract_cell_features


def is_aibs_up():
    """Check whether the AIBS Cell Types Database API is working."""
    url = (
        "http://api.brain-map.org/api/v2/data/query.xml?criteria=model"
        "::Specimen,rma::criteria,[id$eq320654829],rma::include,"
        "ephys_result(well_known_files(well_known_file_type"
        "[name$eqNWBDownload]))"
    )
    request = requests.get(url)
    return request.status_code == 200
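
The availability probe above can be exercised without the `requests` dependency or a live network connection. Below is a hypothetical stdlib-only analogue (`is_endpoint_up` is an illustrative name, not part of NeuronUnit) whose HTTP opener can be stubbed out in tests:

```python
from urllib import request, error


def is_endpoint_up(url, opener=request.urlopen):
    """Return True when the endpoint answers with HTTP 200, False otherwise."""
    try:
        with opener(url, timeout=10) as resp:
            return resp.status == 200
    except error.URLError:
        return False
```

Injecting the opener keeps the health check unit-testable offline, which is useful when CI machines cannot reach the Allen API.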


def get_observation(dataset_id, kind, cached=True, quiet=False):
    """Get an observation of kind `kind` from the dataset with id
    `dataset_id`, optionally using a value cached on a previous retrieval."""
    db = shelve.open("aibs-cache") if cached else {}
    identifier = "%d_%s" % (dataset_id, kind)
    if identifier in db:
        print(
            "Getting cached %s data value from AIBS dataset %s"
            % (kind.title(), dataset_id)
        )
        value = db[identifier]
    else:
        print(
            "Getting %s data value from AIBS dataset %s"
            % (kind.title(), dataset_id)
        )
        ct = CellTypesApi()
        cmd = ct.get_cell(dataset_id)  # Cell metadata

        if kind == "rheobase":
            if "ephys_features" in cmd:
                value = cmd["ephys_features"][0]["threshold_i_long_square"]  # newer API
            else:
                value = cmd["ef__threshold_i_long_square"]  # older API

            value = np.round(value, 2)  # Round to nearest hundredth of a pA.
            value *= pq.pA  # Apply units.
        else:
            value = cmd[kind]

        db[identifier] = value

    if cached:
        db.close()
    return {"value": value}


def get_value_dict(experiment_params, sweep_ids, kind):
    """Get a dictionary of data values from the experiment.
    A candidate method for replacing `get_observation`;
    this fix is necessary due to changes in the allensdk.
    Warning: together with `get_sp`, this method may not properly
    convey the meaning of `get_observation`.
    """
    if kind == "rheobase":
        sp = get_sp(experiment_params, sweep_ids)
        value = sp["stimulus_absolute_amplitude"]
        value = np.round(value, 2)  # Round to nearest hundredth of a pA.
        value *= pq.pA  # Apply units.
        return {"value": value}


"""Auxiliary helper functions for analysis of spiking."""


def find_nearest(array, value):
array = np.asarray(array)
idx = (np.abs(array - value)).argmin()
return (array[idx], idx)
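
`find_nearest` relies on NumPy broadcasting and `argmin`; the same idea in dependency-free Python (a sketch for illustration, not the project's code) is:

```python
def find_nearest_py(values, target):
    """Return (nearest_value, index) for the element of `values`
    closest to `target` -- a pure-Python analogue of find_nearest."""
    idx = min(range(len(values)), key=lambda i: abs(values[i] - target))
    return values[idx], idx
```

This is the helper `inject_square_current` uses to pick the recorded sweep whose stimulus amplitude best matches the requested injection.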


def inject_square_current(model, current):
    """Select the recorded sweep whose stimulus amplitude is nearest to
    `current`, and store its response on the model as `_vm`."""
    if isinstance(current, dict):
        current = float(current["amplitude"])
    data_set = model.data_set
    numbers = data_set.get_sweep_numbers()
    injections = [np.max(data_set.get_sweep(sn)["stimulus"]) for sn in numbers]
    (nearest, idx) = find_nearest(injections, current)
    index = np.asarray(numbers)[idx]
    sweep_data = data_set.get_sweep(index)
    temp_vm = sweep_data["response"]
    injection = sweep_data["stimulus"]
    sampling_rate = sweep_data["sampling_rate"]
    vm = AnalogSignal(temp_vm, sampling_rate=sampling_rate * qt.Hz, units=qt.V)
    model._vm = vm
    return model._vm


def get_membrane_potential(model):
    return model._vm


def get_spike_train(vm, threshold=0.0 * mV):
    """
    Inputs:
    vm: a neo.core.AnalogSignal corresponding to a membrane potential trace.
    threshold: the value (in mV) above which vm has to cross for there
    to be a spike. Scalar float.
    Returns:
    a neo.core.SpikeTrain containing the times of spikes.
    """
    spike_train = threshold_detection(vm, threshold=threshold)
    return spike_train


def get_spike_count(model):
    vm = model.get_membrane_potential()
    train = get_spike_train(vm)
    return len(train)


def appropriate_features(sweeps):
    """Extract square-pulse stimulus parameters from ramp sweeps.
    Note: `sweeps` was an undefined global in the committed code; it is
    taken as an argument here so the function is self-contained."""
    current = {}
    for s in sweeps:
        if s["ramp"]:
            print([(k, v) for k, v in s.items()])
            current["amplitude"] = s["stimulus_absolute_amplitude"]
            current["duration"] = s["stimulus_duration"]
            current["delay"] = s["stimulus_start_time"]
    return current


def get_features(specimen_id=485909730):
    """Compute standard ephys features for a specimen via the allensdk."""
    ctc = CellTypesCache()  # the committed code relied on a module-level `ctc`
    data_set = ctc.get_ephys_data(specimen_id)
    sweeps = ctc.get_ephys_sweeps(specimen_id)

    # group the sweeps by stimulus
    sweep_numbers = defaultdict(list)
    for sweep in sweeps:
        sweep_numbers[sweep["stimulus_name"]].append(sweep["sweep_number"])

    # calculate features
    cell_features = extract_cell_features(
        data_set,
        sweep_numbers["Ramp"],
        sweep_numbers["Short Square"],
        sweep_numbers["Long Square"],
    )
    return cell_features
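
The sweep-grouping step inside `get_features` is a standard `defaultdict` pattern and is easy to check in isolation (`group_sweeps` is an illustrative helper, not NeuronUnit API):

```python
from collections import defaultdict


def group_sweeps(sweeps):
    """Map each stimulus name to the list of sweep numbers that used it,
    the same grouping get_features performs before feature extraction."""
    sweep_numbers = defaultdict(list)
    for sweep in sweeps:
        sweep_numbers[sweep["stimulus_name"]].append(sweep["sweep_number"])
    return sweep_numbers
```

Because `defaultdict(list)` creates the list on first access, no stimulus name needs to be pre-registered before appending.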


def get_sweep_params(dataset_id, sweep_id):
    """Get the sweep parameters corresponding to the sweep with id
    `sweep_id` from the dataset with id `dataset_id`."""
    ct = CellTypesApi()
    experiment_params = ct.get_ephys_sweeps(dataset_id)
    sp = None
    for sp in experiment_params:
        if sp["id"] == sweep_id:
            sweep_num = sp["sweep_number"]
            if sweep_num is None:
                msg = "Sweep with ID %d not found in dataset with ID %d."
                raise Exception(msg % (sweep_id, dataset_id))
            break
    return sp


def get_sp(experiment_params, sweep_ids):
    """Get sweep parameters.
    A candidate method for replacing `get_sweep_params`;
    this fix is necessary due to changes in the allensdk.
    Warning: this method may not properly convey the original meaning
    of `get_sweep_params`.
    """
    sp = None
    for sp in experiment_params:
        for sweep_id in sweep_ids:
            if sp["id"] == sweep_id:
                sweep_num = sp["sweep_number"]
                if sweep_num is None:
                    raise Exception("Sweep with ID %d not found." % sweep_id)
                break
    return sp
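
`get_sp` scans the parameter list for a matching sweep id; the search itself can be written as a small, testable helper (an illustrative sketch with an explicit early return, not the committed function, which falls through to the last element when nothing matches):

```python
def first_matching_sweep(experiment_params, sweep_ids):
    """Return the first parameter dict whose 'id' is in sweep_ids,
    or None when no sweep matches."""
    wanted = set(sweep_ids)
    for sp in experiment_params:
        if sp["id"] in wanted:
            return sp
    return None
```

Converting `sweep_ids` to a set makes each membership test O(1), which matters when a dataset has many sweeps.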
