Closed
Changes from all commits
Commits
53 commits
259648e
[nixio] Read linked metadata properties into annotations
achilleas-k Sep 22, 2018
9b00815
[nixio] Remove deprecated file backend argument
achilleas-k Jul 18, 2018
7032d68
[nixio] nix.Value removal: New metadata compatibility
achilleas-k Jul 18, 2018
ce09164
[nixio] Inherit properties from linked sections
achilleas-k Sep 23, 2018
92e7164
[nixio] Change filename for NIXRawIO tests
achilleas-k Sep 23, 2018
1087151
[setup] Update nixio minimum version
achilleas-k Sep 23, 2018
9d8906a
[doc] Update min version for nixio in install docs
achilleas-k Sep 23, 2018
4f39a64
[testing] Update circleci config for newer nixio version
achilleas-k Sep 23, 2018
6aeba1b
Implementation of the "raw" Multi Channel System (MCS) IO at rawio le…
samuelgarcia Sep 27, 2018
0ee0e5e
pep8 + file for testing.
samuelgarcia Oct 2, 2018
c9ef6e5
add `Event.to_epoch` method
apdavison Oct 22, 2018
80a5d64
Init intanrawio
samuelgarcia Oct 31, 2018
57816f9
WIP intan
samuelgarcia Nov 5, 2018
139930b
intan RHS WIP
samuelgarcia Nov 5, 2018
98abd5d
WIP RHD files
samuelgarcia Nov 5, 2018
c21bc62
WIP RHD and RHS
samuelgarcia Nov 6, 2018
1e59a9f
Some clean.
samuelgarcia Nov 6, 2018
d75aa5e
add neo.io class for intanio
samuelgarcia Nov 7, 2018
c64a00f
pep8 clean
samuelgarcia Nov 7, 2018
c3832c4
Docstring fixes
apdavison Nov 15, 2018
0b18caf
Merge pull request #577 from samuelgarcia/rawmcsrawio
apdavison Nov 15, 2018
5377c3a
Docstring fixes
apdavison Nov 15, 2018
36eea92
Debug with rhd and files at gin.
samuelgarcia Nov 15, 2018
937946b
Merge pull request #576 from G-Node/nixio-new-metadata
JuliaSprenger Nov 15, 2018
3801df9
Fixed a string to be 'b' so that the decode() function works in Python3
mdenker Nov 15, 2018
05d47fa
Fixed a spelling mistake in the test doku
mdenker Nov 15, 2018
e159957
Doc for axonio read_protocol
samuelgarcia Nov 15, 2018
0930eab
Fix some bugs which I see on my laptop, although apparently they are …
apdavison Nov 16, 2018
e8c8f7e
Introduce additional checks on argument values when creating Epochs; …
apdavison Nov 16, 2018
9eb1ef3
Update docstring
apdavison Nov 16, 2018
c02a20c
Merge pull request #601 from apdavison/array-fixes
samuelgarcia Nov 16, 2018
de8b590
Merge pull request #598 from INM-6/fix/blackrockiopy3
JuliaSprenger Nov 16, 2018
521976a
Merge branch 'arr_anns/test3' of https://github.com/bjoern1001001/pyt…
JuliaSprenger Nov 16, 2018
1a9d2d9
fromstring warnings
samuelgarcia Nov 16, 2018
d296f1d
for regex replace string pattern with raw string r'....' pattern
samuelgarcia Nov 16, 2018
dbdc535
Fixed some problems with tests
apdavison Nov 16, 2018
6cfa5f1
docstring update
apdavison Nov 16, 2018
c943510
Stricter rules for Epoch produced error in NixIO, fixed here
apdavison Nov 16, 2018
17f5473
Merge pull request #604 from samuelgarcia/fix_warnings_0.7
apdavison Nov 16, 2018
ce91521
Merge pull request #599 from samuelgarcia/axonio_doc
apdavison Nov 19, 2018
aeac408
Merge pull request #593 from samuelgarcia/intanrawio
apdavison Nov 19, 2018
76a5584
Merge branch 'arr_anns/test3' of https://github.com/bjoern1001001/pyt…
JuliaSprenger Nov 21, 2018
8389f71
Documentation updates and minor changes
JuliaSprenger Nov 21, 2018
dd59f93
Fix array annotation test for warnings
JuliaSprenger Nov 21, 2018
953972b
Fix failing test due to invalid array annotation
JuliaSprenger Nov 22, 2018
07e1134
Merge branch 'master' of github.com:NeuralEnsemble/python-neo into bj…
JuliaSprenger Nov 22, 2018
078912b
Merge pull request #586 from apdavison/event-to-epoch
mdenker Nov 22, 2018
a9f8b30
Minor change in test implementation
JuliaSprenger Nov 23, 2018
9217af1
Resolve merge conflict
JuliaSprenger Nov 23, 2018
78e241a
Travis testing
JuliaSprenger Nov 23, 2018
5e95a0f
Clean up code
JuliaSprenger Nov 23, 2018
89d9d2b
Pep8ification
JuliaSprenger Nov 23, 2018
ef6ad71
Manual pep8 correction
JuliaSprenger Nov 23, 2018
2 changes: 1 addition & 1 deletion .circleci/requirements_testing.txt
@@ -3,7 +3,7 @@ h5py
igor
klusta
tqdm
-nixio>=1.4.3
+nixio>=1.5.0b2
axographio>=0.3.1
matplotlib
ipython
2 changes: 1 addition & 1 deletion doc/source/install.rst
@@ -28,7 +28,7 @@ Neo will still install but the IO module that uses them will fail on loading:
* h5py >= 2.5 for Hdf5IO, KwikIO
* klusta for KwikIO
* igor >= 0.2 for IgorIO
-* nixio >= 1.2 for NixIO
+* nixio >= 1.5 for NixIO
* stfio for StimfitIO


36 changes: 16 additions & 20 deletions neo/core/basesignal.py
@@ -10,7 +10,7 @@
http://docs.scipy.org/doc/numpy/user/basics.subclassing.html

In brief:
* Constructor :meth:`__new__` for :class:`BaseSignal` doesn't exist.
* Constructor :meth:`__new__` for :class:`BaseSignal` doesn't exist.
Only child objects :class:`AnalogSignal` and :class:`IrregularlySampledSignal`
can be created.
'''
@@ -39,9 +39,9 @@ class BaseSignal(DataObject):
This class contains all common methods of both child classes.
It uses the following child class attributes:

:_necessary_attrs: a list of the attributes that the class must have.
:_necessary_attrs: a list of the attributes that the class must have.

:_recommended_attrs: a list of the attributes that the class may
:_recommended_attrs: a list of the attributes that the class may
optionally have.
'''

@@ -60,9 +60,9 @@ def __array_finalize__(self, obj):

User-specified values are only relevant for construction from
constructor, and these are set in __new__ in the child object.
Then they are just copied over here. Default values for the
Then they are just copied over here. Default values for the
specific attributes for subclasses (:class:`AnalogSignal`
and :class:`IrregularlySampledSignal`) are set in
and :class:`IrregularlySampledSignal`) are set in
:meth:`_array_finalize_spec`
'''
super(BaseSignal, self).__array_finalize__(obj)
@@ -90,7 +90,7 @@ def _rescale(self, signal, units=None):
'''
Check that units are present, and rescale the signal if necessary.
This is called whenever a new signal is
created from the constructor. See :meth:`__new__' in
created from the constructor. See :meth:`__new__' in
:class:`AnalogSignal` and :class:`IrregularlySampledSignal`
'''
if units is None:
@@ -183,8 +183,8 @@ def _copy_data_complement(self, other):
setattr(self, attr[0], getattr(other, attr[0], None))
setattr(self, 'annotations', getattr(other, 'annotations', None))

-# Note: Array annotations cannot be copied because length of data can be changed
-# here which would cause inconsistencies
+# Note: Array annotations cannot be copied because length of data can be changed # here
+# which would cause inconsistencies

def __rsub__(self, other, *args):
'''
@@ -264,29 +264,25 @@ def merge(self, other):
kwargs[name] = attr_self
else:
kwargs[name] = "merge(%s, %s)" % (attr_self, attr_other)
-merged_annotations = merge_annotations(self.annotations,
-                                       other.annotations)
+merged_annotations = merge_annotations(self.annotations, other.annotations)
kwargs.update(merged_annotations)

kwargs['array_annotations'] = self._merge_array_annotations(other)

-signal = self.__class__(stack, units=self.units, dtype=self.dtype,
-                        copy=False, t_start=self.t_start,
-                        sampling_rate=self.sampling_rate,
-                        **kwargs)
+signal = self.__class__(stack, units=self.units, dtype=self.dtype, copy=False,
+                        t_start=self.t_start, sampling_rate=self.sampling_rate, **kwargs)
signal.segment = self.segment

if hasattr(self, "lazy_shape"):
signal.lazy_shape = merged_lazy_shape

# merge channel_index (move to ChannelIndex.merge()?)
if self.channel_index and other.channel_index:
-signal.channel_index = ChannelIndex(
-    index=np.arange(signal.shape[1]),
-    channel_ids=np.hstack([self.channel_index.channel_ids,
-                           other.channel_index.channel_ids]),
-    channel_names=np.hstack([self.channel_index.channel_names,
-                             other.channel_index.channel_names]))
+signal.channel_index = ChannelIndex(index=np.arange(signal.shape[1]),
+                                    channel_ids=np.hstack(
+                                        [self.channel_index.channel_ids, other.channel_index.channel_ids]),
+                                    channel_names=np.hstack(
+                                        [self.channel_index.channel_names, other.channel_index.channel_names]))
else:
signal.channel_index = ChannelIndex(index=np.arange(signal.shape[1]))

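The reformatted `ChannelIndex` merge above stacks the channel ids and names of both signals and assigns a fresh running index. A plain-numpy sketch of that bookkeeping (the function name and the dict return are illustrative — the real code constructs a neo `ChannelIndex` object):

```python
import numpy as np

def merge_channel_metadata(ids_a, names_a, ids_b, names_b):
    """Stack channel metadata of two signals, as in the merge above:
    the merged signal gets a fresh 0..N-1 index while the ids and
    names of both inputs are concatenated in order."""
    n_channels = len(ids_a) + len(ids_b)
    return {
        "index": np.arange(n_channels),
        "channel_ids": np.hstack([ids_a, ids_b]),
        "channel_names": np.hstack([names_a, names_b]),
    }

merged = merge_channel_metadata([0, 1], ["ch0", "ch1"], [7], ["ch7"])
# merged["index"]: [0 1 2]; merged["channel_ids"]: [0 1 7]
```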
99 changes: 53 additions & 46 deletions neo/core/dataobject.py
@@ -12,16 +12,23 @@
import numpy as np
from neo.core.baseneo import BaseNeo, _check_annotations

# TODO: If yes, then should array annotations as a whole also be a property?


def _normalize_array_annotations(value, length):
"""Check consistency of array annotations

"""
Recursively check that value is either an array or list containing only "simple" types
(number, string, date/time) or is a dict of those.
:return The array_annotations from value in correct form
:raises ValueError: In case value is not accepted as array_annotation(s)

Args:
:value: (np.ndarray, list or dict) value to be checked for consistency
:length: (int) required length of the array annotation

Returns:
np.ndarray The array_annotations from value in correct form

Raises:
ValueError: In case value is not accepted as array_annotation(s)

"""

# First stage, resolve dict of annotations into single annotations
@@ -33,16 +40,14 @@ def _normalize_array_annotations(value, length):

elif value is None:
raise ValueError("Array annotations must not be None")
-# If not array annotation, pass on to regular check and make it a list,
-# that is checked again
+# If not array annotation, pass on to regular check and make it a list, that is checked again
# This covers array annotations with length 1
-elif not isinstance(value, (list, np.ndarray)) or \
-        (isinstance(value, pq.Quantity) and value.shape == ()):
+elif not isinstance(value, (list, np.ndarray)) or (
+        isinstance(value, pq.Quantity) and value.shape == ()):
_check_annotations(value)
value = _normalize_array_annotations(np.array([value]), length)

-# If array annotation, check for correct length,
-# only single dimension and allowed data
+# If array annotation, check for correct length, only single dimension and allowed data
else:
# Get length that is required for array annotations, which is equal to the length
# of the object's data
@@ -56,26 +61,26 @@
value = np.ndarray((0,))
val_length = own_length
else:
-# Note: len(o) also works for np.ndarray, it then uses the outmost dimension,
+# Note: len(o) also works for np.ndarray, it then uses the first dimension,
# which is exactly the desired behaviour here
val_length = len(value)

if not own_length == val_length:
-raise ValueError("Incorrect length of array annotation: {} != {}".
-                 format(val_length, own_length))
+raise ValueError(
+    "Incorrect length of array annotation: {} != {}".format(val_length, own_length))

# Local function used to check single elements of a list or an array
# They must not be lists or arrays and fit the usual annotation data types
def _check_single_elem(element):
# Nested array annotations not allowed currently
# So if an entry is a list or a np.ndarray, it's not allowed,
# except if it's a quantity of length 1
if isinstance(element, list) or \
(isinstance(element, np.ndarray) and not
(isinstance(element, pq.Quantity) and element.shape == ())):
# If element is a list or a np.ndarray, it's not conform except if it's a quantity of
# length 1
if isinstance(element, list) or (isinstance(element, np.ndarray) and not (
isinstance(element, pq.Quantity) and (
element.shape == () or element.shape == (1,)))):
raise ValueError("Array annotations should only be 1-dimensional")
if isinstance(element, dict):
-raise ValueError("Dicts are not supported array annotations")
+raise ValueError("Dictionaries are not supported as array annotations")

# Perform regular check for elements of array or list
_check_annotations(element)
@@ -86,19 +91,19 @@ def _check_single_elem(element):
# Thus just performing a check on the first element is enough
# Even if it's a pq.Quantity, which can be scalar or array, this is still true
# Because a np.ndarray cannot contain scalars and sequences simultaneously
-try:
+# If length of data is 0, then nothing needs to be checked
+if len(value):
     # Perform check on first element
     _check_single_elem(value[0])
-except IndexError:
-    # If length of data is 0, then nothing needs to be checked
-    pass

return value

# In case of list, it needs to be ensured that all data are of the same type
else:

# Conversion to numpy array makes all elements same type
# Converts elements to most general type

try:
value = np.array(value)
# Except when scalar and non-scalar values are mixed, this causes conversion to fail
@@ -137,17 +142,25 @@ class DataObject(BaseNeo, pq.Quantity):
- returning it as pq.Quantity or np.ndarray
- handling of array_annotations

-Array_annotations are a kind of annotations that contain metadata for every data point,
+Array_annotations are a kind of annotation that contains metadata for every data point,
i.e. per timestamp (in SpikeTrain, Event and Epoch) or signal channel (in AnalogSignal
and IrregularlySampledSignal).
They can contain the same data types as regular annotations, but are always represented
as numpy arrays of the same length as the number of data points of the annotated neo object.

Args:
name (str, optional): Name of the Neo object
description (str, optional): Human readable string description of the Neo object
file_origin (str, optional): Origin of the data contained in this Neo object
array_annotations (dict, optional): Dictionary containing arrays / lists which annotate
individual data points of the Neo object.
kwargs: regular annotations stored in a separate annotation dictionary
'''

def __init__(self, name=None, description=None, file_origin=None, array_annotations=None,
**annotations):
"""
-This method is called from each data object and initializes the newly created object by
+This method is called by each data object and initializes the newly created object by
adding array annotations and calling __init__ of the super class, where more annotations
and attributes are processed.
"""
@@ -157,13 +170,14 @@ def __init__(self, name=None, description=None, file_origin=None, array_annotati
if array_annotations is not None:
self.array_annotate(**array_annotations)

-BaseNeo.__init__(self, name=name, description=description,
-                 file_origin=file_origin, **annotations)
+BaseNeo.__init__(self, name=name, description=description, file_origin=file_origin,
+                 **annotations)

def array_annotate(self, **array_annotations):

"""
-Add annotations (non-standardized metadata) as arrays to a Neo data object.
+Add array annotations (annotations for individual data points) as arrays to a Neo data
+object.

Example:

@@ -218,8 +232,6 @@ def _merge_array_annotations(self, other):
:return Merged array_annotations
'''

-# Make sure the user is notified for every object about which exact annotations are lost
-warnings.simplefilter('always', UserWarning)
merged_array_annotations = {}
omitted_keys_self = []
# Concatenating arrays for each key
@@ -234,7 +246,7 @@
except ValueError:
raise ValueError("Could not merge array annotations "
"due to different units")
-merged_array_annotations[key] = np.append(value, other_value)*value.units
+merged_array_annotations[key] = np.append(value, other_value) * value.units
else:
merged_array_annotations[key] = np.append(value, other_value)

@@ -243,17 +255,15 @@
omitted_keys_self.append(key)
continue
# Also save omitted keys from 'other'
-omitted_keys_other = [key for key in other.array_annotations
-                      if key not in self.array_annotations]
+omitted_keys_other = [key for key in other.array_annotations if
+                      key not in self.array_annotations]

# Warn if keys were omitted
if omitted_keys_other or omitted_keys_self:
warnings.warn("The following array annotations were omitted, because they were only "
"present in one of the merged objects: {} from the one that was merged "
-              "into and {} from the one that was merged into the other".
-              format(omitted_keys_self, omitted_keys_other), UserWarning)
-
-# Reset warning filter to default state
-warnings.simplefilter("default")
+              "into and {} from the one that was merged into the other"
+              "".format(omitted_keys_self, omitted_keys_other), UserWarning)

# Return the merged array_annotations
return merged_array_annotations
Expand All @@ -270,9 +280,7 @@ def rescale(self, units):
return self.copy()

# Rescale the object into a new object
-# Works for all objects currently
-obj = self.duplicate_with_new_data(signal=self.view(pq.Quantity).rescale(dim),
-                                   units=units)
+obj = self.duplicate_with_new_data(signal=self.view(pq.Quantity).rescale(dim), units=units)

# Expected behavior is deepcopy, so deepcopying array_annotations
obj.array_annotations = copy.deepcopy(self.array_annotations)
@@ -315,12 +323,11 @@ def _get_arr_ann_length(self):
This is the last dimension of every object.
:return Required length of array annotations for this object
"""
-# Number of items is last dimension in current objects
-# This holds true for the current implementation
+# Number of items is last dimension in of data object
+# This method should be overridden in case this changes
try:
length = self.shape[-1]
-# XXX This is because __getitem__[int] returns a scalar Epoch/Event/SpikeTrain
+# Note: This is because __getitem__[int] returns a scalar Epoch/Event/SpikeTrain
# To be removed if __getitem__[int] is changed
except IndexError:
length = 1
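The checks applied by `_normalize_array_annotations` in the diff above can be summarised in a plain-numpy sketch (a simplified illustration — the real function additionally handles dicts of annotations, `pq.Quantity` values, and mixed-type lists, and the function name here is a stand-in):

```python
import numpy as np

def normalize_array_annotation(value, length):
    """Simplified array-annotation check: the value must be a
    1-dimensional array-like whose length matches the number of data
    points; a scalar is wrapped into a length-1 array and re-checked."""
    if value is None:
        raise ValueError("Array annotations must not be None")
    if not isinstance(value, (list, np.ndarray)):
        # Scalar: wrap and re-check (covers objects with a single data point)
        return normalize_array_annotation(np.array([value]), length)
    value = np.asarray(value)
    if value.ndim != 1:
        raise ValueError("Array annotations should only be 1-dimensional")
    if len(value) != length:
        raise ValueError("Incorrect length of array annotation: "
                         "{} != {}".format(len(value), length))
    return value

normalize_array_annotation([1, 2, 3], 3)  # passes, returns an ndarray
```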