
Conversation


@dizcza commented Nov 2, 2020

The code to reproduce the bug:

import neo

# curl https://web.gin.g-node.org/INM-6/elephant-data/raw/master/dataset-1/dataset-1.h5 --output dataset-1.h5 --location
hdf5neo = neo.io.NeoHdf5IO("dataset-1.h5")
block = hdf5neo.read_block()

Accessing dataset.value raised a deprecation warning in h5py v2.x and became an AttributeError in v3.0.0. With this change, the HDF5 IO works with both h5py v2 and v3.
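
For reference, a minimal standalone sketch of the portable access pattern (the file and dataset names here are hypothetical, for illustration only): indexing with [()] works on both h5py v2 and v3, while .value no longer exists in v3.

import h5py

# Hypothetical file and dataset names, for illustration only.
with h5py.File("example.h5", "r") as f:
    node = f["t_start"]
    value = node[()]      # works on h5py v2 and v3
    # value = node.value  # deprecation warning in v2.x, AttributeError in v3.0.0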

Deprecation warning in v2.x:

.../site-packages/h5py/_hl/dataset.py:313: H5pyDeprecationWarning: dataset.value has been deprecated. Use dataset[()] instead.
  "dataset.value has been deprecated. Use dataset[()] instead.", H5pyDeprecationWarning)

Error in v3.0.0 (fixed by this PR):

      1 hdf5neo = neo.io.NeoHdf5IO("dataset-1.h5")
----> 2 block = hdf5neo.read_block()

~/anaconda3/envs/temp/lib/python3.6/site-packages/neo/io/hdf5io.py in read_block(self, lazy, **kargs)
     87         """
     88         assert not lazy, 'Do not support lazy'
---> 89         return self.read_all_blocks(lazy=lazy)[0]
     90 
     91     def _read_block(self, node):

~/anaconda3/envs/temp/lib/python3.6/site-packages/neo/io/hdf5io.py in read_all_blocks(self, lazy, merge_singles, **kargs)
     79         for name, node in self._data.items():
     80             if "Block" in name:
---> 81                 blocks.append(self._read_block(node))
     82         return blocks
     83 

~/anaconda3/envs/temp/lib/python3.6/site-packages/neo/io/hdf5io.py in _read_block(self, node)
     97         for name, child_node in node['segments'].items():
     98             if "Segment" in name:
---> 99                 block.segments.append(self._read_segment(child_node, parent=block))
    100 
    101         if len(node['recordingchannelgroups']) > 0:

~/anaconda3/envs/temp/lib/python3.6/site-packages/neo/io/hdf5io.py in _read_segment(self, node, parent)
    172         for name, child_node in node['spiketrains'].items():
    173             if "SpikeTrain" in name:
--> 174                 spiketrains.append(self._read_spiketrain(child_node, parent=segment))
    175         segment.spiketrains = spiketrains
    176 

~/anaconda3/envs/temp/lib/python3.6/site-packages/neo/io/hdf5io.py in _read_spiketrain(self, node, parent)
    203     def _read_spiketrain(self, node, parent):
    204         attributes = self._get_standard_attributes(node)
--> 205         t_start = self._get_quantity(node["t_start"])
    206         t_stop = self._get_quantity(node["t_stop"])
    207         # todo: handle sampling_rate, waveforms, left_sweep

~/anaconda3/envs/temp/lib/python3.6/site-packages/neo/io/hdf5io.py in _get_quantity(self, node)
    312 
    313     def _get_quantity(self, node):
--> 314         value = node.value
    315         unit_str = [x for x in node.attrs.keys() if "unit" in x][0].split("__")[1]
    316         units = getattr(pq, unit_str)

AttributeError: 'Dataset' object has no attribute 'value'
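
The change amounts to replacing the node.value access in _get_quantity with indexing. A minimal sketch of the fixed logic, written here as a standalone function rather than the actual NeoHdf5IO method in neo/io/hdf5io.py; the final value * units line is an assumption, not copied from the merged code.

import quantities as pq

def get_quantity(node):
    # h5py v2/v3 compatible read; dataset.value was removed in h5py v3.0.0
    value = node[()]
    # the unit is stored as an attribute key of the form "<prefix>__<unit>"
    unit_str = [x for x in node.attrs.keys() if "unit" in x][0].split("__")[1]
    units = getattr(pq, unit_str)
    return value * units  # assumption: the value is returned with its units attached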

@JuliaSprenger

Hi @dizcza! Thanks for fixing the master branch.

@JuliaSprenger JuliaSprenger merged commit e618198 into NeuralEnsemble:master Nov 2, 2020
@dizcza dizcza deleted the fix/hdf5io branch November 2, 2020 16:09