Improved conditions for closing nwb when using hdf5 backend #3150
Merged
alejoe91 merged 2 commits into SpikeInterface:main, Jul 8, 2024
Conversation

alejoe91 reviewed Jul 5, 2024
Comment on lines +629 to +643
```python
def _close_hdf5_file(self):
    has_hdf5_backend = hasattr(self, "_file")
    if has_hdf5_backend:
        import h5py

        main_file_id = self._file.id
        open_object_ids_main = h5py.h5f.get_obj_ids(main_file_id, types=h5py.h5f.OBJ_ALL)
        for object_id in open_object_ids_main:
            object_name = h5py.h5i.get_name(object_id).decode("utf-8")
            try:
                object_id.close()
            except:
                import warnings

                warnings.warn(f"Error closing object {object_name}")
```
Member
Since this is the same function as in the Sorting object, why not make it a single shared function?
```python
def _close_hdf5_file(extractor):
    has_hdf5_backend = hasattr(extractor, "_file")
    ....
```

Then you would just call it in `__del__` with `_close_hdf5_file(self)`.
Collaborator
Author
You are right that this repetition is unnecessary, but I wanted to use self; check out the latest commit. I implemented a mixin class that holds the methods that need the state of the nwb extractor and are common to both recording and sorting.
alejoe91 approved these changes Jul 8, 2024
So we have been battling this for a while. Our access pattern for nwb when using the hdf5 backend directly (instead of the pynwb API) leaves file references dangling.

Looking at this, I see that we keep references to some groups and datasets open even after closing the main reference. I am not sure whether it is the weak references or the way we access some properties where we don't copy at read.

At some point I will have to look into this more deeply, but meanwhile this should alleviate some of the problems. It uses the low-level h5py API to enumerate all the references attached to a particular file id and tries to close them when `__del__` is called. It did solve the issue I was facing, but I am too skeptical to think this is the final nail in the coffin.
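To make the dangling-reference problem concrete, here is a minimal illustration of the low-level call the fix relies on: `h5py.h5f.get_obj_ids` lists every identifier still attached to a file, so a dataset handle held by the extractor shows up alongside the file itself. The file name and access pattern here are assumptions for demonstration, not taken from SpikeInterface.

```python
import os
import h5py

# Create a small demo file (illustrative path, not from the PR).
path = "demo_close.h5"
with h5py.File(path, "w") as f:
    f.create_dataset("data", data=list(range(10)))

f = h5py.File(path, "r")
dataset = f["data"]  # this handle keeps a dataset id open

# Enumerate every open identifier attached to this file, mirroring the
# low-level call used in _close_hdf5_file.
open_ids = h5py.h5f.get_obj_ids(f.id, types=h5py.h5f.OBJ_ALL)
names = sorted(h5py.h5i.get_name(oid).decode("utf-8") for oid in open_ids)
print(names)  # the dataset is still open in addition to the file root

f.close()
os.remove(path)
```

Any such identifier left open when the `h5py.File` handle goes away is exactly the kind of dangling reference this PR tries to clean up in `__del__`.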