HDF5 backend for xray #66
Here's an elementary example of using HDF5 memory images to pass self-describing binary data between processes using pytables:

import numpy as np
import os
import tables

pipe_name = '/tmp/my-pipe'
driver = "H5FD_CORE"
my_array_name = "My array"
my_attribute = "Drummer is cool!"
my_title = "My title"

def child():
    # Build an in-memory HDF5 file (no backing store on disk).
    h5_file = tables.open_file("in-memory", title=my_title, mode="w",
                               driver=driver, driver_core_backing_store=0)
    h5_file.create_array(h5_file.root, "array",
                         np.array([0., -1., 1., -2., 2.]),
                         title=my_array_name)
    h5_file.root.array.attrs.my_attribute = my_attribute
    image = h5_file.get_file_image()
    h5_file.close()
    # The file image is raw bytes, so write it to the pipe in binary mode.
    pipeout = open(pipe_name, 'wb')
    pipeout.write(image)
    pipeout.flush()
    pipeout.close()

def parent():
    pipein = open(pipe_name, 'rb')
    image = pipein.read()
    # Re-open the HDF5 file directly from the in-memory image.
    h5_file = tables.open_file("in-memory", mode="r", driver=driver,
                               driver_core_image=image,
                               driver_core_backing_store=0)
    print("my_title is \"%s\"." % h5_file.title)
    print("my_attribute is \"%s\"." % h5_file.root.array.attrs.my_attribute)
    print("my_array_name is \"%s\"." % h5_file.root.array.title)
    print("array data is \"%s\"." % str(h5_file.root.array[:]))
    h5_file.close()

if not os.path.exists(pipe_name):
    os.mkfifo(pipe_name)

pid = os.fork()
if pid:
    parent()
else:
    child()
Thanks @ToddSmall!
I did a little bit of research into the HDF5 file format last night and how it maps onto the NetCDF data model: https://www.unidata.ucar.edu/software/netcdf/docs/netcdf/NetCDF_002d4-Format.html

HDF5 has a notion of "dimension scales" which implement shared dimensions. The bad news is that pytables does not support them, although h5py does. As @ToddSmall shows in his example above, pytables supports getting file images for HDF5 files, but unfortunately h5py does not implement file image operations. So it looks like there are not currently any existing solutions that will let us implement our data model in HDF5 with file images :(.

On the plus side, it does look like it would be pretty simple to implement the NetCDF4 file format directly via h5py. This is something worth considering, because the codebase for the h5py project looks much cleaner than netCDF4-python and has better test coverage. I can also verify that it is straightforward to open and interpret NetCDF4 files via pytables or h5py.
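For anyone curious, here is a rough sketch of what attaching a dimension scale looks like through h5py (file and variable names are just for illustration; make_scale assumes a reasonably recent h5py, older versions spell this dims.create_scale instead):

import h5py
import numpy as np

with h5py.File("dim-scales-example.h5", "w") as f:
    # A 1-D coordinate variable that will act as the shared dimension.
    x = f.create_dataset("x", data=np.arange(5.0))
    x.make_scale("x")

    # A data variable defined along that dimension.
    temperature = f.create_dataset("temperature", data=np.random.rand(5))
    temperature.dims[0].attach_scale(x)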
I'm really enjoying working with xray; it's so nice to be able to work with named and labeled dimensions -- no more remembering which axis is which! I'm not sure if this is relevant to this specific issue, but I am working for the most part with HDF5 files created using h5py. I'm only just learning about NetCDF-4, but my datasets comprise a number of 1D and 2D variables with shared dimensions, so I think my data is already very close to the right model. I have a couple of questions:

(1) If I have multiple datasets within an HDF5 file, each within a separate group, can I access those through xray?

(2) What would I need to add to my HDF5 files to make them fully compliant with the xray/NetCDF4 model? Is it just a question of creating and attaching dimension scales, or would I need to do something else as well?

Thanks in advance.
Glad you're enjoying xray! From your description it sounds like it should be relatively simple to get xray working with your dataset. The NetCDF4 format is a restricted subset of HDF5, so simply adding dimension scales should get you most of the way there.

Re: groups, each xray.Dataset corresponds to one HDF5 group. So while xray doesn't currently support groups, you could split your HDF5 dataset into separate files for each group and load those files using xray. Alternatively (if you feel ambitious) it shouldn't be too hard to get xray's NetCDF4DataStore (backends.netCDF4_.py) to work with groups, allowing you to do something like:
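For example (a hypothetical sketch -- the group keyword here is an assumption about what such an interface could look like, not something xray supports today):

import xray

# Hypothetical: open only the variables stored under one HDF5/netCDF4 group.
ds = xray.open_dataset("mydata.h5", group="/observations")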
The netCDF4-python documentation gives some good examples of how groups work within netCDF4. Also, as @shoyer mentioned, it might make sense to modify xray so that NetCDF4 support is obtained by wrapping h5py instead of netCDF4, which might make your life even easier.
Thanks @akleeman for the info, much appreciated. A couple of other points I thought maybe worth mentioning:

First, I've been using lzf as the compression filter in my HDF5 files.

Second, I have a situation where I have multiple datasets, each of which is
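For context, a minimal sketch of how such an lzf-compressed dataset is typically written with h5py (names here are illustrative):

import h5py
import numpy as np

data = np.random.rand(1000, 1000)

with h5py.File("compressed.h5", "w") as f:
    # lzf is a fast filter that ships with h5py; other HDF5 readers need
    # the lzf filter plugin available in order to read such files.
    f.create_dataset("data", data=data, compression="lzf", chunks=True)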
One other detail: I have an HDF5 group for each conceptual dataset, but then variables may be organised into subgroups. It would be nice if this could be accommodated, e.g., when opening an HDF5 group as an xray dataset, assume the dataset contains all variables in the group and any subgroups, searched recursively. Again, apologies, I don't know if this is allowed in NetCDF4; I will do the research.
In principle, I think dimension scales are all we need to interpret HDF5 files as xray Datasets. That's also most of what you need to make a netCDF4 file, but I would not be surprised if NetCDF libraries have issues with HDF5 files that don't conform to every last NetCDF convention. For reference, here is the full NetCDF4 spec (pretty short!):

We don't yet support reading from groups or subgroups (other than the root group).

To support HDF5 properly, including interesting use cases like yours, I think we should probably write our own interface to h5py, instead of reading everything through the NetCDF libraries. Ideally, we could set this up to write HDF5 as (mostly) valid NetCDF4, at least in the simpler cases where that makes sense.
I wrote a little library to read and write netCDF4 files via h5py the other day: https://github.com/shoyer/h5netcdf

I also merged a preliminary backend for it into xray that should work if you use engine='h5netcdf'.

I've also been looking into the netCDF4 data model in a bit more detail, and the good news is that it looks like it does, at least theoretically, support hierarchical dimension scales. This doesn't work in h5netcdf yet, but would be easy to add. Read support in xray would also be straightforward. Figuring out how to write a hierarchy of xray datasets into the format is less obvious, however; we might need something like a HierarchicalDataset object.
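A quick sketch of what using the new backend looks like (the file name is a placeholder, and this assumes h5netcdf is installed alongside xray):

import xray

# Read a netCDF4 file through h5py/h5netcdf rather than netCDF4-python.
ds = xray.open_dataset("example.nc", engine="h5netcdf")
print(ds)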
Thanks Stephan, I'll take a look.
Note that h5netcdf won't (yet) let you read any HDF5 files you couldn't already read with netCDF4-python -- it just gives us an alternative backend to use. One thing we could do that's not supported by netCDF is potentially read HDF5 dimension labels. The original netCDF4 library only understands dimension scales -- which, to be honest, seems like a less natural fit to me than reading dimension labels.
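To make the distinction concrete, here is a small sketch in h5py terms (illustrative names; make_scale assumes a recent h5py):

import h5py
import numpy as np

with h5py.File("labels-vs-scales.h5", "w") as f:
    data = f.create_dataset("temperature", data=np.random.rand(4, 3))

    # Dimension labels: just name the axes; no coordinate data is involved.
    data.dims[0].label = "time"
    data.dims[1].label = "station"

    # Dimension scales: attach an actual coordinate dataset to an axis,
    # which is what the netCDF4 format relies on.
    time = f.create_dataset("time", data=np.arange(4.0))
    time.make_scale("time")
    data.dims[0].attach_scale(time)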
Just to say thanks for the work on this, I've been looking at the h5netcdf code recently to understand better how dimensions are plumbed in netCDF4. I'm exploring refactoring all my data model classes in scikit-allel to build on xarray; I think the time is right, especially if xarray gets a Zarr backend too.
On Sun, 22 Oct 2017, Stephan Hoyer wrote: Closed #66.
It's pretty messy, to be honest :). The HDF5 dimension scale API is highly flexible, and netCDF4 only uses a small part of it.
Interesting -- I'd love to hear how this goes! Please don't hesitate to file issues when problems come up (though you're already off to a good start).
Have there been any developments for HDF5 support?
Xarray will never be able to read arbitrary HDF5 files. The full HDF5 data model is far more complicated than any data structure xarray supports. Using h5py directly is your best bet for HDF5 files that aren't also netCDF files.
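For instance, a rough sketch of pulling an array out of an arbitrary HDF5 file with h5py and wrapping it yourself (the path, group, and dimension names are placeholders):

import h5py
import xarray as xr

with h5py.File("arbitrary.h5", "r") as f:
    # Plain HDF5 datasets carry no dimension names, so you supply them
    # when wrapping the array in xarray.
    raw = f["/some/group/measurements"][:]

da = xr.DataArray(raw, dims=("time", "channel"))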
The obvious libraries to wrap are pytables or h5py:
http://www.pytables.org
http://h5py.org/
Both provide at least some support for in-memory operations (though I'm not sure if they can pass around HDF5 file objects without dumping them to disk).
From a cursory look at the documentation for both projects, h5py appears to offer a simpler API that would be easier to map to our existing data model.
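As a rough sketch, h5py can at least keep a file entirely in memory via the HDF5 core driver (whether the resulting image can then be handed to another process is a separate question):

import h5py
import numpy as np

# Create an HDF5 file that lives only in memory; nothing is written to disk.
f = h5py.File("never-written.h5", "w", driver="core", backing_store=False)
f.create_dataset("data", data=np.arange(10.0))
print(f["data"][:])
f.close()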