PyNIO backend doesn't play well with open_mfdataset #936

Closed
shoyer opened this issue Aug 2, 2016 · 3 comments

@shoyer
Member

shoyer commented Aug 2, 2016

As reported on StackOverflow: http://stackoverflow.com/questions/38711915/segmentation-fault-writing-xarray-datset-to-netcdf-or-dataframe/

It appears that we can only open a single file at a time with pynio?

Adding a thread lock via lock=True didn't solve the issue.
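
For reference, here is a rough sketch of the failing pattern (file names are hypothetical; any multi-file collection readable by PyNIO would do):

```python
# Sketch of the reported failure: combine several GRIB files lazily with the
# PyNIO engine, then trigger the reads by writing to netCDF.
import xarray as xr

files = ["forecast_000.grib", "forecast_006.grib"]  # hypothetical inputs

# lock=True requests a shared thread lock for reads, but it did not help here.
ds = xr.open_mfdataset(files, engine="pynio", lock=True)

# The segfault appears once the lazy PyNIO reads are actually performed,
# e.g. when writing out the combined dataset.
ds.to_netcdf("combined.nc")
```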

cc @david-ian-brown

@david-ian-brown

Hi Stephan,

I will look into this issue.
-dave

@david-ian-brown

Hi Stephan,
I notice that you posted a solution of sorts on stackoverflow. Is this a real fix or a bandaid for the problem? I ran a few tests with pynio by itself where there was no problem opening, e.g., a GRIB file for reading and a NetCDF file for writing at the same time. I am wondering if the issue involves using the pynio and netCDF4 modules at the same time. Let me know if you still think something in pynio is not working correctly.
Thanks.
-dave
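
For reference, a rough sketch of that kind of standalone PyNIO test (file names are hypothetical, and the exact open modes are an assumption; check the PyNIO docs):

```python
# Open a GRIB file for reading and a NetCDF file for writing at the same
# time, using PyNIO only (no xarray, no netCDF4 module).
import Nio

grib = Nio.open_file("forecast_000.grib", "r")  # read handle
out = Nio.open_file("copy_test.nc", "c")        # write handle, held open concurrently

print(list(grib.variables.keys()))              # read metadata while both files are open

grib.close()
out.close()
```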

@shoyer
Member Author

shoyer commented Aug 10, 2016

The fix I posted on stackoverflow is a bandaid solution -- it requires loading every file into memory all at once, which can be problematic if you have large amounts of data.

It occurs to me that the problem might be that we were attempting to concurrently load data from multiple variables in a single file. If that is the issue, then it's something we can work around pretty easily with xarray. I'll run some tests later to verify.
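
For reference, a minimal sketch of that kind of eager-loading bandaid (assuming PyNIO-readable inputs sharing a common `time` dimension; file names are hypothetical):

```python
# Workaround: load each file fully into memory before combining, so no lazy
# PyNIO reads remain outstanding. Works, but holds all the data in RAM.
import xarray as xr

files = ["forecast_000.grib", "forecast_006.grib"]  # hypothetical inputs

datasets = [xr.open_dataset(f, engine="pynio").load() for f in files]
combined = xr.concat(datasets, dim="time")
combined.to_netcdf("combined.nc")
```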
