load and close with large data files doesn't work #511
I believe I've encountered this issue in previous versions when loading a variable that has a large number of time steps.
From: potter2 <email@example.com>
If I select a large file (672,17,192,288), the load and close freezes UVCDAT. Nothing is loaded and I have to force quit UVCDAT.
Haven't tried this with 2.0.
Gerald (Jerry) Potter
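For scale, a quick back-of-the-envelope calculation of the memory needed to hold a (672, 17, 192, 288) variable entirely in RAM (the axis order time, level, lat, lon is an assumption; the issue does not name the dimensions):

```python
# Shape of the variable reported in the issue (assumed: time, level, lat, lon)
shape = (672, 17, 192, 288)

# Total number of elements in the array
n_elems = 1
for dim in shape:
    n_elems *= dim

# Memory required to load the whole variable, by element size
bytes_f32 = n_elems * 4  # float32: 4 bytes per value
bytes_f64 = n_elems * 8  # float64: 8 bytes per value

print(n_elems)                       # 631701504 elements
print(round(bytes_f32 / 2**30, 2))   # ~2.35 GiB as float32
print(round(bytes_f64 / 2**30, 2))   # ~4.71 GiB as float64
```

Several gigabytes for one variable would plausibly exhaust memory on a typical desktop, which is consistent with the freeze reported above and with the load succeeding on a 64G machine.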
From: Charles Doutriaux <email@example.com>
I believe I encountered the same issue, except that it didn't freeze my UVCDAT, probably because I have a rather large 64G of physical RAM.
Also, after loading, the data seem to be broken: the value is 57499.927934 everywhere, in every time slice. If I read a smaller subset, e.g.
The UVCDAT version is 2.8.0, installed via conda. And below is a list of the packages in conda, after installing UVCDAT, numpy, scipy, matplotlib and basemap: