New environment for oceanography image on SciServer #222
Conversation
all set for v0.2
Installing oceanspy with git and keeping `xesmf<0.5.0` seems to have done the trick!
Yes! I'm almost finished testing it. The tests are passing and all the notebooks are running, except the Kogur one, which is taking a bit of time. Once it finishes, I think we're all set to send the environment to Mitya.
Codecov Report
@@ Coverage Diff @@
## master #222 +/- ##
==========================================
+ Coverage 95.30% 96.69% +1.39%
==========================================
Files 10 10
Lines 3809 3569 -240
Branches 849 766 -83
==========================================
- Hits 3630 3451 -179
+ Misses 99 67 -32
+ Partials 80 51 -29
Flags with carried forward coverage won't be shown.
Continue to review full report at Codecov.
Okay folks, @Mikejmnez @malmans2, the tests are passing now that we've pinned …
You can test whether it's going to work just by creating the od with LLC4320: `od = ospy.open_dataset.from_catalog("LLC4320")`. If it works, then that's it! The only open question is whether intake is able to read multiple zarrs in parallel.
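To make that smoke test copy-pasteable, here is a minimal sketch; the module name `open_oceandataset` is taken from the traceback further down in this thread, and nothing beyond the catalog entry name is assumed:

```python
import oceanspy as ospy

# Quick smoke test for the catalog entry: if this returns an OceanDataset,
# then intake, xarray, and the underlying zarr stores are all reachable
# from the current compute image.
od = ospy.open_oceandataset.from_catalog("LLC4320")
print(od)
```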
I ran my ECCO notebook. It works, but shucks! It gave me an error with the LLC4320 dataset.
---------------------------------------------------------------------------
OSError Traceback (most recent call last)
File <timed exec>:1, in <module>
File ~/workspace/Storage/asiddi24/persistent/oceanspy/oceanspy/open_oceandataset.py:141, in from_catalog(name, catalog_url)
138 mtdt = cat[entry].metadata
140 # Create ds
--> 141 ds = cat[entry].to_dask()
142 else:
143 # Pop args and metadata
144 args = cat[entry].pop("args")
File ~/miniconda3/envs/Oceanography_test/lib/python3.8/site-packages/intake_xarray/base.py:69, in DataSourceMixin.to_dask(self)
67 def to_dask(self):
68 """Return xarray object where variables are dask arrays"""
---> 69 return self.read_chunked()
File ~/miniconda3/envs/Oceanography_test/lib/python3.8/site-packages/intake_xarray/base.py:44, in DataSourceMixin.read_chunked(self)
42 def read_chunked(self):
43 """Return xarray object (which will have chunks)"""
---> 44 self._load_metadata()
45 return self._ds
File ~/miniconda3/envs/Oceanography_test/lib/python3.8/site-packages/intake/source/base.py:236, in DataSourceBase._load_metadata(self)
234 """load metadata only if needed"""
235 if self._schema is None:
--> 236 self._schema = self._get_schema()
237 self.dtype = self._schema.dtype
238 self.shape = self._schema.shape
File ~/miniconda3/envs/Oceanography_test/lib/python3.8/site-packages/intake_xarray/base.py:18, in DataSourceMixin._get_schema(self)
15 self.urlpath = self._get_cache(self.urlpath)[0]
17 if self._ds is None:
---> 18 self._open_dataset()
20 metadata = {
21 'dims': dict(self._ds.dims),
22 'data_vars': {k: list(self._ds[k].coords)
23 for k in self._ds.data_vars.keys()},
24 'coords': tuple(self._ds.coords.keys()),
25 }
26 if getattr(self, 'on_server', False):
File ~/miniconda3/envs/Oceanography_test/lib/python3.8/site-packages/intake_xarray/xzarr.py:44, in ZarrSource._open_dataset(self)
42 kw.setdefault("backend_kwargs", {})["storage_options"] = self.storage_options
43 if isinstance(self.urlpath, list) or "*" in self.urlpath:
---> 44 self._ds = xr.open_mfdataset(self.urlpath, **kw)
45 else:
46 self._ds = xr.open_dataset(self.urlpath, **kw)
File ~/miniconda3/envs/Oceanography_test/lib/python3.8/site-packages/xarray/backends/api.py:873, in open_mfdataset(paths, chunks, concat_dim, compat, preprocess, engine, data_vars, coords, combine, parallel, join, attrs_file, combine_attrs, **kwargs)
870 paths = [os.fspath(p) if isinstance(p, os.PathLike) else p for p in paths]
872 if not paths:
--> 873 raise OSError("no files to open")
875 if combine == "nested":
876 if isinstance(concat_dim, (str, DataArray)) or concat_dim is None:
OSError: no files to open

The version of …
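For context, `open_mfdataset` raises this "no files to open" error when the path list it receives is empty, i.e. when the glob in the catalog entry's urlpath matches nothing on disk. A minimal sketch of that failure mode, using a placeholder path rather than the real SciServer mount:

```python
import glob

# Placeholder path standing in for the LLC4320 catalog urlpath;
# the real SciServer mount point is not shown in this thread.
urlpath = "/path/to/LLC4320/*.zarr"

paths = sorted(glob.glob(urlpath))
if not paths:
    # This is the case xarray.open_mfdataset reports as
    # "OSError: no files to open": the glob expanded to an empty list,
    # e.g. because the data volume is not mounted in the current compute image.
    raise OSError("no files to open")
```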
@asiddi24 The …
Aha! Of course. I was running it all in Ocean Circulation. Works like a charm now! Yay! lol, I'm having so much fun with this. Time to sleep now. Okay, we're all set.
I have a few suggestions you can commit directly on GH if they make sense to you.
In general, I would install as many packages as possible, since we are not updating the image very often.
Additionally, I would:
- Install Python 3.9 rather than 3.8. Let's use the best one we support!
- Install the `xmitgcm` stable release using conda (see the environment sketch below). I tested it in Pin cartopy #218 and all tests passed, but I'm not up to speed with LLC stuff. @Mikejmnez, do you see any downside in using v0.5.2?
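If it helps, here is a minimal sketch of what the relevant part of the environment file might look like with these suggestions folded in; the channel, the git URL for oceanspy, and the exact pins are assumptions pieced together from this thread, not the actual file in the PR:

```yaml
# Hypothetical fragment of the SciServer oceanography environment file.
# Channel, versions, and the git URL are assumptions based on this conversation.
name: Oceanography
channels:
  - conda-forge
dependencies:
  - python=3.9
  - xmitgcm=0.5.2        # stable release from conda, as suggested above
  - xesmf<0.5.0          # the pin that made the oceanspy tests pass
  - cartopy              # pinned in this PR; actual pin from #218 not shown here
  - pip
  - pip:
      # installing oceanspy from git, per the comment at the top of the thread
      - git+https://github.com/hainegroup/oceanspy.git
```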
BTW, can we close #218? Looks like you pinned cartopy here.
This all sounds good to me. If we can, we should use the stable release of `xmitgcm`.
I'll make these changes and test the environment again today. I'll let you folks know how it goes.
Co-authored-by: Mattia Almansi <mattia.almansi@noc.ac.uk>
@asiddi24 Almost good to go. I think you forgot to use the stable release of `xmitgcm`.
Updating the environment file for the new oceanography image.
Will keep committing changes to this while we work on updating the package list.
Closes #220
Closes #221