Remove PseudoNetCDF #8446

Merged · 3 commits · Nov 13, 2023
Changes from 2 commits
1 change: 0 additions & 1 deletion ci/requirements/all-but-dask.yml
@@ -28,7 +28,6 @@ dependencies:
- pandas
- pint>=0.22
- pip
- pseudonetcdf
- pydap
- pytest
- pytest-cov
1 change: 0 additions & 1 deletion ci/requirements/environment-py311.yml
@@ -32,7 +32,6 @@ dependencies:
- pip
- pooch
- pre-commit
- pseudonetcdf
- pydap
- pytest
- pytest-cov
1 change: 0 additions & 1 deletion ci/requirements/environment-windows-py311.yml
@@ -28,7 +28,6 @@ dependencies:
- pint>=0.22
- pip
- pre-commit
- pseudonetcdf
- pydap
- pytest
- pytest-cov
1 change: 0 additions & 1 deletion ci/requirements/environment-windows.yml
@@ -28,7 +28,6 @@ dependencies:
- pint>=0.22
- pip
- pre-commit
- pseudonetcdf
- pydap
- pytest
- pytest-cov
1 change: 0 additions & 1 deletion ci/requirements/environment.yml
@@ -33,7 +33,6 @@ dependencies:
- pip
- pooch
- pre-commit
- pseudonetcdf
- pydap
- pytest
- pytest-cov
1 change: 0 additions & 1 deletion ci/requirements/min-all-deps.yml
@@ -37,7 +37,6 @@ dependencies:
- pandas=1.4
- pint=0.22
- pip
- pseudonetcdf=3.2
- pydap=3.3
- pytest
- pytest-cov
14 changes: 0 additions & 14 deletions doc/api-hidden.rst
@@ -591,20 +591,6 @@
backends.H5netcdfBackendEntrypoint.guess_can_open
backends.H5netcdfBackendEntrypoint.open_dataset

backends.PseudoNetCDFDataStore.close
backends.PseudoNetCDFDataStore.get_attrs
backends.PseudoNetCDFDataStore.get_dimensions
backends.PseudoNetCDFDataStore.get_encoding
backends.PseudoNetCDFDataStore.get_variables
backends.PseudoNetCDFDataStore.open
backends.PseudoNetCDFDataStore.open_store_variable
backends.PseudoNetCDFDataStore.ds

backends.PseudoNetCDFBackendEntrypoint.description
backends.PseudoNetCDFBackendEntrypoint.url
backends.PseudoNetCDFBackendEntrypoint.guess_can_open
backends.PseudoNetCDFBackendEntrypoint.open_dataset

backends.PydapDataStore.close
backends.PydapDataStore.get_attrs
backends.PydapDataStore.get_dimensions
2 changes: 0 additions & 2 deletions doc/api.rst
@@ -1117,7 +1117,6 @@ arguments for the ``load_store`` and ``dump_to_store`` Dataset methods:

backends.NetCDF4DataStore
backends.H5NetCDFStore
backends.PseudoNetCDFDataStore
backends.PydapDataStore
backends.ScipyDataStore
backends.ZarrStore
@@ -1133,7 +1132,6 @@ used filetypes in the xarray universe.

backends.NetCDF4BackendEntrypoint
backends.H5netcdfBackendEntrypoint
backends.PseudoNetCDFBackendEntrypoint
backends.PydapBackendEntrypoint
backends.ScipyBackendEntrypoint
backends.StoreBackendEntrypoint
3 changes: 0 additions & 3 deletions doc/getting-started-guide/installing.rst
@@ -38,9 +38,6 @@ For netCDF and IO
- `cftime <https://unidata.github.io/cftime>`__: recommended if you
want to encode/decode datetimes for non-standard calendars or dates before
year 1678 or after year 2262.
- `PseudoNetCDF <http://github.com/barronh/pseudonetcdf/>`__: recommended
for accessing CAMx, GEOS-Chem (bpch), NOAA ARL files, ICARTT files
(ffi1001) and many other.
- `iris <https://github.com/scitools/iris>`__: for conversion to and from iris'
Cube objects

21 changes: 0 additions & 21 deletions doc/user-guide/io.rst
@@ -1308,27 +1308,6 @@ We recommend installing PyNIO via conda::
.. _PyNIO backend is deprecated: https://github.com/pydata/xarray/issues/4491
.. _PyNIO is no longer maintained: https://github.com/NCAR/pynio/issues/53

.. _io.PseudoNetCDF:

Formats supported by PseudoNetCDF
---------------------------------

Xarray can also read CAMx, BPCH, ARL PACKED BIT, and many other file
formats supported by PseudoNetCDF_, if PseudoNetCDF is installed.
PseudoNetCDF can also provide Climate Forecasting Conventions to
CMAQ files. In addition, PseudoNetCDF can automatically register custom
readers that subclass PseudoNetCDF.PseudoNetCDFFile. PseudoNetCDF can
identify readers either heuristically, or by a format specified via a key in
`backend_kwargs`.

To use PseudoNetCDF to read such files, supply
``engine='pseudonetcdf'`` to :py:func:`open_dataset`.

Add ``backend_kwargs={'format': '<format name>'}`` where `<format name>`
options are listed on the PseudoNetCDF page.

.. _PseudoNetCDF: https://github.com/barronh/PseudoNetCDF


CSV and other formats supported by pandas
-----------------------------------------
4 changes: 2 additions & 2 deletions doc/whats-new.rst
@@ -34,7 +34,7 @@ Breaking changes

Deprecations
~~~~~~~~~~~~

- The PseudoNetCDF backend has been removed. By `Deepak Cherian <https://github.com/dcherian/>`_.
- Supplying dimension-ordered sequences to :py:meth:`DataArray.chunk` &
:py:meth:`Dataset.chunk` is deprecated in favor of supplying a dictionary of
dimensions, or a single ``int`` or ``"auto"`` argument covering all
@@ -4530,7 +4530,7 @@ Enhancements

- New PseudoNetCDF backend for many Atmospheric data formats including
GEOS-Chem, CAMx, NOAA arlpacked bit and many others. See
:ref:`io.PseudoNetCDF` for more details.
``io.PseudoNetCDF`` for more details.
By `Barron Henderson <https://github.com/barronh>`_.

- The :py:class:`Dataset` constructor now aligns :py:class:`DataArray`
1 change: 0 additions & 1 deletion pyproject.toml
@@ -109,7 +109,6 @@ module = [
"opt_einsum.*",
"pandas.*",
"pooch.*",
"PseudoNetCDF.*",
"pydap.*",
"pytest.*",
"scipy.*",
6 changes: 0 additions & 6 deletions xarray/backends/__init__.py
@@ -13,10 +13,6 @@
from xarray.backends.memory import InMemoryDataStore
from xarray.backends.netCDF4_ import NetCDF4BackendEntrypoint, NetCDF4DataStore
from xarray.backends.plugins import list_engines, refresh_engines
from xarray.backends.pseudonetcdf_ import (
PseudoNetCDFBackendEntrypoint,
PseudoNetCDFDataStore,
)
from xarray.backends.pydap_ import PydapBackendEntrypoint, PydapDataStore
from xarray.backends.pynio_ import NioDataStore
from xarray.backends.scipy_ import ScipyBackendEntrypoint, ScipyDataStore
@@ -37,10 +33,8 @@
"ScipyDataStore",
"H5NetCDFStore",
"ZarrStore",
"PseudoNetCDFDataStore",
"H5netcdfBackendEntrypoint",
"NetCDF4BackendEntrypoint",
"PseudoNetCDFBackendEntrypoint",
"PydapBackendEntrypoint",
"ScipyBackendEntrypoint",
"StoreBackendEntrypoint",
19 changes: 8 additions & 11 deletions xarray/backends/api.py
@@ -59,7 +59,7 @@
T_NetcdfEngine = Literal["netcdf4", "scipy", "h5netcdf"]
T_Engine = Union[
T_NetcdfEngine,
Literal["pydap", "pynio", "pseudonetcdf", "zarr"],
Literal["pydap", "pynio", "zarr"],
type[BackendEntrypoint],
str, # no nice typing support for custom backends
None,
@@ -78,7 +78,6 @@
"pydap": backends.PydapDataStore.open,
"h5netcdf": backends.H5NetCDFStore.open,
"pynio": backends.NioDataStore,
"pseudonetcdf": backends.PseudoNetCDFDataStore.open,
"zarr": backends.ZarrStore.open_group,
}
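The deletion in the hunk above is mechanical because this legacy opener table is a plain dict from engine name to opener callable: removing the ``"pseudonetcdf"`` key is enough to make ``engine="pseudonetcdf"`` unresolvable. A minimal stdlib-only sketch of that dispatch pattern (the names and ``UnknownEngineError`` here are illustrative, not xarray's actual API):

```python
# Sketch of name-based engine dispatch: a registry dict maps engine
# names to opener callables, mirroring the table in the hunk above.
from typing import Callable, Dict


class UnknownEngineError(ValueError):
    """Raised when no opener is registered for the requested engine."""


def _open_netcdf4(path: str) -> str:
    # Stand-in for backends.NetCDF4DataStore.open
    return f"netcdf4 opened {path}"


def _open_zarr(path: str) -> str:
    # Stand-in for backends.ZarrStore.open_group
    return f"zarr opened {path}"


ENGINES: Dict[str, Callable[[str], str]] = {
    "netcdf4": _open_netcdf4,
    "zarr": _open_zarr,
}


def open_with_engine(path: str, engine: str) -> str:
    # Look up the opener by name; an unregistered engine fails fast
    # with a clear error instead of importing a removed backend.
    try:
        opener = ENGINES[engine]
    except KeyError:
        raise UnknownEngineError(
            f"unrecognized engine {engine!r}; available: {sorted(ENGINES)}"
        ) from None
    return opener(path)
```

With this shape, dropping a key from the registry (as this PR does for ``"pseudonetcdf"``) is the whole removal at the dispatch layer; the remaining diff hunks clean up the docstrings and type hints that enumerated the engine names.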

@@ -420,7 +419,7 @@ def open_dataset(
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
"pseudonetcdf", "zarr", None}, installed backend \
"zarr", None}, installed backend \
or subclass of xarray.backends.BackendEntrypoint, optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
@@ -452,8 +451,7 @@
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend. This keyword may not be supported by all the backends.
be replaced by NA. This keyword may not be supported by all the backends.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
@@ -523,7 +521,7 @@
relevant when using dask or another form of parallelism. By default,
appropriate locks are chosen to safely read and write files with the
currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
"scipy", "pynio", "pseudonetcdf".
"scipy", "pynio".

See engine open function for kwargs accepted by each specific engine.

@@ -628,7 +626,7 @@ def open_dataarray(
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
"pseudonetcdf", "zarr", None}, installed backend \
"zarr", None}, installed backend \
or subclass of xarray.backends.BackendEntrypoint, optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for
@@ -658,8 +656,7 @@
taken from variable attributes (if they exist). If the `_FillValue` or
`missing_value` attribute contains multiple values a warning will be
issued and all array values matching one of the multiple values will
be replaced by NA. mask_and_scale defaults to True except for the
pseudonetcdf backend. This keyword may not be supported by all the backends.
be replaced by NA. This keyword may not be supported by all the backends.
decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format
into datetime objects. Otherwise, leave them encoded as numbers.
@@ -729,7 +726,7 @@
relevant when using dask or another form of parallelism. By default,
appropriate locks are chosen to safely read and write files with the
currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
"scipy", "pynio", "pseudonetcdf".
"scipy", "pynio".

See engine open function for kwargs accepted by each specific engine.

@@ -869,7 +866,7 @@ def open_mfdataset(
You can find the file-name from which each dataset was loaded in
``ds.encoding["source"]``.
engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
"pseudonetcdf", "zarr", None}, installed backend \
"zarr", None}, installed backend \
or subclass of xarray.backends.BackendEntrypoint, optional
Engine to use when reading files. If not provided, the default engine
is chosen based on available dependencies, with a preference for