
openEO Python Client v0.16.0

Released by soxofaan on 17 Apr 12:48

Added

  • Full support for user-uploaded files (/files endpoints) (#377); see the sketch after this list.
  • Initial, experimental "local processing" feature: use openEO Python Client Library functionality on local GeoTIFF/NetCDF files and run the processing locally through the openeo_processes_dask package (#338)
  • Add BatchJob.get_results_metadata_url().
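
A minimal sketch of the new user file and results-metadata helpers. The back-end URL, file name and job id are placeholders, and the upload helper name is an assumption based on the /files endpoint support above, not a verbatim quote from this release:

```python
import openeo

# Connect and authenticate (placeholder back-end URL).
connection = openeo.connect("https://openeo.example").authenticate_oidc()

# Upload a local file to the user workspace (assumed helper for the new /files support).
connection.upload_file("training_polygons.geojson")

# list_files() now returns UserFile objects; .metadata exposes the raw dictionary (see "Changed").
for user_file in connection.list_files():
    print(user_file.metadata)

# New helper on batch jobs: URL of the job results metadata document.
job = connection.job("j-2304170000")  # placeholder job id
print(job.get_results_metadata_url())
```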

Changed

  • Connection.list_files() returns a list of UserFile objects instead of a list of metadata dictionaries. Use UserFile.metadata to get the original dictionary. (#377)
  • DataCube.aggregate_spatial() now returns a VectorCube instead of a DataCube (#386). The (experimental) fit_class_random_forest() and fit_regr_random_forest() methods moved accordingly to the VectorCube class. See the sketch after this list.
  • Improved documentation on openeo.processes and ProcessBuilder (#390).
  • DataCube.create_job() and Connection.create_job() now require keyword arguments for all but the first argument, for clarity (#412).
  • Pass minimum log level to backend when retrieving batch job and secondary service logs. (Open-EO/openeo-api#485, Open-EO/openeo-python-driver#170)
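
A minimal sketch of the aggregate_spatial()/VectorCube change and the keyword-only create_job() arguments. The collection id, extents and geometry are placeholders, and the create_job() parameter names follow the DataCube.create_job() signature; treat this as illustrative rather than authoritative:

```python
import openeo

connection = openeo.connect("https://openeo.example").authenticate_oidc()

cube = connection.load_collection(
    "SENTINEL2_L2A",  # placeholder collection id
    spatial_extent={"west": 5.0, "south": 51.0, "east": 5.1, "north": 51.1},
    temporal_extent=["2022-06-01", "2022-08-31"],
    bands=["B04", "B08"],
)

# aggregate_spatial() now returns a VectorCube instead of a DataCube.
polygon = {
    "type": "Polygon",
    "coordinates": [[[5.02, 51.02], [5.05, 51.02], [5.05, 51.05], [5.02, 51.05], [5.02, 51.02]]],
}
vector_cube = cube.aggregate_spatial(geometries=polygon, reducer="mean")

# All arguments after the first one must now be passed by keyword.
job = vector_cube.create_job(
    out_format="GeoJSON",
    title="Zonal means",
    description="Mean reflectance per polygon",
)
```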

Removed

  • Dropped support for pre-1.0.0 versions of the openEO API (#134):
    • Remove ImageCollectionClient and related helpers (now unused leftovers from version 0.4.0 and earlier). (Also #100)
    • Drop support for pre-1.0.0 job result metadata
    • Connection and all its methods now require the back-end to support at least version 1.0.0 of the openEO API.

Fixed

  • Reinstated old behavior of authentication-related user files (e.g. the refresh token store) on Windows: when a PrivateJsonFile may be readable by others, just log a message instead of raising PermissionError (#387)
  • VectorCube.create_job() and MlModel.create_job() are properly aligned with DataCube.create_job() regarding setting job title, description, etc. (#412).
  • More robust handling of billing currency/plans in capabilities (#414)
  • Avoid blindly adding a save_result node from DataCube.execute_batch() when there is already one (#401); see the sketch below.
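
A minimal sketch of the execute_batch() fix. The back-end URL and collection id are placeholders, and parameter names not covered by this changelog are assumptions:

```python
import openeo

connection = openeo.connect("https://openeo.example").authenticate_oidc()
cube = connection.load_collection("SENTINEL2_L2A")  # placeholder collection id

# The process graph already ends in save_result, so execute_batch()
# no longer blindly appends a second save_result node (#401).
result = cube.save_result(format="GTiff")
job = result.execute_batch(outputfile="composite.tiff")
```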