Data source improvements #138

Merged · 28 commits · Sep 21, 2020

Commits
c915e84
refactored DataSource, updated code and tests
AleksMat Sep 2, 2020
e0572d5
removed a few OGC integration tests and added 1 more
AleksMat Sep 2, 2020
dfa6899
fixed a new pylint issue about exceptions
AleksMat Sep 2, 2020
5b9ddeb
In case of Python 3.6 install dataclasses package
AleksMat Sep 2, 2020
3984335
minor update
AleksMat Sep 3, 2020
9a4ba93
added minor fixes
AleksMat Sep 3, 2020
9fa31a1
improvements regarding data sources and processing API
AleksMat Sep 3, 2020
e2b542b
updated OGC examples and a DataSource.get_available_sources
AleksMat Sep 3, 2020
fead982
improvements of data source handling
AleksMat Sep 4, 2020
5e057cd
handled cases if data source name or definition has already been used
AleksMat Sep 4, 2020
653bb43
updated processing api notebook
AleksMat Sep 4, 2020
968a24e
Added notebook about data sources, updated documentation, minor impro…
AleksMat Sep 7, 2020
76ad92a
minor pylint fix
AleksMat Sep 7, 2020
c7abfd8
WFS url change according to data source
AleksMat Sep 7, 2020
2a3ba5e
a few typo fixes
AleksMat Sep 8, 2020
64a1a33
minor improvements in data sources
AleksMat Sep 8, 2020
a56fcac
renamed data source to data collection - updated everything
AleksMat Sep 10, 2020
531e2a0
added swath mode
AleksMat Sep 10, 2020
9f357e4
added timeliness data source parameter
AleksMat Sep 10, 2020
dd67d2d
minor fix
AleksMat Sep 10, 2020
a77b317
updated notebooks regarding new data collection parameters
AleksMat Sep 10, 2020
448efbc
better handling of data_collection / data_source parameters
AleksMat Sep 17, 2020
4af4ec7
commented out a problematic geopedia test
AleksMat Sep 17, 2020
188e3ee
minor import fix in ogc examples
AleksMat Sep 17, 2020
8eb1048
moved handle_deprecated_data_source to data_collections
AleksMat Sep 17, 2020
85b08d5
InputDataDict instead of base_url parameter in SentinelHubRequest
AleksMat Sep 17, 2020
8fc0859
marked geopedia wms stats test with xfail
AleksMat Sep 18, 2020
bb72154
Merge pull request #143 from sentinel-hub/feat/sh-request-base-url-ha…
AleksMat Sep 18, 2020
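The central change across these commits is the rename of `DataSource` to `DataCollection` and the switch of request keyword arguments from `data_source` to `data_collection`, with the deprecated spelling routed through `handle_deprecated_data_source`. A minimal before/after sketch of the new usage follows; the bounding box and time interval are hypothetical, and the `WebFeatureService` call mirrors the one updated in `examples/aws_request.ipynb` below:

```python
from sentinelhub import BBox, CRS, DataCollection, SHConfig, WebFeatureService

config = SHConfig()  # assumes an instance ID is set in the local configuration

# Hypothetical search area and time interval, used only for illustration
search_bbox = BBox(bbox=[46.16, -16.15, 46.51, -15.58], crs=CRS.WGS84)
search_time_interval = ('2017-12-01', '2017-12-31')

# Before this PR the collection was passed as data_source=DataSource.SENTINEL2_L1C;
# after this PR it is passed as a DataCollection member:
wfs_iterator = WebFeatureService(
    search_bbox,
    search_time_interval,
    data_collection=DataCollection.SENTINEL2_L1C,
    maxcc=1.0,
    config=config
)

for tile_info in wfs_iterator:
    print(tile_info)
```

The old `data_source` keyword appears to remain accepted for backwards compatibility (see the `handle_deprecated_data_source` commit), but new code should use `data_collection`.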
6 changes: 6 additions & 0 deletions docs/source/data_collections.rst
@@ -0,0 +1,6 @@
data_collections
================

.. automodule:: sentinelhub.data_collections
:members:
:show-inheritance:
1 change: 1 addition & 0 deletions docs/source/docs.rst
@@ -13,6 +13,7 @@ Modules
aws_safe
config
constants
data_collections
data_request
download.aws_client
download.client
1 change: 1 addition & 0 deletions docs/source/examples.rst
@@ -6,6 +6,7 @@ Examples
:maxdepth: 4

examples/processing_api_request.ipynb
examples/data_collections.ipynb
examples/ogc_request.ipynb
examples/large_area_utilities.ipynb
examples/fis_request.ipynb
37 changes: 26 additions & 11 deletions examples/aws_request.ipynb
@@ -86,7 +86,7 @@
}
],
"source": [
"from sentinelhub import WebFeatureService, BBox, CRS, DataSource, SHConfig\n",
"from sentinelhub import WebFeatureService, BBox, CRS, DataCollection, SHConfig\n",
"\n",
"INSTANCE_ID = '' # In case you put instance ID into configuration file you can leave this unchanged\n",
"\n",
@@ -104,7 +104,7 @@
"wfs_iterator = WebFeatureService(\n",
" search_bbox,\n",
" search_time_interval,\n",
" data_source=DataSource.SENTINEL2_L1C,\n",
" data_collection=DataCollection.SENTINEL2_L1C,\n",
" maxcc=1.0,\n",
" config=config\n",
")\n",
@@ -234,7 +234,7 @@
" bands=bands,\n",
" metafiles=metafiles,\n",
" data_folder=data_folder,\n",
" data_source=DataSource.SENTINEL2_L1C\n",
" data_collection=DataCollection.SENTINEL2_L1C\n",
")\n",
"\n",
"request.save_data() # This is where the download is triggered"
@@ -354,9 +354,16 @@
"metadata": {},
"outputs": [],
"source": [
"tile_request = AwsTileRequest(tile=tile_name, time=time, aws_index=aws_index, \n",
" bands=bands, metafiles=metafiles, data_folder=data_folder,\n",
" safe_format=True)\n",
"tile_request = AwsTileRequest(\n",
" tile=tile_name,\n",
" time=time,\n",
" aws_index=aws_index,\n",
" data_collection=DataCollection.SENTINEL2_L1C,\n",
" bands=bands,\n",
" metafiles=metafiles,\n",
" data_folder=data_folder,\n",
" safe_format=True\n",
")\n",
"\n",
"# Uncomment the the following line to download the data:\n",
"# tile_request.save_data()"
@@ -370,8 +377,12 @@
"source": [
"product_id = 'S2A_OPER_PRD_MSIL1C_PDMC_20160121T043931_R069_V20160103T171947_20160103T171947'\n",
"\n",
"product_request = AwsProductRequest(product_id=product_id, bands=['B01'],\n",
" data_folder=data_folder, safe_format=True)\n",
"product_request = AwsProductRequest(\n",
" product_id=product_id,\n",
" bands=['B01'],\n",
" data_folder=data_folder,\n",
" safe_format=True\n",
")\n",
"\n",
"# Uncomment the the following line to download the data:\n",
"# product_request.save_data()"
@@ -390,8 +401,12 @@
"metadata": {},
"outputs": [],
"source": [
"product_request = AwsProductRequest(product_id=product_id, tile_list=['T14PNA', 'T13PHT'],\n",
" data_folder=data_folder, safe_format=True)\n",
"product_request = AwsProductRequest(\n",
" product_id=product_id,\n",
" tile_list=['T14PNA', 'T13PHT'],\n",
" data_folder=data_folder,\n",
" safe_format=True\n",
")\n",
"\n",
"# Uncomment the the following line to download the data:\n",
"# product_request.save_data()"
@@ -414,7 +429,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.0"
"version": "3.8.2"
}
},
"nbformat": 4,
690 changes: 690 additions & 0 deletions examples/data_collections.ipynb

Large diffs are not rendered by default.
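The new `examples/data_collections.ipynb` notebook is too large to render here. A rough sketch of the kind of usage it introduces, assuming `DataCollection` keeps standard `Enum` semantics (consistent with the dataclasses-related commits above):

```python
from sentinelhub import DataCollection

# Enumerate the predefined collections (assuming DataCollection behaves as an Enum)
for collection in DataCollection:
    print(collection.name)

# Members are passed wherever a data_collection argument is expected, e.g. the
# DataCollection.SENTINEL2_L1C and DataCollection.LANDSAT8 usages in the diffs below
print(DataCollection.SENTINEL2_L1C)
```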

8 changes: 5 additions & 3 deletions examples/fis_request.ipynb
@@ -30,7 +30,7 @@
"from shapely.geometry import Polygon\n",
"\n",
"from sentinelhub import FisRequest, BBox, Geometry, CRS, WcsRequest, CustomUrlParam, \\\n",
" DataSource, HistogramType\n",
" DataCollection, HistogramType\n",
"from sentinelhub.time_utils import iso_to_datetime"
]
},
@@ -104,6 +104,7 @@
"outputs": [],
"source": [
"fis_request = FisRequest(\n",
" data_collection=DataCollection.SENTINEL2_L1C,\n",
" layer='BANDS-S2-L1C',\n",
" geometry_list=[sahara_bbox],\n",
" time=time_interval,\n",
@@ -412,6 +413,7 @@
"outputs": [],
"source": [
"wcs_request = WcsRequest(\n",
" data_collection=DataCollection.SENTINEL2_L1C,\n",
" layer='TRUE-COLOR-S2-L1C',\n",
" bbox= sahara_bbox,\n",
" time=time_interval,\n",
@@ -531,13 +533,13 @@
"ndvi_script = 'return [(B05 - B04) / (B05 + B04)]'\n",
"\n",
"histogram_request = FisRequest(\n",
" data_collection=DataCollection.LANDSAT8,\n",
" layer='TRUE-COLOR-L8',\n",
" geometry_list=[bbox1, bbox2, geometry1, geometry2],\n",
" time=('2018-06-10', '2018-06-15'),\n",
" resolution='100m',\n",
" bins=20,\n",
" histogram_type=HistogramType.EQUIDISTANT,\n",
" data_source=DataSource.LANDSAT8,\n",
" custom_url_params={CustomUrlParam.EVALSCRIPT: ndvi_script},\n",
" config=config\n",
")\n",
@@ -657,7 +659,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.0"
"version": "3.8.2"
}
},
"nbformat": 4,
8 changes: 4 additions & 4 deletions examples/large_area_utilities.ipynb
@@ -39,7 +39,7 @@
"from shapely.geometry import shape, Polygon, MultiPolygon, MultiLineString\n",
"\n",
"from sentinelhub import BBoxSplitter, OsmSplitter, TileSplitter, CustomGridSplitter, UtmZoneSplitter, UtmGridSplitter\n",
"from sentinelhub import BBox, read_data, CRS, DataSource"
"from sentinelhub import BBox, read_data, CRS, DataCollection"
]
},
{
@@ -417,7 +417,7 @@
"source": [
"### Splitting in satellite's tile grid\n",
"\n",
"If we would like to work on a level of satellite tiles and split them we can use the `TileSplitter`. It works in combination with Sentinel Hub WFS service therefore an instance ID is required. We also need to specify `time_interval` and `data_source`."
"If we would like to work on a level of satellite tiles and split them we can use the `TileSplitter`. It works in combination with Sentinel Hub WFS service therefore an instance ID is required. We also need to specify `time_interval` and `data_collection`."
]
},
{
@@ -457,7 +457,7 @@
" [hawaii_area],\n",
" CRS.WGS84,\n",
" ('2017-10-01', '2018-03-01'),\n",
" data_source=DataSource.SENTINEL2_L1C,\n",
" data_collection=DataCollection.SENTINEL2_L1C,\n",
" config=config\n",
")\n",
"\n",
@@ -553,7 +553,7 @@
" CRS.WGS84,\n",
" ('2017-10-01', '2018-03-01'),\n",
" tile_split_shape=(7, 3),\n",
" data_source=DataSource.SENTINEL2_L1C,\n",
" data_collection=DataCollection.SENTINEL2_L1C,\n",
" config=config\n",
")\n",
"\n",
214 changes: 116 additions & 98 deletions examples/ogc_request.ipynb

Large diffs are not rendered by default.
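The `examples/ogc_request.ipynb` diff is likewise too large to render. Judging from the `WcsRequest` change in the FIS notebook above, the updated OGC examples presumably follow the same pattern; a hedged sketch with a hypothetical bounding box and a layer name that must exist in the configured Sentinel Hub instance:

```python
from sentinelhub import WmsRequest, BBox, CRS, DataCollection, SHConfig

config = SHConfig()  # assumes an instance ID is set in the local configuration

# Hypothetical area of interest
betsiboka_bbox = BBox(bbox=[46.16, -16.15, 46.51, -15.58], crs=CRS.WGS84)

# OGC requests now take the collection explicitly via data_collection
wms_request = WmsRequest(
    data_collection=DataCollection.SENTINEL2_L1C,
    layer='TRUE-COLOR-S2-L1C',
    bbox=betsiboka_bbox,
    time='2017-12-15',
    width=512,
    height=512,
    config=config
)

# wms_images = wms_request.get_data()  # uncomment to trigger the actual download
```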