Release version 3.8.1, Merge pull request #374 from sentinel-hub/develop
zigaLuksic committed Jan 18, 2023
2 parents 2fbc1d0 + 4611296 commit 1bedd9a
Showing 67 changed files with 1,571 additions and 294 deletions.
3 changes: 2 additions & 1 deletion .flake8
@@ -1,5 +1,6 @@
[flake8]
-ignore = E203, W503, C408
+# B028 is ignored because !r flags cannot be used in python < 3.8
+ignore = E203, W503, C408, B028
exclude = .git, __pycache__
min_python_version = 3.7.0
max-line-length= 120
3 changes: 2 additions & 1 deletion .github/workflows/ci_action.yml
@@ -7,7 +7,8 @@ on:
- "master"
- "develop"
schedule:
- cron: "0 0 * * *"
# Schedule events are triggered by whoever last changed the cron schedule
- cron: "5 0 * * *"

env:
# The only way to simulate if-else statement
12 changes: 6 additions & 6 deletions .pre-commit-config.yaml
@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
-rev: v4.3.0
+rev: v4.4.0
hooks:
- id: end-of-file-fixer
- id: requirements-txt-fixer
@@ -13,19 +13,19 @@ repos:
- id: debug-statements

- repo: https://github.com/psf/black
-rev: 22.8.0
+rev: 22.12.0
hooks:
- id: black
language_version: python3

- repo: https://github.com/pycqa/isort
-rev: 5.10.1
+rev: 5.11.4
hooks:
- id: isort
name: isort (python)

- repo: https://github.com/PyCQA/autoflake
-rev: v1.5.3
+rev: v2.0.0
hooks:
- id: autoflake
args:
@@ -36,7 +36,7 @@
]

- repo: https://github.com/pycqa/flake8
-rev: 5.0.4
+rev: 6.0.0
hooks:
- id: flake8
additional_dependencies:
@@ -46,7 +46,7 @@
- flake8-typing-imports

- repo: https://github.com/nbQA-dev/nbQA
-rev: 1.4.0
+rev: 1.6.0
hooks:
- id: nbqa-black
- id: nbqa-isort
189 changes: 180 additions & 9 deletions examples/aws_request.ipynb
@@ -4,7 +4,180 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Accessing satellite data from AWS\n",
"# Accessing satellite data from AWS "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Warning"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The functionalities of the .aws module have mostly been deprecated and are no longer actively maintained. Some recent changes (such as switch in file extensions) render these utilities only partially usable.\n",
"\n",
"The following code can help you patch your scripts in such cases. It interacts with S3 more directly and is easier for you to adjust.\n",
"\n",
"For this patch Sentinel Hub Config has to be configured according to [Configuration paragraph](https://sentinelhub-py.readthedocs.io/en/latest/configure.html#sentinel-hub-capabilities)."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"from datetime import datetime as dt\n",
"\n",
"from sentinelhub import CRS, BBox, DataCollection, SentinelHubCatalog, SHConfig\n",
"from sentinelhub.aws import AwsDownloadClient\n",
"\n",
"boto_params = {\"RequestPayer\": \"requester\"}\n",
"config = SHConfig()\n",
"s3_client = AwsDownloadClient.get_s3_client(config)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"search_bbox = BBox(bbox=[46.16, -16.15, 46.51, -5.58], crs=CRS.WGS84)\n",
"search_time_interval = (dt(2022, 12, 11), dt(2022, 12, 17))\n",
"data_collection = DataCollection.SENTINEL2_L1C # use DataCollection.SENTINEL2_L1C or DataCollection.SENTINEL2_L2A"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"def get_s3_tile_paths(search_bbox, search_time_interval, data_collection, config):\n",
" results = SentinelHubCatalog(config).search(collection=data_collection, bbox=search_bbox, time=search_time_interval)\n",
"\n",
" return [result[\"assets\"][\"data\"][\"href\"] for result in results]"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"['s3://sentinel-s2-l1c/tiles/38/L/PH/2022/12/14/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PJ/2022/12/14/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PK/2022/12/14/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PL/2022/12/14/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PM/2022/12/14/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PN/2022/12/14/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PP/2022/12/14/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PM/2022/12/12/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PN/2022/12/12/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PP/2022/12/12/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PQ/2022/12/12/0/',\n",
" 's3://sentinel-s2-l1c/tiles/38/L/PR/2022/12/12/0/']"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"get_s3_tile_paths(search_bbox, search_time_interval, data_collection, config)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"def list_tile_objects(s3_tile_path):\n",
" \"\"\"Returns list of all files, which are located on `s3 path` on s3 bucket.\"\"\"\n",
" _, _, bucket_name, url_key = s3_tile_path.split(\"/\", 3)\n",
" return s3_client.list_objects_v2(Bucket=bucket_name, Prefix=url_key, **boto_params)[\"Contents\"]"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"tags": []
},
"outputs": [
{
"data": {
"text/plain": [
"[{'Key': 'tiles/38/L/PH/2022/12/14/0/B01.jp2',\n",
" 'LastModified': datetime.datetime(2022, 12, 14, 10, 44, 30, tzinfo=tzutc()),\n",
" 'ETag': '\"0a2ba9a9f7a8e1a7c4c5102d76596b87\"',\n",
" 'Size': 3609242,\n",
" 'StorageClass': 'INTELLIGENT_TIERING'},\n",
" {'Key': 'tiles/38/L/PH/2022/12/14/0/B02.jp2',\n",
" 'LastModified': datetime.datetime(2022, 12, 14, 10, 44, 30, tzinfo=tzutc()),\n",
" 'ETag': '\"20c0b9d7f3bf09d088facfef23d9d23a\"',\n",
" 'Size': 100628227,\n",
" 'StorageClass': 'INTELLIGENT_TIERING'},\n",
" {'Key': 'tiles/38/L/PH/2022/12/14/0/B03.jp2',\n",
" 'LastModified': datetime.datetime(2022, 12, 14, 10, 44, 30, tzinfo=tzutc()),\n",
" 'ETag': '\"381a0db09b39ec7e2a57c96902eb3ab8\"',\n",
" 'Size': 103173386,\n",
" 'StorageClass': 'INTELLIGENT_TIERING'}]"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"list_of_objects = tile_objects_list = list_tile_objects(\"s3://sentinel-s2-l1c/tiles/38/L/PH/2022/12/14/0/\")\n",
"list_of_objects[:3]"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"def download(s3_tile_path, download_dir, objects_to_download=None):\n",
" os.makedirs(download_dir, exist_ok=True)\n",
" all_files = list_tile_objects(s3_tile_path)\n",
"\n",
" for file in all_files:\n",
" file_name = file[\"Key\"].split(\"/\")[-1]\n",
" if not objects_to_download or file_name in objects_to_download:\n",
" out_path = os.path.join(download_dir, file_name)\n",
" _, _, bucket_name, _ = s3_tile_path.split(\"/\", 3)\n",
" s3_client.download_file(Bucket=bucket_name, Key=file[\"Key\"], Filename=out_path, ExtraArgs=boto_params)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"files_to_download = [\"B04.jp2\", \"B07.jp2\"]\n",
"download(\"s3://sentinel-s2-l1c/tiles/38/L/PH/2022/12/14/0/\", \"local/file/path\", files_to_download)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Accessing satellite data from AWS\n",
"\n",
"This example notebook shows how to obtain Sentinel-2 imagery and additional data from [AWS S3 storage buckets](https://aws.amazon.com/s3/). The data at AWS is the same as original S-2 data provided by ESA.\n",
"\n",
@@ -30,7 +203,7 @@
"source": [
"Note: `matplotlib` is not a dependency of `sentinelhub` and is used in these examples for visualizations.\n",
"\n",
"## Searching for available data\n",
"### Searching for available data\n",
"\n",
"For this functionality Sentinel Hub instance ID has to be configured according to [Configuration paragraph](https://sentinelhub-py.readthedocs.io/en/latest/configure.html#sentinel-hub-capabilities)."
]
@@ -159,11 +332,11 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Download data\n",
"### Download data\n",
"\n",
"Once we have found correct tiles or products we can download them and explore the data. Note that in order to do that, you have to provide AWS credentials to the config. Please see also [documentation](https://sentinelhub-py.readthedocs.io/en/latest/configure.html#amazon-s3-capabilities).\n",
"\n",
"### Aws Tile\n",
"#### Aws Tile\n",
"\n",
"Sentinel-2 tile can be uniquely defined either with ESA tile ID (e.g. `L1C_T01WCV_A012011_20171010T003615`) or with tile name (e.g. `T38TML` or `38TML`), sensing time and AWS index. The AWS index is the last number in tile AWS path (e.g. https://roda.sentinel-hub.com/sentinel-s2-l1c/tiles/1/C/CV/2017/1/14/0/ → `0`).\n",
"\n",
@@ -238,9 +411,7 @@
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"scrolled": false
},
"metadata": {},
"outputs": [],
"source": [
"data_list = request.get_data() # This will not redownload anything because data is already stored on disk\n",
@@ -303,7 +474,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Aws Product\n",
"#### Aws Product\n",
"\n",
"Sentinel-2 product is uniquely defined by ESA product ID. We can obtain data for the whole product"
]
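
The corresponding request cell is collapsed here. A minimal sketch with `AwsProductRequest`, using a hypothetical product ID in the ESA naming format:

```python
from sentinelhub.aws import AwsProductRequest

# hypothetical ESA product ID
product_id = "S2A_MSIL1C_20171010T003621_N0205_R002_T01WCV_20171010T003615"

product_request = AwsProductRequest(
    product_id=product_id,
    bands=["B04"],  # optional; omit to download every band
    data_folder="./AwsData",
)
product_request.save_data()
```
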
@@ -331,7 +502,7 @@
"If `bands` parameter is not defined all bands will be downloaded. If `metafiles` parameter is not defined no additional metadata files will be downloaded.\n",
"\n",
"\n",
"### Data into .SAFE structure\n",
"#### Data into .SAFE structure\n",
"\n",
"The data can also be downloaded into .SAFE structure by specifying `safe_format=True`. The following code will download data from upper example again because now data will be stored in different folder structure."
]
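
A minimal sketch of the same request in .SAFE format, reusing the hypothetical product ID from above:

```python
from sentinelhub.aws import AwsProductRequest

# the same hypothetical product ID as above
product_id = "S2A_MSIL1C_20171010T003621_N0205_R002_T01WCV_20171010T003615"

safe_request = AwsProductRequest(
    product_id=product_id,
    data_folder="./AwsData",
    safe_format=True,  # keep the original .SAFE folder structure
)
safe_request.save_data()
```
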
25 changes: 13 additions & 12 deletions examples/batch_statistical.ipynb
@@ -48,7 +48,7 @@
" SHConfig,\n",
" monitor_batch_statistical_job,\n",
")\n",
"from sentinelhub.aws.batch import AwsBatchResults\n",
"from sentinelhub.aws.batch import AwsBatchStatisticalResults\n",
"from sentinelhub.data_utils import get_failed_statistical_requests, statistical_to_dataframe"
]
},
@@ -114,7 +114,11 @@
"outputs": [],
"source": [
"AWS_ID = \"my-aws-access-id\"\n",
"AWS_SECRET = \"my-aws-secret-key\""
"AWS_SECRET = \"my-aws-secret-key\"\n",
"\n",
"# credentials are passed via config\n",
"config.aws_access_key_id = AWS_ID\n",
"config.aws_secret_access_key = AWS_SECRET"
]
},
{
@@ -396,7 +400,7 @@
"metadata": {},
"outputs": [],
"source": [
"rgb_evalscript = \"\"\"\n",
"ndvi_evalscript = \"\"\"\n",
"//VERSION=3\n",
"\n",
"function setup() {\n",
@@ -432,7 +436,7 @@
"\"\"\"\n",
"\n",
"aggregation = SentinelHubStatistical.aggregation(\n",
" evalscript=rgb_evalscript,\n",
" evalscript=ndvi_evalscript,\n",
" time_interval=(\"2020-06-01\", \"2020-06-30\"),\n",
" aggregation_interval=\"P1D\",\n",
" resolution=(10, 10),\n",
@@ -475,7 +479,7 @@
"metadata": {},
"outputs": [],
"source": [
"client = SentinelHubBatchStatistical()"
"client = SentinelHubBatchStatistical(config)"
]
},
{
@@ -647,7 +651,7 @@
"\n",
"With the batch statistical request completed we do a quick inspection of the results. A folder with the same name as our request is located at `OUTPUT_PATH` and it contains multiple JSON files with results. The JSON files are named after the id of the row, not after our custom `identifier`, so we have files from `1.json` all the way to `100.json` on our bucket. \n",
"\n",
"We can transfer them to the local storage with the help of `AwsBatchResults`."
"We can transfer them to the local storage with the help of `AwsBatchStatisticalResults`."
]
},
{
@@ -666,13 +670,10 @@
"source": [
"LOCAL_FOLDER = \"batch_output\"\n",
"\n",
"# credentials are passed via config\n",
"config = SHConfig()\n",
"config.aws_access_key_id = AWS_ID\n",
"config.aws_secret_access_key = AWS_SECRET\n",
"\n",
"result_ids = range(1, 51) # let's only download half of the results for now\n",
"batch_results_request = AwsBatchResults(request, data_folder=LOCAL_FOLDER, feature_ids=result_ids, config=config)\n",
"batch_results_request = AwsBatchStatisticalResults(\n",
" request, data_folder=LOCAL_FOLDER, feature_ids=result_ids, config=config\n",
")\n",
"results = batch_results_request.get_data(save_data=True, max_threads=4, show_progress=True)"
]
},
