SHAFTS


SHAFTS is a deep-learning-based Python package for Simultaneous extraction of building Height And FootprinT from Sentinel Imagery.

Citation

More details can be found in the model description paper.

Please cite the paper if you use SHAFTS in your research.

@Article{gmd-16-751-2023,
AUTHOR = {Li, R. and Sun, T. and Tian, F. and Ni, G.-H.},
TITLE = {SHAFTS (v2022.3): a deep-learning-based Python package for simultaneous extraction of building height and footprint from sentinel imagery},
JOURNAL = {Geoscientific Model Development},
VOLUME = {16},
YEAR = {2023},
NUMBER = {2},
PAGES = {751--778},
URL = {https://gmd.copernicus.org/articles/16/751/2023/},
DOI = {10.5194/gmd-16-751-2023}
}

Installation

System Requirements

SHAFTS is developed and tested on Linux and macOS.

It should also work on Windows, although we have not tested it there; on Windows, you may consider running SHAFTS under the Windows Subsystem for Linux.

To use SHAFTS, you need to install its dependencies, which can be set up via mamba using the provided env.yml file:

mamba env create -f env.yml

Once the environment is created, activate it via:

mamba activate shafts-dev

Install SHAFTS

You can install SHAFTS from PyPI or from source.

Please activate the environment created by mamba before installing SHAFTS.

Install from PyPI

pip install shafts

Install from source

  1. Clone the repository:
git clone https://github.com/LllC-mmd/SHAFTS.git
  1. Install the package:

Change your current directory to the root of the repository and run:

make test

This will install the package in editable mode and run the tests.

Data Download

The input data of SHAFTS may include:

  • Sentinel-1 imagery

  • Sentinel-2 imagery

  • SRTM data

SHAFTS contains functions which can download the above data directly from Google Earth Engine.

Note that, according to the guidance for exporting data from Google Earth Engine, data cannot be exported directly to local devices. Thus, Google Drive is recommended as the export destination, from which the exported data can then be downloaded to local devices.

An example of downloading a Sentinel-2 image via sentinel2_download_by_extent is given as follows:

from shafts.utils.GEE_ops import sentinel2_download_by_extent

# ---specify the spatial extent and year for Sentinel-2's images
lon_min = -87.740
lat_min = 41.733
lon_max = -87.545
lat_max = 41.996
year = 2018

# ---define the output settings
dst = "Drive"
dst_dir = "Sentinel-2_export"
file_name = "Chicago_2018_sentinel_2.tif"

# ---start data downloading
sentinel2_download_by_extent(lon_min=lon_min, lat_min=lat_min, lon_max=lon_max, lat_max=lat_max,
                             year=year, dst_dir=dst_dir, file_name=file_name, dst=dst)

SHAFTS also provides functions such as sentinel1_download, sentinel2_download and srtm_download to download images in batches driven by a .csv file.
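As an illustrative sketch of such a batch workflow (the .csv schema below is an assumption, not SHAFTS's actual format), one could parse one spatial extent per row and feed each row to the single-extent downloader shown above:

```python
import csv
import io

# Hypothetical CSV schema: one download task per row.
# The column names are illustrative, not SHAFTS's documented schema.
CSV_TEXT = """city,lon_min,lat_min,lon_max,lat_max,year
Chicago,-87.740,41.733,-87.545,41.996,2018
London,-0.50,51.00,0.40,51.90,2020
"""

def parse_tasks(csv_text):
    """Parse download tasks from CSV text into keyword-argument dicts."""
    tasks = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        tasks.append({
            "lon_min": float(row["lon_min"]),
            "lat_min": float(row["lat_min"]),
            "lon_max": float(row["lon_max"]),
            "lat_max": float(row["lat_max"]),
            "year": int(row["year"]),
            "dst": "Drive",
            "dst_dir": "Sentinel-2_export",
            "file_name": f"{row['city']}_{row['year']}_sentinel_2.tif",
        })
    return tasks

tasks = parse_tasks(CSV_TEXT)
# Each task could then be submitted via the single-extent downloader:
# from shafts.utils.GEE_ops import sentinel2_download_by_extent
# for t in tasks:
#     sentinel2_download_by_extent(**t)
print(len(tasks))  # 2
```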

Building Height and Footprint prediction

After preparing the necessary images above, building height and footprint information can be predicted by:

  • pred_height_from_tiff_DL_patch: using deep-learning-based (DL) models trained by Single-Task-Learning (STL).

  • pred_height_from_tiff_DL_patch_MTL: using deep-learning-based (DL) models trained by Multi-Task-Learning (MTL).

Since the prediction step involves considerably more parameter settings than data downloading, users can refer to the sample prediction script named minimum_case_run.py under the example directory. If batch processing is desired, users can refer to the sample prediction script named case_run.py in the same directory.

Here, we offer pretrained DL models (based on PyTorch) for building height and footprint prediction via the link on Google Drive. All pretrained DL models are stored as checkpoint.pth.tar.

Note that for each target resolution, STL/MTL models trained with or without SRTM data can be used to make predictions:

  • For STL models, models with SRTM data are stored under experiment_1 and models without SRTM data are stored under experiment_2.

  • For MTL models, models with SRTM data are stored under experiment_1 and models without SRTM data are stored under experiment_2. Since MTL models predict both building height and footprint, we offer full sets of MTL models only under the directory of models for building height prediction, named height.

Note that all models offered via the above link require SRTM images as one of the input variables, although additional pretrained DL models produced during package development were collected for performance comparison.
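The directory convention above can be captured in a small helper. This is an illustrative sketch only (the function and the exact path layout for STL models are assumptions, not part of SHAFTS's API), following the experiment_1/experiment_2 split and the height directory for MTL models:

```python
def pretrained_model_dir(mtl, with_srtm):
    """Map training mode and SRTM usage to a pretrained-model subdirectory.

    Follows the convention described above: models trained with SRTM data
    live under experiment_1, models without SRTM data under experiment_2,
    and full MTL model sets sit under the building-height directory 'height'.
    """
    experiment = "experiment_1" if with_srtm else "experiment_2"
    if mtl:
        return f"height/{experiment}"
    return experiment

print(pretrained_model_dir(mtl=True, with_srtm=True))    # height/experiment_1
print(pretrained_model_dir(mtl=False, with_srtm=False))  # experiment_2
```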

Integration with Google Cloud Ecosystem

If users want to generate building height and footprint maps without downloading Sentinel data to a local machine, SHAFTS offers the GBuildingMap function, which streamlines the workflow of satellite image preprocessing, TFRecord-based dataset management, and DL model inference.

An example usage can be given as follows:

from shafts.inference_gcloud import GBuildingMap

# ---specify the mapping extent by the minimum/maximum of longitude and latitude
lon_min = -0.50
lat_min = 51.00
lon_max = 0.4
lat_max = 51.90

# ---specify the year (ranging between 2018 and 2022)
year = 2020

# ---specify a local path to the pretrained SHAFTS's Tensorflow-based models
pretrained_weight = './dl-models/height/check_pt_senet_100m_MTL_TF_gpu'

# ---specify a local output folder for storing building height and footprint maps
output_folder = './results'

# ---specify the Google Cloud Service configuration
GCS_config = {
    'SERVICE_ACCOUNT': '***** Google Cloud Service Account Name *****',
    'GS_ACCOUNT_JSON': '***** Path to Google Cloud Service Account Credential *****',
    'BUCKET': '***** name of the bucket set for dataset exporting in Google Cloud Storage *****',
    'DATA_FOLDER': '*****  name of the folder which stores the exported dataset under the `BUCKET` *****',
}

# ---launch building height and footprint mapping
GBuildingMap(
    lon_min,
    lat_min,
    lon_max,
    lat_max,
    year,
    dx=0.09,
    dy=0.09,
    precision=3,
    batch_size=256,
    pretrained_model=pretrained_weight,
    GCS_config=GCS_config,
    target_resolution=100,
    num_task_queue=30,
    num_queue_min=2,
    file_prefix='_',
    padding=0.01,
    patch_size_ratio=1,
    s2_cloud_prob_threshold=20,
    s2_cloud_prob_max=80,
    MTL=True,
    removed=True,
    output_folder=output_folder,
)

Executing this function requires the following preparatory steps:

  1. Install and initialize the gcloud command-line interface. Users can refer to this link for details.

  2. Create a Google Cloud Project in the Google Cloud console for building height and footprint prediction.

  3. Enable the Earth Engine API for the project.

  4. Set up a bucket in Google Cloud Storage (GCS) for storing intermediate datasets exported from Google Earth Engine. Please note that the name of the created bucket and the name of its folder for intermediate datasets correspond to BUCKET and DATA_FOLDER in the GCS_config required by the GBuildingMap function. An example of the structure of the GCS bucket is given as follows:

    BUCKET/
    |-- DATA_FOLDER/
    |   |-- tmp-exported-dataset.tfrecord.gz
    |   |-- ...
  5. Create a Google Cloud Service account for the project. If there is already an account, you can keep it without creating an additional one. Please note that the e-mail name of the service account corresponds to SERVICE_ACCOUNT in the GCS_config required by the GBuildingMap function.

  6. Create a private key in JSON format for the service account by clicking the menu for that account via : > key > JSON. Please download the JSON key file locally; the path to the JSON key corresponds to GS_ACCOUNT_JSON in the GCS_config required by the GBuildingMap function.
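The correspondence between these steps and GCS_config can be checked before launching a job. The helper below is an illustrative sketch, not part of SHAFTS; it simply flags missing or empty entries in a GCS_config-style dictionary (the account name and bucket values shown are placeholders):

```python
REQUIRED_GCS_KEYS = ("SERVICE_ACCOUNT", "GS_ACCOUNT_JSON", "BUCKET", "DATA_FOLDER")

def validate_gcs_config(cfg):
    """Return the list of required keys that are missing or empty in cfg."""
    return [k for k in REQUIRED_GCS_KEYS if not cfg.get(k)]

# A config whose credential path has not been filled in would be flagged:
cfg = {
    "SERVICE_ACCOUNT": "mapper@my-project.iam.gserviceaccount.com",  # placeholder
    "GS_ACCOUNT_JSON": "",  # path to the downloaded JSON key goes here
    "BUCKET": "my-shafts-bucket",    # placeholder
    "DATA_FOLDER": "dataset_tmp",    # placeholder
}
print(validate_gcs_config(cfg))  # ['GS_ACCOUNT_JSON']
```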

Here we should note that:

  • The pretrained models should be based on Tensorflow's implementation. We offer pretrained MTL models for building height and footprint mapping at a resolution of 100 m via the link on Google Drive. If your system has CUDA-supported GPUs, please download check_pt_senet_100m_MTL_TF_gpu; otherwise, please download check_pt_senet_100m_MTL_TF.

  • Based on preliminary tests, building height and footprint mapping for an area of $0.9^\circ\times 0.9^\circ$ might take 20-40 minutes, with the majority of the time spent on exporting satellite images. Please limit the size of target areas when running this functionality on a laptop.
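As a rough sanity check on job size, the dx/dy tiling in the GBuildingMap example above implies a tile count that grows with the mapping extent. The short sketch below is illustrative (not SHAFTS code) and counts the tiles for the example extent:

```python
import math

def count_tiles(lon_min, lat_min, lon_max, lat_max, dx, dy):
    """Number of dx-by-dy tiles needed to cover the bounding box.

    Rounding before ceil() guards against floating-point artifacts
    in the degree arithmetic (e.g. 0.9 / 0.09 not being exactly 10).
    """
    nx = math.ceil(round((lon_max - lon_min) / dx, 6))
    ny = math.ceil(round((lat_max - lat_min) / dy, 6))
    return nx * ny

# The example extent (-0.50..0.4 lon, 51.00..51.90 lat) with dx = dy = 0.09
# covers a 0.9° x 0.9° area, i.e. a 10 x 10 grid of tiles:
print(count_tiles(-0.50, 51.00, 0.4, 51.90, 0.09, 0.09))  # 100
```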
