
Welcome to the GeoFabrics wiki!

Introduction

The GeoFabrics package includes routines and classes for combining point (e.g. LiDAR), vector (e.g. catchment of interest, infrastructure) and raster (e.g. coarse DEM) data to generate hydrologically conditioned DEM and roughness length raster layers.

This work was initiated under the Endeavour-funded Mā te haumaru ō te wai (Increasing flood resilience across Aotearoa) project. The package methodology and case studies demonstrating its usage can be found in the associated article.

Support has been added for distributed computing environments using Dask, so that DEM generation can be spread across many processors.
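
As a generic illustration of what Dask provides (this is not GeoFabrics code or configuration), the sketch below starts a local cluster that spreads work across several worker processes:

```python
# Generic Dask illustration only; not GeoFabrics-specific code.
# A LocalCluster starts several worker processes on one machine and a
# Client submits work to them.
from dask.distributed import Client, LocalCluster

if __name__ == "__main__":
    cluster = LocalCluster(n_workers=4, threads_per_worker=1)  # hypothetical sizing
    client = Client(cluster)
    print(client.dashboard_link)  # dashboard for monitoring worker activity
    client.close()
    cluster.close()
```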

GeoFabrics is a Python library. Black is used to ensure strict adherence to PEP8 standards and a line length of 88.

DEM generation workflow (diagram)

Fetching geospatial data from web APIs

GeoFabrics utilises the geoapis Python package for downloading publicly available geospatial data. geoapis is currently maintained by the same people as GeoFabrics and was split out of the GeoFabrics repository.
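
The sketch below illustrates the kind of download geoapis performs. The class and parameter names are assumptions recalled from the geoapis documentation rather than verified signatures, and the paths, API key and layer id are hypothetical; check the geoapis repository before relying on them.

```python
# Sketch only: class names, parameter names and the layer id are assumptions
# to verify against the geoapis documentation before use.
import geopandas
import geoapis.lidar
import geoapis.vector

# Hypothetical polygon defining the area of interest.
catchment_polygon = geopandas.read_file("catchment.geojson")

# Download public LiDAR surveys intersecting the catchment from OpenTopography.
lidar_fetcher = geoapis.lidar.OpenTopography(
    cache_path="lidar_downloads", search_polygon=catchment_polygon, verbose=True
)
lidar_fetcher.run()

# Download a LINZ vector layer clipped to the catchment (hypothetical layer id).
vector_fetcher = geoapis.vector.Linz(
    "YOUR_LINZ_API_KEY", bounding_polygon=catchment_polygon, verbose=True
)
layer = vector_fetcher.run(12345)
```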

Documentation

API Docs

Sphinx combined with GitHub Pages is used to create web-hosted documentation from the docstrings embedded in the source code. The documentation is hosted at https://rosepearson.github.io/GeoFabrics/

Wiki pages

Documentation on general usage, installation, testing and contributing to GeoFabrics can be found in the Wiki pages. See the sidebar for a listing of all pages.

Package structure

The following diagram shows the package module and class structures. Inheritance is marked with colour connections, and classes included in other classes are indicated with arrows. The processor module contains pipelines that generate DEMs or other outputs based on the contents of instruction files. The geometry, dem and bathymetry_estimation modules contain classes to help with these generation pipelines.

Package structure diagram. Note: the diagram needs to be updated to include the MeasuredElevationsGenerator, StopbankCrestElevationEstimator, and PatchDemGenerator processor classes.

The framework stages - processor module

The core DEM generation processing chain in GeoFabrics is contained in the processor module. The code flow is controlled within the run routines of the various pipeline classes (all inherited from the same abstract BaseProcessor class) based on the contents of a JSON instruction file passed in at construction. The instruction file (see the instruction file wiki page) specifies the data sources to use during DEM generation as well as other code flow logic. Dask is used to allow parallel processing across many CPU cores when generating a hydrologically conditioned DEM from LiDAR, with more details under performance and bench-marking.

The runner.py script is used as an entry point to the library. It determines which processor class(es) to run based on the instruction file contents. See the instruction file page for details on how runner.py selects which processor classes to run, and the Basic usage page for details on how to run runner.py.
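
For orientation, a minimal sketch of the programmatic route is shown below. It assumes an existing instruction.json and relies only on the behaviour described above, namely that each pipeline class accepts the parsed instructions at construction and exposes a run routine; see the Basic usage page for the supported ways of invoking the pipelines.

```python
# Minimal sketch: drive a single pipeline class directly rather than via
# runner.py. Assumes "instruction.json" exists and follows the schema
# described on the instruction file wiki page.
import json

from geofabrics import processor

with open("instruction.json") as file_pointer:
    instructions = json.load(file_pointer)

# Construct the pipeline with the parsed instructions, then run it.
pipeline = processor.RawLidarDemGenerator(instructions)
pipeline.run()
```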

Each pipeline class supports slightly different functionality, as described below.

  • The RawLidarDemGenerator class - Construct a raw DEM from LiDAR tiles, filtering the LiDAR based on the specified ground classifications. This is computationally intensive. The specified averaging is applied to all LiDAR points within resolution / sqrt(2) of each cell centroid, which ensures all points within the cell are considered. In the case of a coarse DEM being used, linear interpolation is applied to all coarse DEM cell centroids within coarse resolution x sqrt(2), which ensures all cells will have a value (see the sketch after this list). Cell grid alignment is based on the resolution (so two DEMs of the same resolution will have the same alignment).

  • The MeasuredRiverGenerator class - Interpolate measured river section elevations such that the interpolated values can be used by the HydrologicDemGenerator. Measured river section elevations and delineated riverbanks are required.

  • The RiverBathymetryGenerator class - Estimate river bathymetry depths from river flows, slopes, frictions and widths along a main channel, where the channel width and slope are estimated from a DEM generated from LiDAR. More details at River Bathymetry estimation.

  • The WaterwayBedElevationEstimator class - Estimate waterway bed elevations from waterways either defined in Open Street Maps (OSM) or from a file. If using OSM, drains, streams and ditches are always considered, while rivers are ignored if omitted from the widths dictionary (they can instead be considered in the MeasuredRiverGenerator or RiverBathymetryGenerator class). In the case of open waterways (i.e. no tunnel tag), bed elevations are enforced not to increase down slope. In the case of closed waterways (i.e. tunnel tag), the lowest elevation in the area is taken as the bed elevation.

  • The StopbankCrestElevationEstimator class - Estimate stopbank crest elevations from a file specifying the stopbank locations or from Open Street Maps (OSM). If using OSM, embankments and dykes are considered by default unless other features are specified. Crest elevations are estimated as the highest elevation at evenly spaced locations along the stopbanks.

  • The HydrologicDemGenerator class - Construct a new hydrologically conditioned DEM from a raw DEM created by the RawLidarDemGenerator class and the specified ocean (from LINZ), river (from RiverBathymetryGenerator) and waterways (from WaterwayBedElevationEstimator) bathymetry information.

  • The RoughnessGenerator class - Add a roughness length layer to a hydrologically conditioned DEM, where the roughness length is related to the mean height and standard deviation of the ground and ground cover LiDAR points in each grid cell. More details at Roughness length estimation.

  • The PatchDemGenerator class - Add one or more patches to either the 'z' (elevation) or 'zo' (roughness) layers. The patch values can either be grid aligned or misaligned/at a different resolution. If grid aligned, the values are copied exactly; otherwise they are linearly interpolated onto the grid.
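
The search radii quoted for the RawLidarDemGenerator above follow from simple cell geometry. The short check below uses hypothetical resolutions to show that resolution / sqrt(2) reaches the corners of a cell from its centroid, while coarse resolution x sqrt(2) spans the full diagonal of a coarse cell:

```python
# Numerical check of the RawLidarDemGenerator search radii. The resolutions
# are hypothetical examples, not GeoFabrics defaults.
import math

resolution = 10.0         # fine DEM cell size (m), hypothetical
coarse_resolution = 30.0  # coarse DEM cell size (m), hypothetical

# Distance from a square cell's centroid to its furthest corner (half-diagonal).
half_diagonal = resolution * math.sqrt(2) / 2

# LiDAR search radius per cell: resolution / sqrt(2) equals the half-diagonal,
# so every LiDAR point falling inside the cell lies within the search radius.
lidar_radius = resolution / math.sqrt(2)
assert math.isclose(lidar_radius, half_diagonal)

# Coarse-DEM search radius: the full diagonal of a coarse cell, so any cell
# centroid covered by the coarse DEM has at least one coarse centroid in range.
coarse_radius = coarse_resolution * math.sqrt(2)
print(lidar_radius, coarse_radius)  # approx. 7.07 and 42.43
```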

Basic instructions for running these processor pipelines are outlined in the Basic Usage Instructions page.
