
CALCULATION OF POST-FIRE DEBRIS FLOW RISK OVER CONUS

DATA SOURCES

Hydrography - NHDPlus High Resolution (NHDPlus HR)

Digital Elevation Model (DEM) - 30 m spatial resolution

Design storm - recurrence interval 5 years, duration 15 minutes; coverage is missing a segment of the north-western USA

Vegetation - LANDFIRE EVT-140 CONUS (2014)

Soils - STATSGO database


SHAPEFILE SIMPLIFICATION

The NHD shapefiles were too large, and too numerous, to support fast visualization, so we simplified the shapes to enable fast fetching. The shape_simplification_ogr directory contains the simplification scripts (a sketch of the underlying simplification call follows the steps below).

  1. Run simplification_nhd_script.sh to simplify the catchment shapes.
  2. Run update_grid_script.sh to modify the GridCode values inside the generated JSON files so that they are unique.

To simplify the STATSGO shapes, run simplification_statsgo_script.sh.
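
For reference, a minimal sketch of the kind of OGR-based simplification these scripts drive. The file names and the tolerance value are assumptions, not values taken from the scripts:

# Minimal sketch (assumed file names and tolerance): simplify each catchment
# polygon with OGR and write GeoJSON, roughly what simplification_nhd_script.sh
# drives through ogr2ogr.
from osgeo import ogr

src = ogr.Open("NHDPlusCatchment.shp")                 # assumed input shapefile
layer = src.GetLayer()

driver = ogr.GetDriverByName("GeoJSON")
dst = driver.CreateDataSource("catchments_simplified.json")
out_layer = dst.CreateLayer("catchments", geom_type=ogr.wkbPolygon)
out_layer.CreateField(ogr.FieldDefn("GridCode", ogr.OFTInteger))

for feature in layer:
    simplified = feature.GetGeometryRef().Simplify(0.001)  # tolerance assumed
    out = ogr.Feature(out_layer.GetLayerDefn())
    out.SetGeometry(simplified)
    out.SetField("GridCode", feature.GetField("GridCode"))
    out_layer.CreateFeature(out)

dst = None  # close the datasource to flush the GeoJSON to disk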

NOTE: Run the simplification on lattice-77, then copy to and ingest from lattice-101.

DIGITAL ELEVATION MAP ZONAL STATISTICS

Run the following (PowerShell) from the directory containing the DEM TIFFs:

ls -n | %{py PostFireDebrisRiskAnalysis\arcpy_processing\dem_processing\dem_huc_zstats_w_intrc_chk.py $_}
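
The per-file script is not reproduced here; below is a minimal sketch of the zonal-statistics step it presumably performs with arcpy. The zone shapefile and output table names are assumptions:

# Minimal sketch (assumed paths and names): mean elevation per NHD catchment
# via arcpy zonal statistics; the actual script also adds an intersection check.
import sys
import arcpy
from arcpy.sa import ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")

dem_tif = sys.argv[1]                        # DEM TIFF name piped in by PowerShell
ZonalStatisticsAsTable(
    "catchments.shp",                        # assumed zone shapefile
    "GridCode",                              # zone field used throughout this repo
    dem_tif,
    dem_tif.replace(".tif", "_zstats.dbf"),  # assumed output table name
    "DATA",                                  # ignore NoData cells
    "MEAN",
)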

DNBR ZONAL STATISTICS

DNBR: differenced Normalized Burn Ratio, derived here from the EVT vegetation classes.

  1. Import the file EVT-140_CONUS_MAIN\US_140EVT_20180618\Grid\us_140evt into ArcMap.
  2. Open the Attribute Table and add a DNBR field. Right-click the field and select Field Calculator.
  3. Use the Python code in \vegetation\pre_logic.py as the pre-logic block, with the expression DNBR = get_lambda(!Value_1!). (A hypothetical sketch of this pre-logic follows the list.)
  4. Extract the DNBR field into a single-band TIFF file using the Delete Raster Attribute Table function.
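
A hypothetical sketch of the pre-logic shape the Field Calculator expects; the real EVT-to-DNBR mapping lives in vegetation\pre_logic.py and is not reproduced here:

# Hypothetical sketch only -- the actual mapping is in vegetation\pre_logic.py.
# The Field Calculator evaluates get_lambda(!Value_1!) once per attribute row.
def get_lambda(evt_value):
    # Assumed example entries mapping EVT class codes to DNBR values
    # (stored x1000 as integers, matching the later divide-by-1000 step);
    # unknown classes fall back to 0.
    lookup = {
        7011: 450,  # assumed: a forest class
        7080: 120,  # assumed: a grassland class
    }
    return lookup.get(evt_value, 0)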

Run the following to perform Zonal Statistics over NHD catchments:

cd C:\Users\sapmitra\Documents\PostFireDebris\data\CatchmentBoundaries\temp_shapefiles
ls -n | %{py PostFireDebrisRiskAnalysis\arcpy_processing\vegetation_processing\dnbr_zonal_stats_single.py $_}

STATSGO SOIL ZONAL STATISTICS

Run the following from the directory containing the STATSGO soil shapefiles:

cd C:\Users\sapmitra\Documents\PostFireDebris\data\SOILS
ls -n | findstr "shp"| %{py C:\Users\sapmitra\PycharmProjects\FireWatcher\data_proc\soils\soils_all_hucs_zonal_stats_single.py $_}

DESIGN STORM

Run the following from the directory containing Design Storm .asc files:

cd C:\Users\sapmitra\Documents\PostFireDebris\data\DesignStorm
ls -n | %{py PostFireDebrisRiskAnalysis\arcpy_processing\design_storm_processing\dstorm_huc_zonal_intrc_chk_single.py $_}

COMBINE BOUNDARIES

Use the PostFireDebrisRiskAnalysis\combine_boundaries\combine*.py scripts to combine the X1, X2, X3, and Design Storm attributes where catchment boundaries overlap or are duplicated (a sketch of the idea follows).
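
The combining logic lives in those scripts; below is a minimal sketch under the assumption that duplicate GridCode rows are collapsed by averaging. The file and column names are assumptions:

# Minimal sketch (assumed CSV layout and aggregation): collapse duplicate
# catchment rows that arise where boundaries overlap, keyed on GridCode.
import pandas as pd

df = pd.read_csv("x1_elevation_zstats.csv")  # assumed input name
combined = df.groupby("GridCode", as_index=False).mean(numeric_only=True)
combined.to_csv("x1_elevation_combined.csv", index=False)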

THi15

Run combine_boundaries/compute_thi15.py on the lattice machines to compute THi15. DNBR (X2) must first be scaled by dividing it by 1000 (see the sketch below).
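
A minimal sketch of just that scaling step; the THi15 formula itself lives in compute_thi15.py, and the file and column names here are assumptions:

# Minimal sketch (assumed names): rescale DNBR (X2) before it enters the
# THi15 computation, per the divide-by-1000 note above.
import pandas as pd

df = pd.read_csv("combined_attributes.csv")  # assumed input name
df["X2"] = df["X2"] / 1000.0
df.to_csv("combined_attributes_scaled.csv", index=False)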

POST-FIRE RISK

Run PostFireDebrisRiskAnalysis\thi15\compare_thi15_DS.py to compare THi15 against the design storm (a sketch of the comparison follows).
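
A minimal sketch of the comparison, assuming THi15 acts as a rainfall threshold that the 5-year, 15-minute design storm is tested against. The column names and the at-risk rule are assumptions:

# Minimal sketch (assumed columns and rule): flag a catchment as at risk when
# its design storm value meets or exceeds its THi15 threshold.
import pandas as pd

df = pd.read_csv("thi15_with_design_storm.csv")    # assumed merged input
df["at_risk"] = df["design_storm"] >= df["THi15"]  # assumed comparison rule
df.to_csv("postfire_risk.csv", index=False)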

INGESTION INTO MONGODB


COMMANDS USED

db.createCollection("nhd_shapes")
db.nhd_shapes.createIndex({geometry : "2dsphere"})
db.X1_Elevation.createIndex({"GridCode": 1},{unique: true});
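
The same setup can also be done from Python; a sketch with pymongo, taking the port and database names from the mongoimport command further down:

# Sketch: collection and index setup via pymongo, mirroring the shell
# commands above (port and db name from the mongoimport command below).
from pymongo import MongoClient, GEOSPHERE, ASCENDING

db = MongoClient("localhost", 27018)["sustaindb"]
db.create_collection("nhd_shapes")
db["nhd_shapes"].create_index([("geometry", GEOSPHERE)])
db["X1_Elevation"].create_index([("GridCode", ASCENDING)], unique=True)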

DATA INGESTION

The PostFireDebrisRiskAnalysis\shape_simplification_ogr\update_grid_script.sh script excludes or re-includes any shapefiles that were missed in the previous run due to MongoDB incompatibility; it also updates the GridCode and copies it to the top level of each JSON file.

Following this, data ingestion is done using PostFireDebrisRiskAnalysis\ingestion_with_check\insert_script_nhd.sh, which logs any incompatible NHD shapes to the console.

Computed attributes for each NHD catchment are appended to the corresponding document in the nhd_shapes collection using mongoimport's merge mode, as below:

mongoimport --port 27018 --db sustaindb --collection nhd_shapes --mode merge --upsertFields GridCode --headerline --type csv --file thi15.csv
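
The same merge can also be expressed from Python; a sketch with pymongo, assuming thi15.csv has a GridCode column plus the computed attributes:

# Sketch: merge per-catchment attributes into nhd_shapes by GridCode,
# mirroring the mongoimport --mode merge command above. Note that
# mongoimport infers numeric types from the CSV; here values stay
# strings unless converted explicitly.
import csv
from pymongo import MongoClient

coll = MongoClient("localhost", 27018)["sustaindb"]["nhd_shapes"]
with open("thi15.csv", newline="") as f:
    for row in csv.DictReader(f):
        grid_code = int(row.pop("GridCode"))  # assumes GridCode is numeric in the db
        coll.update_one({"GridCode": grid_code}, {"$set": row}, upsert=True)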
