diff --git a/README.md b/README.md index 31b5b48..c716463 100644 --- a/README.md +++ b/README.md @@ -1,7 +1,7 @@ ![OSARIS](https://cryo-tools.org/wp-content/uploads/2019/01/OSARIS-logo-600px.png) ### Open Source SAR Investigation System -OSARIS provides a framework to process large stacks of synthetic aperture radar (SAR) data in High Performance Computing (HPC) environments. +OSARIS provides a framework to process large stacks of [Sentinel-1](http://www.esa.int/Our_Activities/Observing_the_Earth/Copernicus/Sentinel-1/Introducing_Sentinel-1) Synthetic Aperture Radar (SAR) data in High Performance Computing (HPC) environments. ## Table of Contents * [Introduction](#introduction) @@ -10,6 +10,7 @@ OSARIS provides a framework to process large stacks of synthetic aperture radar * [Installation](#installation) * [Initial configuration](#initial-configuration) * [Launch](#launch) +* [Tutorial](#tutorial) * [Tipps](#tipps) * [Modules](#modules) * [Concept](#module-concept) @@ -26,14 +27,14 @@ With the advent of the two Sentinel 1 satellites, high-quality Synthetic Apertur Key features of OSARIS are: - Convenient configuration (only one main config file, documented templates for all configuration) - Modular structure for flexible processing schemes, modules for a variety of tasks readily available -- Minimal software requirements (bash, csh, GMT, GMTSAR, Slurm) +- Minimalist software requirements (bash, csh, GMT, GMTSAR, Slurm) - Automatic download of relevant Sentinel-1 scenes based on area of interest (AOI) and time interval - Automatic download and assignment of orbits - Merging of multiple swaths - Merging of bursts from multiple slices, omission of bursts that are outside the AOI - Single-master and pair-wise processing schemes - Clear and simple directory structure and file naming -- Output files in the form of analysis-ready geocoded stacks of grid files, optionally also cut to AOI +- Output files in the form of analysis-ready geocoded stacks of grid files, optionally cut to AOI extent - Processing time measurements (wall clock versus total processing time) - Detailed report and log files - Summary PDF showing key processing results for each time step (see module Summary PDF) @@ -45,7 +46,7 @@ Key features of OSARIS are: 1. A working installation of [GMTSAR](http://gmt.soest.hawaii.edu/projects/gmt5sar/wiki) 2. A working SLURM environment, further info and installation instructions at https://slurm.schedmd.com/ -3. [ImageMagick](https://www.imagemagick.org/script/index.php) (optional, required only by the 'Create PDF summary' module) +3. [ImageMagick](https://www.imagemagick.org/script/index.php) (optional, required only by the 'Summary PDF' module) ### Installation Just clone the OSARIS repository to your machine: @@ -75,8 +76,11 @@ Go to the OSARIS folder. Launch your run with ./osaris.sh ./config/.config ``` +## Tutorial +If you are new to Sentinel-1 processing, take a look at the [OSARIS Tutorial at CryTools.org](https://cryo-tools.org/tools/osaris/osaris-tutorial-1/) that will guide you through an example, starting from the very basics of finding data. + ## Tipps -- Launch OSARIS from within a [tmux](https://github.com/tmux/tmux/wiki) or [screen](https://www.gnu.org/software/screen/) session to detach your terminal session from the process. Doing this will prevent the OSARIS processing to fail in case you lose connection, your terminal crashes, etc. (besides numerous other advantages of using tmux/screen). 
+- Launch OSARIS from within a [tmux](https://github.com/tmux/tmux/wiki) or [screen](https://www.gnu.org/software/screen/) session to detach your terminal session from the process. Doing this will prevent the OSARIS processing from failing in case you lose connection, your terminal crashes, etc. Tmux's feature of arranging multiple windows and panes is extremely handy for monitoring log files during processing (see below). - Start with relatively few scenes and a minimum of modules. Check the output and optimize your configuration. When the basic processing results fit your needs, use the options to turn off pre- and interferometric processing and start adding modules. @@ -88,7 +92,7 @@ to monitor what is going on. - After processing, take a look at the reports in 'Output/Reports'. -- Use the 'create_pdf_summary' module to get an overview of the interferometric processing results. +- Use the 'summary_pdf' module to get an overview of the interferometric processing results. - Make sure the DEM extent is not much bigger than the extent of the scenes you actually want to process. A big DEM will need a lot of extra processing time. @@ -98,9 +102,9 @@ to monitor what is going on. Modules allow to execute additional processing routines at different stages, i.e. after file downloads, after file extraction, after GMTSAR processing, and after post-processing (more module hooks may be added in the future). As such, OSARIS modules facilitate designing processing schemes that fit individual needs while keeping the core code as compact as possible. -In order to execute a module, go to the 'MODULES' section in the config file and put the module name (i.e. the name of the subdirectory of modules/) into the array of the adequate hook. For example, if you would like to execute 'Stable Ground Point Identification', 'Harmonize Interferogram Time Series', and 'Create PDF Summary' after GMTSAR interferometric processing, this would be: +In order to execute a module, go to the 'MODULES' section in the config file and put the module name (i.e. the name of the subdirectory of modules/) into the array of the appropriate hook. For example, if you would like to execute 'Stable Ground Point Identification', 'Harmonize Grids', and 'Summary PDF' after GMTSAR interferometric processing, this would be: ```sh -post_processing_mods=( SGP_identification harmonize_intfs create_pdf_summary ) +post_processing_mods=( SGP_identification harmonize_grids summary_pdf ) ``` When multiple modules are allocated at one hook the modules will be executed in the same order they appear in the array. Most modules require a config file; A template configuration should be in templates/modules-config which must be copied to the config directory for the module to work: @@ -140,11 +144,6 @@ Shift grid files relative to 'stable ground points'. Typically used to harmonize Call: harmonize_grids Status: beta -#### Summary PDF -Preview key processing results in a single graphic overview. Requires ImageMagick. -Call: summary_pdf -Status: beta - #### Ping Wake up sleeping nodes. Call: ping @@ -160,6 +159,11 @@ Calculate statistics for a series of grid files. Call: statistics Status: beta +#### Summary PDF +Preview key processing results in a single graphic overview. Requires ImageMagick. +Call: summary_pdf +Status: beta + #### Timeseries xy Extract values for particular coordinates throughout a series of grids (e.g. coherence, phase).
Call: timerseries_xy @@ -172,7 +176,6 @@ Status: beta - ### Module development The easiest way to get started developing your own OSAIRS module is by copying the template files prepared for this purpose: ```console @@ -205,6 +208,7 @@ For all reports and inquiries please contact [David Loibl](https://hu.berlin/dav ## Crediting OSARIS As stated in the license file, you are free to use OSARIS in whatever way you like. If you publish OSARIS results, e.g. in scientific publications, please credit OSARIS using the Zenodo DOI: + [![DOI](https://zenodo.org/badge/108271075.svg)](https://zenodo.org/badge/latestdoi/108271075) diff --git a/doc/CHANGELOG.md b/doc/CHANGELOG.md index ae984ae..53f1a13 100644 --- a/doc/CHANGELOG.md +++ b/doc/CHANGELOG.md @@ -4,8 +4,23 @@ All notable changes to this OSARIS will be documented in this file. The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/) and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html). - -## [0.7.0] - 2019-01-25 + +## [0.7.2] - 2019-02-07 +### Added +- Parallel processing option for Summary PDF Module +- Improved burst handling, now also stripping unused bursts in single slice configurations + +### Bugs fixed +- Reporting in GACOS correction module +- File downloads + + +## [0.7.1] - 2019-01-28 +### Bugs fixed +- File downloads + + +## [0.7.0] - 2019-01-25 ### Added - Functionality to merge multiple swaths - Cutting of output files to an area of interest defined by boundary box coordinates in the config file diff --git a/lib/GMTSAR-mods/align_cut_tops.csh b/lib/GMTSAR-mods/align_cut_tops.csh index afdef26..36f0ec1 100755 --- a/lib/GMTSAR-mods/align_cut_tops.csh +++ b/lib/GMTSAR-mods/align_cut_tops.csh @@ -175,7 +175,9 @@ gmt grdmath atmp.grd FLIPUD = a.grd # # make the new PRM files and SLC # +echo; echo "Executing make_s1a_tops with parameters "$mxml" "$mtiff" "$mpre 1 make_s1a_tops $mxml $mtiff $mpre 1 +echo; echo "Executing make_s1a_tops with parameters "$sxml" "$stiff" "$spre" 1 r.grd a.grd" make_s1a_tops $sxml $stiff $spre 1 r.grd a.grd # # resamp the slave and set the aoffset to zero diff --git a/lib/PP-pairs.sh b/lib/PP-pairs.sh index 57c63b0..acd1433 100644 --- a/lib/PP-pairs.sh +++ b/lib/PP-pairs.sh @@ -34,8 +34,8 @@ log_PATH=$base_PATH/$prefix/Output/Log job_ID=${previous_scene:15:8}--${current_scene:15:8} -proc_mode=$( cat $work_PATH/proc_mode.txt ) -echo "Processing mode: $proc_mode" +# proc_mode=$( cat $work_PATH/proc_mode.txt ) +# echo "Processing mode: $proc_mode" mkdir -pv $work_PATH/$job_ID/F$swath/raw mkdir -pv $work_PATH/$job_ID/F$swath/topo @@ -48,21 +48,28 @@ cd $work_PATH/raw/$job_ID-aligned/ echo; echo "- - - - - - - - - - - - - - - - - - - - " -if [ "$proc_mode" = "multislice" ]; then - echo "Starting align_cut_tops.csh with options:" - echo "Scene 1: $previous_scene" - echo "Orbit 1: $previous_orbit" - echo "Scene 2: $current_scene" - echo "Orbit 2: $current_orbit"; echo - $OSARIS_PATH/lib/GMTSAR-mods/align_cut_tops.csh $previous_scene $previous_orbit $current_scene $current_orbit dem.grd -else - echo "Starting align_cut_tops.csh with options:" - echo "Scene 1: $previous_scene" - echo "Orbit 1: $previous_orbit" - echo "Scene 2: $current_scene" - echo "Orbit 2: $current_orbit"; echo - align_tops.csh $previous_scene $previous_orbit $current_scene $current_orbit dem.grd -fi +echo "Starting align_cut_tops.csh with options:" +echo "Scene 1: $previous_scene" +echo "Orbit 1: $previous_orbit" +echo "Scene 2: $current_scene" +echo "Orbit 2: $current_orbit"; echo 
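+# align_cut_tops.csh is the OSARIS-modified variant of GMTSAR's align_tops.csh (see lib/GMTSAR-mods)
+# and is now called for every scene pair with the arguments echoed above; the former
+# multislice/singleslice branching is kept below only as a commented-out reference.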
+$OSARIS_PATH/lib/GMTSAR-mods/align_cut_tops.csh $previous_scene $previous_orbit $current_scene $current_orbit dem.grd + +# if [ "$proc_mode" = "multislice" ]; then +# echo "Starting align_cut_tops.csh with options:" +# echo "Scene 1: $previous_scene" +# echo "Orbit 1: $previous_orbit" +# echo "Scene 2: $current_scene" +# echo "Orbit 2: $current_orbit"; echo +# $OSARIS_PATH/lib/GMTSAR-mods/align_cut_tops.csh $previous_scene $previous_orbit $current_scene $current_orbit dem.grd +# else +# echo "Starting align_cut_tops.csh with options:" +# echo "Scene 1: $previous_scene" +# echo "Orbit 1: $previous_orbit" +# echo "Scene 2: $current_scene" +# echo "Orbit 2: $current_orbit"; echo +# align_tops.csh $previous_scene $previous_orbit $current_scene $current_orbit dem.grd +# fi # INTERFEROMETRIC PROCESSING diff --git a/lib/prepare-data.sh b/lib/prepare-data.sh index ed8a062..dc6db9f 100755 --- a/lib/prepare-data.sh +++ b/lib/prepare-data.sh @@ -53,7 +53,9 @@ else mkdir -p $work_PATH/preprocessing/filelists - rm -f $work_PATH/preprocessing/filelists/filelist-multislice.txt $work_PATH/preprocessing/filelists/filelist-singleslice.txt + mkdir -p $work_PATH/preprocessing/raw + filelist_PATH="$work_PATH/preprocessing/filelists/filelist.txt" + rm -f $filelist_PATH # Check whether there are scenes originating from the same pass in orig directory ... @@ -62,303 +64,241 @@ else cd $work_PATH/orig/ scene_list=(*.SAFE) - i=0; for scene in ${scene_list[@]}; do scene_dates[$i]=${scene:17:8}; ((i++)); done - readarray -t scene_dates_sorted < <(printf '%s\0' "${scene_dates[@]}" | sort -z | xargs -0n1) - - multiple_slices=() - single_slices=() - i=0 - for scene_date in ${scene_dates_sorted[@]}; do - if [ "$i" -gt 0 ]; then - if [ "${scene_dates_sorted[$i]}" -eq $prev_scene_date ]; then - echo; echo "Found two matching scenes for $prev_scene_date." - multiple_slices+=($prev_scene_date) - find . -maxdepth 1 -name "*$prev_scene_date*.SAFE" - find . -maxdepth 1 -name "*$prev_scene_date*.SAFE" >> $work_PATH/preprocessing/filelists/filelist-multislice.txt - elif [[ ! " ${multiple_slices[@]} " =~ " ${prev_scene_date} " ]] && [ ! "${scene_dates_sorted[$i]}" -eq $prev_scene_date ]; then - echo "No matching scene for $prev_scene_date found. " - find . -maxdepth 1 -name "*$prev_scene_date*.SAFE" >> $work_PATH/preprocessing/filelists/filelist-singleslice.txt - single_slices+=($prev_scene_date) - fi - prev_scene_date=${scene_dates_sorted[$i]} - else - prev_scene_date=${scene_dates_sorted[$i]} - fi + i=0; + for scene in ${scene_list[@]}; do + echo $scene >> $filelist_PATH + scene_dates[$i]=${scene:17:8} ((i++)) - done - + done + readarray -t scene_dates_sorted < <(printf '%s\0' "${scene_dates[@]}" | sort -z | xargs -0n1) + scene_dates_unique=($(echo "${scene_dates_sorted[@]}" | tr ' ' '\n' | sort -u | tr '\n' ' ')) if [ "$debug" -ge 1 ]; then - echo; echo; echo "Multiple slice scenes (${#multiple_slices[@]}):" - echo "${multiple_slices[@]}" - echo; echo "Single slice scenes (${#single_slices[@]}):" - echo "${single_slices[@]}" + echo; echo "Scene dates sorted:" + echo "${scene_dates_sorted[@]}" + echo; echo "Scene dates unique:" + echo "${scene_dates_unique[@]}" fi + + + for scene_date in ${scene_dates_unique[@]}; do + + # Check if there are multiple files for a date + file_count=$( grep $scene_date $filelist_PATH | wc -l ) - if [ "${#multiple_slices[@]}" -ge 1 ]; then - - echo "Found ${#multiple_slices[@]} scenes with multiple slices for one date." - echo "Continuing with merging of slices." 
- echo "(In case this is unintended, make sure your Input directory contains only one scene per date)" - echo "multislice" > $work_PATH/proc_mode.txt - if [ "${#single_slices[@]}" -ge 1 ]; then - # TODO (?): In case of both multi and single slice scenes, process both with optimized parameterization - echo "NOTICE: There are ${#single_slices[@]} scenes for which only one slice was found. These will not be processed." - fi - filelist_PATH="$work_PATH/preprocessing/filelists/filelist-multislice.txt" - elif [ "${#single_slices[@]}" -ge 1 ]; then - echo "Found no scenes with multiple slices for one date." - echo "singleslice" > $work_PATH/proc_mode.txt - filelist_PATH="$work_PATH/preprocessing/filelists/filelist-singleslice.txt" - fi + echo; echo "Scene date: $scene_date" + echo "File count: $file_count" + + orbit_match=() + S1_files=() + for ((i=1;i<=$file_count;++i)); do + + S1_files[$i]=$( grep $scene_date $filelist_PATH | awk -v l="${i}" 'NR==l' ) + S1_file=${S1_files[$i]} + echo "i: $i" + echo "S1_files[i]: ${S1_files[$i]}" + # echo "S1_file: ${S1_file}" + cp -n $work_PATH/orig/${S1_files[$i]}/manifest.safe $work_PATH/raw/${S1_files[$i]:17:8}_manifest.safe + target_sensor=$( echo ${S1_files[$i]:0:3} | tr '[:lower:]' '[:upper:]' ) + target_date=$( date -d \ + "${S1_files[$i]:17:8} ${S1_files[$i]:26:2}:${S1_files[$i]:28:2}:${S1_files[$i]:30:2}" '+%s' ) + echo "Target date raw: ${S1_files[$i]:17:8} ${S1_files[$i]:26:2}:${S1_files[$i]:28:2}:${S1_files[$i]:30:2}" + echo "Target date sec: $target_date" + + orbit_counter=1 + orbit_match="none" + + for orbit in $orbit_list; do + if [ ! -z "$orbit" ] && [ "${orbit:42:8}" != " " ]; then + date_string="${orbit:42:8} ${orbit:51:2}:${orbit:53:2}:${orbit:55:2}" + orbit_startdate=$( date -d "$date_string" '+%s' ) + orbit_starttime=${orbit:34:6} + orbit_sensor=${orbit:0:3} + + if [ "$debug" -eq 2 ]; then + echo "Now working on orbit #: $orbit_counter - $orbit" + echo 'Orbit sensor: ' $orbit_sensor + echo 'Orbit start date: ' $orbit_startdate + echo 'Orbit start time: ' $orbit_starttime + fi + + if [ "$orbit_sensor" == "$target_sensor" ]; then + if [ -z ${prev_orbit_startdate} ] || [ -z ${orbit_startdate} ]; then + echo "Orbit date not configured properly ($prev_orbit_startdate - $orbit_startdate)... Skipping." + prev_orbit=$orbit + prev_orbit_startdate=$orbit_startdate + else + if [ "$target_date" -ge "$prev_orbit_startdate" ] && [ "$target_date" -lt "$orbit_startdate" ]; then + # Looks like we found a matching orbit + # TODO: perform further checks, e.g. end_date overlap + + orbit_match=$prev_orbit + echo "Found matching orbit file: $orbit_match" + ln -sf $orbits_PATH/$orbit_match $work_PATH/raw/ + echo $orbit_match + break + else + # No match again, get prepared for another round + prev_orbit=$orbit + prev_orbit_startdate=$orbit_startdate + fi + + fi + fi - # Walk through all files in list ... - i=1 - while read -r current_file; do + fi + + ((orbit_counter++)) + done - current_file=${current_file:2} - if [ "$debug" -ge 1 ]; then - echo; echo; echo "Iteration $i" - echo "Current file: $current_file" - echo "Current date: ${current_file:17:8}" - echo "Previous file: $prev_file" - echo "Previous date: ${prev_file:17:8}" - fi + done + # If a matching orbits were found -> Prepare files and add to respective data.in files + if [ ! "${orbit_match}" == "none" ]; then - - # In each second iteration, cut the pair to AoI extent ... 
- if [ "$i" -gt 1 ]; then - if [ "${current_file:17:8}" -eq "${prev_file:17:8}" ]; then + for swath in ${swaths_to_process[@]}; do + # Start the slice merge and burst cut procedure + + name_stems=() + prefixes=() + for S1_file in ${S1_files[@]}; do + cd $work_PATH/orig/${S1_file}/annotation/ + tmp_stem=$(ls *iw$swath*) + name_stems+=(${tmp_stem::-4}) + done - S1_file=$current_file - target_scene=${S1_file} - target_sensor=$( echo ${target_scene:0:3} | tr '[:lower:]' '[:upper:]' ) - - if [ -z $target_scene ]; then - echo "Skipping scene $target_scene ..." - else - target_date=$( date -d "${target_scene:17:8} ${target_scene:26:2}:${target_scene:28:2}:${target_scene:30:2}" '+%s' ) - echo "Target date raw: ${target_scene:17:8} ${target_scene:26:2}:${target_scene:28:2}:${target_scene:30:2}" - echo "Target date sec: $target_date" + if [ $debug -ge 1 ]; then + echo; echo "Name stems:" + echo "${name_stems[@]}" fi - + # Sort stems by time + rm -f $work_PATH/stem_list.tmp + i=1 + for stem in ${name_stems[@]}; do + echo "${stem:24:6} $stem ${S1_files[$i]}" >> $work_PATH/stem_list.tmp + ((i++)) + done - if [ "${#multiple_slices[@]}" -eq 0 ]; then - if [ "$debug" -ge 1 ]; then - echo - echo Opening SAFE file: - echo $work_PATH/orig/${S1_file[$i]}.SAFE - echo - fi - - cp $work_PATH/orig/${S1_file[$i]}SAFE/manifest.safe $work_PATH/raw/${S1_package:17:8}_manifest.safe - - cd $work_PATH/orig/${S1_file[$i]}SAFE/annotation/ - swath_names=($( ls *.xml )) - - cd $work_PATH/raw/ - - # [FROM STACK processing -> excluded] - # - # In order to correct for Elevation Antenna Pattern Change, cat the manifest and aux files to the xmls - # delete the first line of the manifest file as it's not a typical xml file. - # awk 'NR>1 {print $0}' < ${S1_package:17:8}_manifest.safe > tmp_file - # cat $work_PATH/orig/${S1_file[$i]}.SAFE/annotation/${swath_names[0]} tmp_file $work_PATH/orig/s1a-aux-cal.xml > ./${swath_names[0]} - - swath_counter=1 - for swath_name in ${swath_names[@]}; do - swath_names[$swath_counter]=${swath_name::-4} - ((swath_counter++)) - done - - if [ "$debug" -ge 1 ]; then - echo "SWATH NAME 1: ${swath_names[1]}" - echo "SWATH NAME 2: ${swath_names[2]}" - echo "SWATH NAME 3: ${swath_names[3]}" + sort $work_PATH/stem_list.tmp > $work_PATH/stem_list_sorted.tmp + stems_chrono=($( cat $work_PATH/stem_list_sorted.tmp | awk '{print $2}' )) + files_chrono=($( cat $work_PATH/stem_list_sorted.tmp | awk '{print $3}' )) + + # cd $work_PATH/orig + cd $work_PATH/preprocessing/raw + + i=0 + for stem in ${stems_chrono[@]}; do + + ln -sf $work_PATH/orig/${files_chrono[$i]}/measurement/${stem}.tiff . + ln -sf $work_PATH/orig/${files_chrono[$i]}/annotation/${stem}.xml . + make_s1a_tops ${stem}.xml ${stem}.tiff ${stem} 0 + scene_prefix="${stem:15:8}_${stem:24:6}_F${swath}" + prefixes+=($scene_prefix) + + if [ "$debug" -ge 1 ]; then + echo "Stem $i: $stem" + echo "Timestamp: ${stem:24:6}" + echo "S1 file: ${files_chrono[$i]}"; echo fi - - ln -sf $work_PATH/orig/${S1_file[$i]}SAFE/annotation/*.xml . - ln -sf $work_PATH/orig/${S1_file[$i]}SAFE/measurement/*.tiff . - + ((i++)) + done + + stem=${stems_chrono[0]} + scene_prefix=${prefixes[0]} + if [ $debug -ge 1 ]; then + echo; echo "Main stem: $stem" + echo "Main scene prefix: $scene_prefix" fi + # Obtain radar coordinates for area of interest coordinates (s. config file) - orbit_counter=1 - orbit_match="none" - - for orbit in $orbit_list; do - if [ ! 
-z "$orbit" ] && [ "${orbit:42:8}" != " " ]; then - date_string="${orbit:42:8} ${orbit:51:2}:${orbit:53:2}:${orbit:55:2}" - orbit_startdate=$( date -d "$date_string" '+%s' ) - orbit_starttime=${orbit:34:6} - orbit_sensor=${orbit:0:3} - - if [ "$debug" -eq 2 ]; then - echo "Now working on orbit #: $orbit_counter - $orbit" - echo 'Orbit sensor: ' $orbit_sensor - echo 'Orbit start date: ' $orbit_startdate - echo 'Orbit start time: ' $orbit_starttime - fi - - if [ "$orbit_sensor" == "$target_sensor" ]; then - if [ -z ${prev_orbit_startdate} ] || [ -z ${orbit_startdate} ]; then - echo "Orbit date not configured properly ($prev_orbit_startdate - $orbit_startdate)... Skipping." - prev_orbit=$orbit - prev_orbit_startdate=$orbit_startdate - else - if [ "$target_date" -ge "$prev_orbit_startdate" ] && [ "$target_date" -lt "$orbit_startdate" ]; then - # Looks like we found a matching orbit - # TODO: perform further checks, e.g. end_date overlap - - orbit_match=$prev_orbit - echo "Found matching orbit file: $orbit_match" - ln -sf $orbits_PATH/$orbit_match $work_PATH/raw/ - echo $orbit_match - break - else - # No match again, get prepared for another round - prev_orbit=$orbit - prev_orbit_startdate=$orbit_startdate - fi - - fi + azimuth_1=$( awk 'NR==1' $work_PATH/boundary-box.xyz | SAT_llt2rat ${stem}.PRM 0 | awk '{print $2}' ) + azimuth_2=$( awk 'NR==2' $work_PATH/boundary-box.xyz | SAT_llt2rat ${stem}.PRM 0 | awk '{print $2}' ) + azimuth_3=$( awk 'NR==3' $work_PATH/boundary-box.xyz | SAT_llt2rat ${stem}.PRM 0 | awk '{print $2}' ) + azimuth_4=$( awk 'NR==4' $work_PATH/boundary-box.xyz | SAT_llt2rat ${stem}.PRM 0 | awk '{print $2}' ) + + azimuth_min=$azimuth_1 + if [ $( echo "$azimuth_2 < $azimuth_min" | bc -l ) -eq 1 ]; then azimuth_min=$azimuth_2; fi + if [ $( echo "$azimuth_3 < $azimuth_min" | bc -l ) -eq 1 ]; then azimuth_min=$azimuth_3; fi + if [ $( echo "$azimuth_4 < $azimuth_min" | bc -l ) -eq 1 ]; then azimuth_min=$azimuth_4; fi + + azimuth_max=$azimuth_1 + if [ $( echo "$azimuth_2 > $azimuth_max" | bc -l ) -eq 1 ]; then azimuth_max=$azimuth_2; fi + if [ $( echo "$azimuth_3 > $azimuth_max" | bc -l ) -eq 1 ]; then azimuth_max=$azimuth_3; fi + if [ $( echo "$azimuth_4 > $azimuth_max" | bc -l ) -eq 1 ]; then azimuth_max=$azimuth_4; fi - fi - fi - - - ((orbit_counter++)) + if [ "$debug" -ge 1 ]; then + echo "Minimum azimuth in AOI is $azimuth_min" + echo "Maximum azimuth in AOI is $azimuth_max" + fi + + # Assemble TOPS, omit burst outside AOI + stem_count=${#stems_chrono[@]} + stem_string="" + for((i=0;i<$stem_count;++i)); do + stem_string="$stem_string ${stems_chrono[$i]}" done - - - # If a matching orbit was found -> Prepare files and add to respective data.in files - if [ ! "$orbit_match" == "none" ]; then - for swath in ${swaths_to_process[@]}; do - - # In case multiple slices per date were found, start the merge & cut procedure - if [ "${#multiple_slices[@]}" -ge 1 ]; then - - cd $work_PATH/orig/$current_file/annotation/ - name_stem=$(ls *iw$swath*); name_stem=${name_stem::-4} - cd $work_PATH/orig/$prev_file/annotation/ - prev_name_stem=$(ls *iw$swath*); prev_name_stem=${prev_name_stem::-4} - - if [ ${name_stem:24:6} -gt ${prev_name_stem:24:6} ]; then - stem_1=$name_stem - file_1=$current_file - stem_2=$prev_name_stem - file_2=$prev_file - else - stem_1=$prev_name_stem - file_1=$prev_file - stem_2=$name_stem - file_2=$current_file - fi - - cd $work_PATH/orig - - # Obtain radar coordinates for area of interest coordinates (s. 
config file) - - ln -sf $work_PATH/orig/$file_1/measurement/${stem_1}.tiff . - ln -sf $work_PATH/orig/$file_1/annotation/${stem_1}.xml . - ln -sf $work_PATH/orig/$file_2/measurement/${stem_2}.tiff . - ln -sf $work_PATH/orig/$file_2/annotation/${stem_2}.xml . - - make_s1a_tops ${stem_1}.xml ${stem_1}.tiff ${stem_1} 0 - make_s1a_tops ${stem_2}.xml ${stem_2}.tiff ${stem_2} 0 - - - # Read radar coordinates for AoI - azimuth_1=$( awk 'NR==1' $work_PATH/boundary-box.xyz | SAT_llt2rat ${stem_2}.PRM 0 | awk '{print $2}' ) - azimuth_2=$( awk 'NR==2' $work_PATH/boundary-box.xyz | SAT_llt2rat ${stem_2}.PRM 0 | awk '{print $2}' ) - azimuth_3=$( awk 'NR==3' $work_PATH/boundary-box.xyz | SAT_llt2rat ${stem_2}.PRM 0 | awk '{print $2}' ) - azimuth_4=$( awk 'NR==4' $work_PATH/boundary-box.xyz | SAT_llt2rat ${stem_2}.PRM 0 | awk '{print $2}' ) - - azimuth_min=$azimuth_1 - if [ $( echo "$azimuth_2 < $azimuth_min" | bc -l ) -eq 1 ]; then azimuth_min=$azimuth_2; fi - if [ $( echo "$azimuth_3 < $azimuth_min" | bc -l ) -eq 1 ]; then azimuth_min=$azimuth_3; fi - if [ $( echo "$azimuth_4 < $azimuth_min" | bc -l ) -eq 1 ]; then azimuth_min=$azimuth_4; fi - - azimuth_max=$azimuth_1 - if [ $( echo "$azimuth_2 > $azimuth_max" | bc -l ) -eq 1 ]; then azimuth_max=$azimuth_2; fi - if [ $( echo "$azimuth_3 > $azimuth_max" | bc -l ) -eq 1 ]; then azimuth_max=$azimuth_3; fi - if [ $( echo "$azimuth_4 > $azimuth_max" | bc -l ) -eq 1 ]; then azimuth_max=$azimuth_4; fi - - - if [ "$debug" -ge 1 ]; then - echo "Stem 1: $stem_1" - echo "Stem 2: $stem_2"; echo - echo "current id: ${name_stem:24:6}" - echo "previous id: ${prev_name_stem:24:6}"; echo - echo "Minimum azimuth in AOI is $azimuth_min" - echo "Maximum azimuth in AOI is $azimuth_max" - fi - - # Assemble TOPS, omit burst outside AOI - assemble_tops $azimuth_min $azimuth_max $stem_2 $stem_1 $work_PATH/preprocessing/$stem_2 - - - prefix_1="${stem_1:15:8}_${stem_1:24:6}_F${swath}" - prefix_2="${stem_2:15:8}_${stem_2:24:6}_F${swath}" - - cd $work_PATH/preprocessing/ - - # Generate new PRM files for assembled tops - make_s1a_tops ${stem_2}.xml ${stem_2}.tiff S1_$prefix_2 0 - - # Generate LED files for assembled tops - if [ "$debug" -ge 1 ]; then echo "Executing ext_orb_s1a with option ${stem_2}.PRM $orbit_match ../$prefix_2"; fi - ext_orb_s1a S1_${prefix_2}.PRM $orbits_PATH/$orbit_match S1_$prefix_2 - - # Prepare data in raw folder for subsequent processing steps ... - cd $work_PATH/raw/ - ln -sf $work_PATH/preprocessing/${stem_2}.xml . - ln -sf $work_PATH/preprocessing/${stem_2}.tiff . 
- - fi - - if [ "${#multiple_slices[@]}" -ge 1 ]; then - stem_name=${stem_2} - else - stem_name=${swath_names[$swath]} - fi + if [ $debug -ge 1 ]; then + echo; echo "Executing assemble_tops with parameters:" + echo "$azimuth_min $azimuth_max $stem_string $work_PATH/preprocessing/$stem" + fi + assemble_tops $azimuth_min $azimuth_max $stem_string $work_PATH/preprocessing/$stem - # Write to data_in file - # Check if single_master mode and current scene is master scene - if [ $process_intf_mode = "single_master" ]; then - echo - echo "Target date: ${target_scene:17:8}" - echo "Master scene date: $master_scene_date" - echo - if [ "$master_scene_date" = "${target_scene:17:8}" ]; then - echo "${stem_name}:$orbit_match" >> $work_PATH/raw/data_sm_swath$swath.master - else - echo "${stem_name}:$orbit_match" >> $work_PATH/raw/data_sm_swath$swath.tmp - fi - else - echo "${stem_name:15:8}-${stem_name}:$orbit_match" >> $work_PATH/raw/data_swath$swath.tmp - fi - - done - else - echo "No matching orbit available for date ${target_scene:17:8}. Skipping ..." + cd $work_PATH/preprocessing/ + + # Generate new PRM files for assembled tops + if [ $debug -ge 1 ]; then + echo; echo "Executing make_s1a_tops with parameters:" + echo "${stem}.xml ${stem}.tiff S1_${scene_prefix} 0" + fi + make_s1a_tops ${stem}.xml ${stem}.tiff S1_${scene_prefix} 0 + + # Generate LED files for assembled tops + if [ "$debug" -ge 1 ]; then + echo; echo "Executing ext_orb_s1a with parameter:" + echo "S1_${scene_prefix}.PRM $orbits_PATH/$orbit_match S1_$scene_prefix" fi + ext_orb_s1a S1_${scene_prefix}.PRM $orbits_PATH/$orbit_match S1_$scene_prefix + + # Prepare data in raw folder for subsequent processing steps ... + cd $work_PATH/raw/ + ln -sf $work_PATH/preprocessing/${stem}.xml . + ln -sf $work_PATH/preprocessing/${stem}.tiff . + + + # Write to data_in file + # Check if single_master mode and current scene is master scene + if [ $process_intf_mode = "single_master" ]; then + echo + echo "Target date: ${S1_files[$i]:17:8}" + echo "Master scene date: $master_scene_date" + echo + if [ "$master_scene_date" = "${S1_files[$i]:17:8}" ]; then + echo "${stem}:$orbit_match" >> $work_PATH/raw/data_sm_swath$swath.master + else + echo "${stem}:$orbit_match" >> $work_PATH/raw/data_sm_swath$swath.tmp + fi + else + echo "${stem:15:8}-${stem}:$orbit_match" >> $work_PATH/raw/data_swath$swath.tmp + fi + + done + else + echo "No matching orbit available. Skipping ..." + if [ $debug -ge 1 ]; then + echo "Orbits found:" + echo "${orbit_match[@]}" fi fi - cp -n $work_PATH/orig/$current_file/manifest.safe $work_PATH/raw/${current_file:17:8}_manifest.safe - - prev_file=$current_file - - ((i++)) + done + - done < $filelist_PATH if [ "$clean_up" -ge 1 ]; then rm -r $work_PATH/preprocessing/filelists/ diff --git a/lib/process-pairs.sh b/lib/process-pairs.sh index e86ed83..f762ee9 100755 --- a/lib/process-pairs.sh +++ b/lib/process-pairs.sh @@ -60,29 +60,27 @@ else if [ "$2" = "SM" ]; then data_in_file=data_sm_swath${swath}.in mode="SM" + echo "Processing mode: Single Master" elif [ "$2" = "CMP" ]; then data_in_file=data_swath${swath}.in mode="CMP" + echo "Processing mode: Chronologically Moving Pairs" else echo "No processing mode specified. Processing in 'chronologically moving pairs' mode." 
mode="CMP" fi - if [ $debug -gt 0 ]; then - echo "Data in file: $data_in_file" - fi + echo; echo "Reading scenes and orbits from $data_in_file" while read -r dataline; do - cd $work_PATH/raw/ - echo - echo - echo "Reading scenes and orbits from file data.in" + cd $work_PATH/raw/ ((dataline_count++)) current_scene=${dataline:0:64} current_orbit=${dataline:65:77} - echo "Current scene: $current_scene" - echo "Current orbit: $current_orbit" + echo; echo "Adding data ..." + echo "Scene: $current_scene" + echo "Orbit: $current_orbit" start_processing=1 @@ -129,6 +127,9 @@ else orbit_1=$previous_orbit scene_2=$current_scene orbit_2=$current_orbit + if [ $debug -ge 1 ]; then + echo; echo "Scene 1: ${scene_1:15:8} - Scene 2: ${scene_2:15:8}" + fi if [ "${scene_1:15:8}" -gt "${scene_2:15:8}" ]; then start_processing=0 echo "Scenes ${scene_1:15:8} is equal or greater than ${scene_2:15:8}. Skipping ..." @@ -141,7 +142,7 @@ else echo "$scene_pair_name" >> $work_PATH/pairs-forward.list echo "Creating directory ${scene_pair_name}-aligned" - mkdir -pv $work_PATH/raw/$scene_pair_name-aligned; cd $work_PATH/raw/$scene_pair_name-aligned + mkdir -p $work_PATH/raw/$scene_pair_name-aligned; cd $work_PATH/raw/$scene_pair_name-aligned ln -sf $topo_PATH/dem.grd . ln -sf $work_PATH/raw/${scene_1:15:8}_manifest.safe . ln -sf $work_PATH/raw/${scene_2:15:8}_manifest.safe . diff --git a/lib/s1-file-download.sh b/lib/s1-file-download.sh index 2e9c732..cd9abb1 100644 --- a/lib/s1-file-download.sh +++ b/lib/s1-file-download.sh @@ -115,8 +115,8 @@ if [ "$aoi_ok" -eq 1 ] && [ "$login_ok" -eq 1 ]; then if [ $debug -ge 1 ]; then echo; echo "Files to download:" - for ASF_file in ${ASF_files[@]} ]; do - echo "${ASF_files[@]}"; echo + for ASF_file in ${ASF_files[@]}; do + echo "${ASF_file}"; echo done fi diff --git a/modules/gacos_correction/PP-gacos-correction.sh b/modules/gacos_correction/PP-gacos-correction.sh index 44be3e9..a652afa 100755 --- a/modules/gacos_correction/PP-gacos-correction.sh +++ b/modules/gacos_correction/PP-gacos-correction.sh @@ -127,11 +127,11 @@ gmt grdmath $GACOS_work_PATH/cut_intfs/$intf ${szpddm_file::-4}-cut.grd SUB = $c -if [ -f $UCM_output_PATH/$UCM_file ]; then status_UCM=1; else status_UCM=0; fi +if [ -f $corrected_phase_file ]; then status_GACOS=1; else status_GACOS=0; fi end=`date +%s` runtime=$((end-start)) -echo "${high_corr_file:7:8}-${high_corr_file:30:8} ${corr_file:7:8}-${corr_file:30:8} $SLURM_JOB_ID $runtime $status_UCM" >> $output_PATH/Reports/PP-UCM-stats.tmp +echo "${intf:0:8} ${intf:10:8} $SLURM_JOB_ID $runtime $status_GACOS" >> $output_PATH/Reports/PP-GACOS-stats.tmp printf 'Processing finished in %02dd %02dh:%02dm:%02ds\n' $(($runtime/86400)) $(($runtime%86400/3600)) $(($runtime%3600/60)) $(($runtime%60)) diff --git a/modules/summary_pdf/PP-summary-pdf.sh b/modules/summary_pdf/PP-summary-pdf.sh new file mode 100755 index 0000000..7ce5a61 --- /dev/null +++ b/modules/summary_pdf/PP-summary-pdf.sh @@ -0,0 +1,176 @@ +#!/usr/bin/env bash + +start=`date +%s` + +config_file=$1 +work_PATH=$2 +pair_id=$3 + +echo "Config file: $config_file" +echo "Work path: $work_PATH" +echo "Pair ID: $pair_id" + +source $config_file + +# Convert dataset configuration to arrays +labels=( "$LABEL_1" "$LABEL_2" "$LABEL_3" "$LABEL_4" ) +directories=( "$DIRECTORY_1" "$DIRECTORY_2" "$DIRECTORY_3" "$DIRECTORY_4" ) +histeqs=( "$HIST_EQ_1" "$HIST_EQ_2" "$HIST_EQ_3" "$HIST_EQ_4" ) +cpts=( $CPT_1 $CPT_2 $CPT_3 $CPT_4 ) +ranges=( $RANGE_1 $RANGE_2 $RANGE_3 $RANGE_4 ) +show_suppls=( $SHOW_SUPPL_1 $SHOW_SUPPL_2 
$SHOW_SUPPL_3 $SHOW_SUPPL_4 ) + +dem_grd_hs="$work_PATH/Summary/hillshade.grd" +CPDFS_dem="$work_PATH/Summary/CPDFS_dem.grd" +CPDFS_dem_HS="$work_PATH/Summary/CPDFS_dem_HS.grd" + +ps_base="$work_PATH/Summary/${pair_id}-grd" +histeq_base="$work_PATH/Summary/${pair_id}-hiq" +pdf_merged="$work_PATH/Summary/${pair_id}-combined.pdf" +pdf_merged_ROT90=${pdf_merged::-4}_rot90.png + + +# Set GMT parameters +gmt gmtset MAP_FRAME_PEN 3 +gmt gmtset MAP_FRAME_WIDTH 0.1 +gmt gmtset MAP_FRAME_TYPE plain +gmt gmtset FONT_TITLE Helvetica-Bold +gmt gmtset FONT_LABEL Helvetica-Bold 14p +gmt gmtset PS_PAGE_ORIENTATION landscape +gmt gmtset PS_MEDIA A4 +gmt gmtset FORMAT_GEO_MAP D +gmt gmtset MAP_DEGREE_SYMBOL degree +gmt gmtset PROJ_LENGTH_UNIT cm + + + +if [ ! -f "$pdf_merged_ROT90" ]; then + + GRD_FAIL=( 0 0 0 0 ) + + for counter in 0 1 2 3; do + echo "Preparing ${labels[$counter]} ..." + echo "Searching for files in ${directories[$counter]}" + cpt_files[$counter]="$work_PATH/Summary/grd_${counter}_color.cpt" + cd ${directories[$counter]} + ls_result=$( ls ${pair_id}*.grd ) + echo "Found file $ls_result" + if [ -f $ls_result ]; then + GRD[$counter]="${directories[$counter]}/$ls_result" + echo "${labels[$counter]} file found: ${GRD[$counter]}" + + echo "histeqs $counter : ${histeqs[$counter]}" + + if [ ${histeqs[$counter]} -eq "1" ]; then + if [ ! -f ${histeq_base}-$counter.grd ]; then + echo; echo "Calculate histogram equalization for ${GRD[$counter]}" + gmt grdhisteq ${GRD[$counter]} -G${histeq_base}-$counter.grd -N -V + gmt grd2cpt -E15 ${histeq_base}-$counter.grd -C${cpts[$counter]} -V > $work_PATH/Summary/grd_${counter}.cpt + else + echo; echo "${labels[$counter]} histogram exists, skipping ..."; echo + fi + fi + else + echo "No ${labels[$counter]} file found" + GRD_FAIL[$counter]=1 + GRD_MESSAGE[$counter]="No ${labels[$counter]} file" + fi + done + + + + cd $work_PATH/Summary + + SCALE=18 + XSTEPS=0.5 + YSTEPS=0.5 + + if [ ! -e $pdf_merged ]; then + for counter in 0 1 2 3; do + if [ ! -e ${ps_base}-$counter.ps ]; then + echo; echo "Creating ${labels[$counter]} in ${ps_base}-$counter.ps" + TITLE="${labels[$counter]} {master_date}" + if [ ! 
"${GRD_FAIL[$counter]}" -eq 1 ]; then + if [ ${histeqs[$counter]} -eq 1 ]; then + echo; echo "${labels[$counter]}: ${histeq_base}-$counter.grd"; echo + gmt grdimage ${histeq_base}-$counter.grd \ + -C$work_PATH/Summary/grd_${counter}.cpt -R$AOI_REGION -JM$SCALE -B+t"$TITLE" -Q \ + -Bx$XSTEPS -By$YSTEPS -V -K -Yc -Xc > ${ps_base}-$counter.ps + else + gmt grdimage ${GRD[$counter]} \ + -C${cpt_files[$counter]} -R$AOI_REGION -JM$SCALE -B+t"$TITLE" -Q \ + -Bx$XSTEPS -By$YSTEPS -V -K -Yc -Xc > ${ps_base}-$counter.ps + fi + + if [ ${show_suppls[$counter]} -eq 1 ]; then + for vector_file in ${vector_files[@]}; do + style_name=${vector_file}_style + vector_style=$( echo "${!style_name}" | tr -d "'" ) + gmt psxy $vector_style -JM$SCALE -R$AOI_REGION ${!vector_file::-4}.gmt -O -K -V >> ${ps_base}-$counter.ps + done + fi + else + gmt grdimage $CPDFS_dem_HS -C#ffffff,#eeeeee \ + -R$AOI_REGION -JM$SCALE -B+t"${GRD_MESSAGE[$counter]}" -Q -Bx$XSTEPS -By$YSTEPS -V -K -Yc -Xc > ${ps_base}-$counter.ps + + + if [ $page_orientation -eq 1 ]; then + convert -density $resolution -fill red -pointsize 18 -gravity center \ + -trim -verbose label:"${GRD_MESSAGE[$counter]}" \ + ${ps_base}-$counter.ps -quality 100 ${ps_base}-$counter.ps + else + convert -rotate 90 -density $resolution -fill red -pointsize 18 -gravity center \ + -trim -verbose label:"${GRD_MESSAGE[$counter]}" \ + ${ps_base}-$counter.ps -quality 100 ${ps_base}-$counter.ps + fi + fi + + if [ $page_orientation -eq 1 ]; then + convert -verbose -density $resolution -trim ${ps_base}-$counter.ps -quality 100 ${ps_base}-$counter.png + else + convert -verbose -rotate 90 -density $resolution -trim ${ps_base}-$counter.ps -quality 100 ${ps_base}-$counter.png + fi + else + echo; echo "${labels[$counter]} in ${ps_base}-$counter.ps exists, skipping ..." + fi + done + + + echo "Merging PS into $pdf_merged_ROT90" + take_diff=$(( ($(date --date="$slave_date" +%s) - $(date --date="$master_date" +%s) )/(60*60*24) )) + if [ "$page_orientation" -eq 1 ]; then + montage ${ps_base}-0.png ${ps_base}-1.png ${ps_base}-2.png ${ps_base}-3.png \ + -rotate 90 -geometry +100+150 -density $resolution -title "${pair_id} (${take_diff} days)" \ + -quality 100 -tile 4x1 -mode concatenate -verbose $pdf_merged_ROT90 + else + montage -tile 1x4 -geometry +20+30 \ + ${ps_base}-0.png ${ps_base}-1.png ${ps_base}-2.png ${ps_base}-3.png \ + -title "${pair_id} (${take_diff} days)" \ + -density $resolution -quality 100 -mode concatenate -verbose $pdf_merged_ROT90 + fi + + + if [ "$clean_up" -ge 1 ]; then + rm ${ps_base}_*.ps + rm ${ps_base}_*.png + fi + + if [ "${histeq_base}-$counter.grd" != "$CPDFS_dem_HS" ]; then + rm ${histeq_base}-*.grd + fi + fi +else + echo "File $pdf_merged_ROT90 exists, skipping ..." 
+fi + + +if [ -f $pdf_merged_ROT90 ]; then status_SUMMARY=1; else status_SUMMARY=0; fi + +end=`date +%s` +runtime=$((end-start)) + +echo "${pair_id:0:8} ${pair_id:10:8} $SLURM_JOB_ID $runtime $status_SUMMARY" >> $output_PATH/Reports/PP-SUMMARY-stats.tmp + +printf 'Processing finished in %02dd %02dh:%02dm:%02ds\n' $(($runtime/86400)) $(($runtime%86400/3600)) $(($runtime%3600/60)) $(($runtime%60)) + + diff --git a/modules/summary_pdf/summary_pdf.sh b/modules/summary_pdf/summary_pdf.sh index e97a14a..7b25d78 100755 --- a/modules/summary_pdf/summary_pdf.sh +++ b/modules/summary_pdf/summary_pdf.sh @@ -254,129 +254,152 @@ else slave_date=${intf_pair:10:8} printf "\n \n Now working on results for \n Master date: $master_date \n Slave date: $slave_date \n \n" - PS_BASE="$work_PATH/Summary/${master_date}-${slave_date}-grd" - HISTEQ_BASE="$work_PATH/Summary/${master_date}-${slave_date}-hiq" - PDF_MERGED="$work_PATH/Summary/${master_date}-${slave_date}-combined.pdf" - PDF_MERGED_ROT90=${PDF_MERGED::-4}_rot90.png - - if [ ! -f "$PDF_MERGED_ROT90" ]; then - - GRD_FAIL=( 0 0 0 0 ) - - for counter in 0 1 2 3; do - echo "Preparing ${LABELS[$counter]} ..." - # echo "${DIRECTORIES[$counter]}" - cd ${DIRECTORIES[$counter]} - ls_result=$( ls ${intf_pair}*.grd ) - if [ -f $ls_result ]; then - GRD[$counter]="${DIRECTORIES[$counter]}/$ls_result" - echo "${LABELS[$counter]} file found: ${GRD[$counter]}" - - echo "HISTEQS $counter : ${HISTEQS[$counter]}" - - if [ ${HISTEQS[$counter]} -eq "1" ]; then - if [ ! -f ${HISTEQ_BASE}-$counter.grd ]; then - echo; echo "Calculate histogram equalization for ${GRD[$counter]}" - gmt grdhisteq ${GRD[$counter]} -G${HISTEQ_BASE}-$counter.grd -N -V - gmt grd2cpt -E15 ${HISTEQ_BASE}-$counter.grd -C${CPTS[$counter]} -V > $work_PATH/Summary/grd_${counter}.cpt - else - echo; echo "${LABELS[$counter]} histogram exists, skipping ..."; echo - fi - fi - else - echo "No ${LABELS[$counter]} file found" - GRD_FAIL[$counter]=1 - GRD_MESSAGE[$counter]="No ${LABELS[$counter]} file" - fi - done + if [ "$activate_PP" -eq 1 ]; then + echo; echo "Starting parallel processing job for Summary PDF of:" + echo "$intf_pair" + + slurm_jobname="$slurm_jobname_prefix-GACOS" + + sbatch \ + --output=$log_PATH/summary-%j.log \ + --error=$log_PATH/summary-%j.log \ + --workdir=$work_PATH/Summary \ + --job-name=$slurm_jobname \ + --qos=$slurm_qos \ + --account=$slurm_account \ + --partition=$slurm_partition \ + --mail-type=$slurm_mailtype \ + $OSARIS_PATH/modules/summary_pdf/PP-summary-pdf.sh \ + ${module_config_PATH}/${module_name}.config \ + $work_PATH \ + $intf_pair + else + PS_BASE="$work_PATH/Summary/${master_date}-${slave_date}-grd" + HISTEQ_BASE="$work_PATH/Summary/${master_date}-${slave_date}-hiq" + PDF_MERGED="$work_PATH/Summary/${master_date}-${slave_date}-combined.pdf" + PDF_MERGED_ROT90=${PDF_MERGED::-4}_rot90.png - cd $work_PATH/Summary - - SCALE=18 - XSTEPS=0.5 - YSTEPS=0.5 - - if [ ! -e $PDF_MERGED ]; then - for counter in 0 1 2 3; do - if [ ! -e ${PS_BASE}-$counter.ps ]; then - echo; echo "Creating ${LABELS[$counter]} in ${PS_BASE}-$counter.ps" - TITLE="${LABELS[$counter]} {master_date}" - if [ ! "${GRD_FAIL[$counter]}" -eq 1 ]; then - if [ ${HISTEQS[$counter]} -eq 1 ]; then - echo; echo "${LABELS[$counter]}: ${HISTEQ_BASE}-$counter.grd"; echo - gmt grdimage ${HISTEQ_BASE}-$counter.grd \ - -C$work_PATH/Summary/grd_${counter}.cpt -R$AOI_REGION -JM$SCALE -B+t"$TITLE" -Q \ - -Bx$XSTEPS -By$YSTEPS -V -K -Yc -Xc > ${PS_BASE}-$counter.ps + if [ ! 
-f "$PDF_MERGED_ROT90" ]; then + + GRD_FAIL=( 0 0 0 0 ) + + for counter in 0 1 2 3; do + echo; echo "Preparing ${LABELS[$counter]} ..." + + cd ${DIRECTORIES[$counter]} + ls_result=$( ls ${intf_pair}*.grd ) + if [ -f $ls_result ]; then + GRD[$counter]="${DIRECTORIES[$counter]}/$ls_result" + echo "${LABELS[$counter]} file found: ${GRD[$counter]}" + + echo "HISTEQS $counter : ${HISTEQS[$counter]}" + + if [ ${HISTEQS[$counter]} -eq "1" ]; then + if [ ! -f ${HISTEQ_BASE}-$counter.grd ]; then + echo; echo "Calculate histogram equalization for ${GRD[$counter]}" + gmt grdhisteq ${GRD[$counter]} -G${HISTEQ_BASE}-$counter.grd -N -V + gmt grd2cpt -E15 ${HISTEQ_BASE}-$counter.grd -C${CPTS[$counter]} -V > $work_PATH/Summary/grd_${counter}.cpt else - gmt grdimage ${GRD[$counter]} \ - -C${CPT_FILES[$counter]} -R$AOI_REGION -JM$SCALE -B+t"$TITLE" -Q \ - -Bx$XSTEPS -By$YSTEPS -V -K -Yc -Xc > ${PS_BASE}-$counter.ps + echo; echo "${LABELS[$counter]} histogram exists, skipping ..."; echo fi - - if [ ${SHOW_SUPPLS[$counter]} -eq 1 ]; then - for vector_file in ${vector_files[@]}; do - style_name=${vector_file}_style - vector_style=$( echo "${!style_name}" | tr -d "'" ) - gmt psxy $vector_style -JM$SCALE -R$AOI_REGION ${!vector_file::-4}.gmt -O -K -V >> ${PS_BASE}-$counter.ps - done + fi + else + echo "No ${LABELS[$counter]} file found" + GRD_FAIL[$counter]=1 + GRD_MESSAGE[$counter]="No ${LABELS[$counter]} file" + fi + done + + + + cd $work_PATH/Summary + + SCALE=18 + XSTEPS=0.5 + YSTEPS=0.5 + + if [ ! -e $PDF_MERGED ]; then + for counter in 0 1 2 3; do + if [ ! -e ${PS_BASE}-$counter.ps ]; then + echo; echo "Creating ${LABELS[$counter]} in ${PS_BASE}-$counter.ps" + TITLE="${LABELS[$counter]} {master_date}" + if [ ! "${GRD_FAIL[$counter]}" -eq 1 ]; then + if [ ${HISTEQS[$counter]} -eq 1 ]; then + echo; echo "${LABELS[$counter]}: ${HISTEQ_BASE}-$counter.grd"; echo + gmt grdimage ${HISTEQ_BASE}-$counter.grd \ + -C$work_PATH/Summary/grd_${counter}.cpt -R$AOI_REGION -JM$SCALE -B+t"$TITLE" -Q \ + -Bx$XSTEPS -By$YSTEPS -V -K -Yc -Xc > ${PS_BASE}-$counter.ps + else + gmt grdimage ${GRD[$counter]} \ + -C${CPT_FILES[$counter]} -R$AOI_REGION -JM$SCALE -B+t"$TITLE" -Q \ + -Bx$XSTEPS -By$YSTEPS -V -K -Yc -Xc > ${PS_BASE}-$counter.ps + fi + + if [ ${SHOW_SUPPLS[$counter]} -eq 1 ]; then + for vector_file in ${vector_files[@]}; do + style_name=${vector_file}_style + vector_style=$( echo "${!style_name}" | tr -d "'" ) + gmt psxy $vector_style -JM$SCALE -R$AOI_REGION ${!vector_file::-4}.gmt -O -K -V >> ${PS_BASE}-$counter.ps + done + fi + else + gmt grdimage $CPDFS_dem_HS -C#ffffff,#eeeeee \ + -R$AOI_REGION -JM$SCALE -B+t"${GRD_MESSAGE[$counter]}" -Q -Bx$XSTEPS -By$YSTEPS -V -K -Yc -Xc > ${PS_BASE}-$counter.ps + + + if [ $page_orientation -eq 1 ]; then + convert -density $resolution -fill red -pointsize 18 -gravity center \ + -trim -verbose label:"${GRD_MESSAGE[$counter]}" \ + ${PS_BASE}-$counter.ps -quality 100 ${PS_BASE}-$counter.ps + else + convert -rotate 90 -density $resolution -fill red -pointsize 18 -gravity center \ + -trim -verbose label:"${GRD_MESSAGE[$counter]}" \ + ${PS_BASE}-$counter.ps -quality 100 ${PS_BASE}-$counter.ps + fi fi - else - gmt grdimage $CPDFS_dem_HS -C#ffffff,#eeeeee \ - -R$AOI_REGION -JM$SCALE -B+t"${GRD_MESSAGE[$counter]}" -Q -Bx$XSTEPS -By$YSTEPS -V -K -Yc -Xc > ${PS_BASE}-$counter.ps - if [ $page_orientation -eq 1 ]; then - convert -density $resolution -fill red -pointsize 18 -gravity center \ - -trim -verbose label:"${GRD_MESSAGE[$counter]}" \ - ${PS_BASE}-$counter.ps -quality 100 
${PS_BASE}-$counter.ps + convert -verbose -density $resolution -trim ${PS_BASE}-$counter.ps -quality 100 ${PS_BASE}-$counter.png else - convert -rotate 90 -density $resolution -fill red -pointsize 18 -gravity center \ - -trim -verbose label:"${GRD_MESSAGE[$counter]}" \ - ${PS_BASE}-$counter.ps -quality 100 ${PS_BASE}-$counter.ps + convert -verbose -rotate 90 -density $resolution -trim ${PS_BASE}-$counter.ps -quality 100 ${PS_BASE}-$counter.png fi - fi - - if [ $page_orientation -eq 1 ]; then - convert -verbose -density $resolution -trim ${PS_BASE}-$counter.ps -quality 100 ${PS_BASE}-$counter.png else - convert -verbose -rotate 90 -density $resolution -trim ${PS_BASE}-$counter.ps -quality 100 ${PS_BASE}-$counter.png - fi - else - echo; echo "${LABELS[$counter]} in ${PS_BASE}-$counter.ps exists, skipping ..." - fi - done + echo; echo "${LABELS[$counter]} in ${PS_BASE}-$counter.ps exists, skipping ..." + fi + done - - echo "Merging PS into $PDF_MERGED_ROT90" - take_diff=$(( ($(date --date="$slave_date" +%s) - $(date --date="$master_date" +%s) )/(60*60*24) )) - if [ "$page_orientation" -eq 1 ]; then - montage ${PS_BASE}-0.png ${PS_BASE}-1.png ${PS_BASE}-2.png ${PS_BASE}-3.png \ - -rotate 90 -geometry +100+150 -density $resolution -title "${master_date}-${slave_date} (${take_diff} days)" \ - -quality 100 -tile 4x1 -mode concatenate -verbose $PDF_MERGED_ROT90 - else - montage -tile 1x4 -geometry +20+30 \ - ${PS_BASE}-0.png ${PS_BASE}-1.png ${PS_BASE}-2.png ${PS_BASE}-3.png \ - -title "${master_date}-${slave_date} (${take_diff} days)" \ - -density $resolution -quality 100 -mode concatenate -verbose $PDF_MERGED_ROT90 - fi + + echo "Merging PS into $PDF_MERGED_ROT90" + take_diff=$(( ($(date --date="$slave_date" +%s) - $(date --date="$master_date" +%s) )/(60*60*24) )) + if [ "$page_orientation" -eq 1 ]; then + montage ${PS_BASE}-0.png ${PS_BASE}-1.png ${PS_BASE}-2.png ${PS_BASE}-3.png \ + -rotate 90 -geometry +100+150 -density $resolution -title "${master_date}-${slave_date} (${take_diff} days)" \ + -quality 100 -tile 4x1 -mode concatenate -verbose $PDF_MERGED_ROT90 + else + montage -tile 1x4 -geometry +20+30 \ + ${PS_BASE}-0.png ${PS_BASE}-1.png ${PS_BASE}-2.png ${PS_BASE}-3.png \ + -title "${master_date}-${slave_date} (${take_diff} days)" \ + -density $resolution -quality 100 -mode concatenate -verbose $PDF_MERGED_ROT90 + fi - if [ "$clean_up" -ge 1 ]; then - rm ${PS_BASE}_*.ps - rm ${PS_BASE}_*.png - fi - - if [ "${HISTEQ_BASE}-$counter.grd" != "$CPDFS_dem_HS" ]; then - rm ${HISTEQ_BASE}-*.grd + if [ "$clean_up" -ge 1 ]; then + rm ${PS_BASE}_*.ps + rm ${PS_BASE}_*.png + fi + + if [ "${HISTEQ_BASE}-$counter.grd" != "$CPDFS_dem_HS" ]; then + rm ${HISTEQ_BASE}-*.grd + fi fi + else + echo "File $PDF_MERGED_ROT90 exists, skipping ..." fi - else - echo "File $PDF_MERGED_ROT90 exists, skipping ..." + + ((CPDFS_count+1)) fi - - ((CPDFS_count+1)) done diff --git a/osaris.sh b/osaris.sh index e425cf3..89a919f 100755 --- a/osaris.sh +++ b/osaris.sh @@ -15,7 +15,7 @@ else echo echo " ╔══════════════════════════════════════════╗" echo " ║ ║" - echo " ║ OSARIS v. 0.7 ║" + echo " ║ OSARIS v. 
0.7.2 ║" echo " ║ Open Source SAR Investigation System ║" echo " ║ ║" echo " ╚══════════════════════════════════════════╝" @@ -135,7 +135,7 @@ else input_PATH=$base_PATH/$prefix/Input/ mkdir -p $input_PATH - source $OSARIS_PATH/lib/s1-file-download.sh 2>&1 >>$logfile + source $OSARIS_PATH/lib/s1-file-download.sh 2>&1 >>$log_PATH/downloads.log echo; echo Downloading finished; echo - - - - - - - - - - - - - - - -; echo else @@ -185,7 +185,7 @@ else echo; echo - - - - - - - - - - - - - - - -; echo "Updating orbit data ..."; echo if [ "$orbit_provider" = "ESA" ]; then - source $OSARIS_PATH/lib/s1-orbit-download.sh $orbits_PATH 5 &>>$logfile + source $OSARIS_PATH/lib/s1-orbit-download.sh $orbits_PATH 5 &>>$log_PATH/downloads.log elif [ "$orbit_provider" = "ASF" ]; then if [ -z "$ASF_username" ] || [ -z "$ASF_password" ]; then echo; echo "ERROR: Missing ASF login credentials." diff --git a/templates/module-config/summary_pdf.config.template b/templates/module-config/summary_pdf.config.template index eda3b20..d3f63df 100644 --- a/templates/module-config/summary_pdf.config.template +++ b/templates/module-config/summary_pdf.config.template @@ -19,6 +19,10 @@ resolution=300 # Image resolution in dpi. # Default is 300 (print quality). +activate_PP=0 +# Activate Slurm-based parallel processing. +# Caution: This requires ImageMagick to be installed on the processing nodes. + # Datasets to display
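A minimal usage sketch for the new option, assuming the module workflow described in the README portion of this diff; `post_processing_mods`, the template path, and `activate_PP` are taken from the changes above, while the destination file name and the `sed` call are illustrative only:

```sh
# In the main run config: enable the Summary PDF module at the post-processing hook
# (see the 'Modules' section of the README).
post_processing_mods=( summary_pdf )

# In the shell: copy the module config template into the config directory and switch on
# Slurm-based parallel processing. ImageMagick must then be installed on the processing
# nodes (see the caution note added to the template above).
cp templates/module-config/summary_pdf.config.template config/summary_pdf.config
sed -i 's/^activate_PP=0/activate_PP=1/' config/summary_pdf.config
```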