Init files #54

Merged
merged 110 commits into from
Jul 24, 2023
cd0a2b0
adding functionality to add additional seeding on resumes
shauntruelove May 4, 2023
991a3fd
fix seeding
shauntruelove May 4, 2023
de2d64c
fix seeding
shauntruelove May 4, 2023
5ad4c54
Merge branch 'seeding_fix' of https://github.com/HopkinsIDD/flepiMoP …
shauntruelove May 4, 2023
ed80f86
fixed dumb seeding error
shauntruelove May 5, 2023
b9ebfa5
fixing seeding
shauntruelove May 5, 2023
b8ab6b1
new seeding builder
shauntruelove May 5, 2023
7490b0b
more resilient to different shells (zsh, bash)
jcblemai May 8, 2023
2ee20b1
local flag to print all the variables to reproduce failed runs
jcblemai May 8, 2023
8cae554
fix seeding
shauntruelove May 10, 2023
5261542
fix typo in model output directory and limit hosp file
saraloo May 10, 2023
06c5937
Merge branch 'seeding_fix' of https://github.com/HopkinsIDD/flepiMoP …
saraloo May 10, 2023
8b3a2ac
add options in config to fix either resumed seeding or added seeding
shauntruelove May 17, 2023
f652db1
Merge branch 'seeding_fix' of https://github.com/HopkinsIDD/flepiMoP …
shauntruelove May 17, 2023
9bd4a56
fix parentheses typo added seeding
saraloo May 17, 2023
b57ccc8
fix seeding in config writer
shauntruelove May 17, 2023
2659f18
Merge branch 'seeding_fix' of https://github.com/HopkinsIDD/flepiMoP …
shauntruelove May 17, 2023
c9d427e
continuation envar
jcblemai May 18, 2023
db7d8ac
continuation default to resume
jcblemai May 18, 2023
7964f48
downloading continuation resume files
jcblemai May 18, 2023
91293a1
skip initial spaces for hand-made csv
jcblemai May 22, 2023
96d2a22
fix
jcblemai May 22, 2023
f4bd654
single files for initial conditions and seeding
jcblemai May 23, 2023
7aabf78
black
jcblemai May 23, 2023
ace9ff4
fix
jcblemai May 23, 2023
fec65b6
removed deprecated files that used to print the results
jcblemai May 24, 2023
f4e9bd6
fix seeding issue if not using seeding
shauntruelove May 25, 2023
6cfa8de
fix seeding requirements in building data
shauntruelove May 25, 2023
19c1c25
fix another seeding issue
shauntruelove May 26, 2023
e821873
remove duplicate section
jcblemai May 26, 2023
fc66728
detect if seeding is in the config
jcblemai May 26, 2023
bb145d8
update check
jcblemai May 26, 2023
980949e
manual merge of seeding_fix
jcblemai May 26, 2023
afc28f2
mispushed
jcblemai May 26, 2023
64bc7da
still trying to merge
jcblemai May 26, 2023
b2def6b
fix typos
jcblemai May 26, 2023
c95a947
fix bug
jcblemai May 26, 2023
2036e75
sane error message
jcblemai May 26, 2023
9569b1c
inference slots calls the preprocessing of init files
jcblemai May 26, 2023
ef1ba3b
adding init building file
shauntruelove May 26, 2023
9be6539
Merge remote-tracking branch 'origin/seeding_fix' into init_files
jcblemai May 26, 2023
b423b00
syncing the init script to bash
jcblemai May 26, 2023
cc8692a
fix max date issue
shauntruelove May 26, 2023
be26ef7
Merge remote-tracking branch 'origin/seeding_fix' into init_files
jcblemai May 26, 2023
572b5f0
fix so it runs
jcblemai May 26, 2023
7472f2b
fix type error, comparing strs to strs
jcblemai May 26, 2023
118aba3
initial condition from file now working, with their two flags
jcblemai May 26, 2023
0b25056
typo
jcblemai May 26, 2023
129413d
fixed init file to include all compartments at 0 if missing
shauntruelove May 26, 2023
962cf59
more fixes
jcblemai May 26, 2023
f6dcb82
Merge remote-tracking branch 'origin/seeding_fix' into init_files
jcblemai May 26, 2023
5486c32
glue code
jcblemai May 27, 2023
63341e6
made the immune ladder use envar
jcblemai May 27, 2023
551eb67
hopefully not too badly done: init files moving (this is a mess)
jcblemai May 27, 2023
b92cab7
⚠️ hack to get R17 going
jcblemai May 27, 2023
1a5d6f9
message for continuation
jcblemai May 27, 2023
749c7ef
making initial condition folder draw safer: loads from each mc instea…
jcblemai May 30, 2023
797cb0d
a better check philosophy
jcblemai May 30, 2023
10fb91e
fixed init script
shauntruelove May 30, 2023
884f031
popnode check
jcblemai May 30, 2023
1ff19d0
Merge remote-tracking branch 'origin/seeding_fix' into init_files
jcblemai May 30, 2023
0f74180
fix vacc in init
shauntruelove May 30, 2023
f0f69a8
merge
jcblemai May 30, 2023
665e833
zsh safe
jcblemai May 30, 2023
53dcabc
disabling population check for R17
jcblemai May 30, 2023
3b07d62
new message
jcblemai May 30, 2023
6bd5e4f
fix deprecation warning (and actual error in the docker)
jcblemai May 31, 2023
da0b34d
Merge branch 'init_files' of https://github.com/HopkinsIDD/flepiMoP i…
jcblemai May 31, 2023
6b4e63e
prune by llik
jcblemai May 31, 2023
f83f1be
fix
jcblemai Jun 2, 2023
23373a7
fix so flepi send even when the llik fails + fix timezone for filter
jcblemai Jun 2, 2023
a23296e
allow init script
shauntruelove Jun 3, 2023
22eac07
try fix for FCH : resume works without init
jcblemai Jun 5, 2023
dda251f
more fixes
jcblemai Jun 5, 2023
9694b8a
build conda env for rockfish
jcblemai Jun 5, 2023
50cca31
fix env
jcblemai Jun 5, 2023
a9dab13
fix: proposals were rejected on the basis of the probability density >…
jcblemai Jun 6, 2023
182c902
logfiles: better naming convention and making sure it also contains f…
jcblemai Jun 6, 2023
ff7dd23
fix: folder creation
jcblemai Jun 6, 2023
e44590a
fix: forgot namespace
jcblemai Jun 6, 2023
192ee0d
Adding pre-processing and init scripts
shauntruelove Jun 6, 2023
3b4895a
temporarily turned on init script and updated path
shauntruelove Jun 6, 2023
3559e98
fix post-processing scripts
jcblemai Jun 7, 2023
1a3fbe2
Merge branch 'init_files' of https://github.com/HopkinsIDD/flepiMoP i…
jcblemai Jun 7, 2023
7777de6
made postprocessing call explicit so it can be redone from logs
jcblemai Jun 7, 2023
9b5c1c6
send postprocessing plots to s3 and fs_results path
jcblemai Jun 7, 2023
696781b
add top and bottom 5 llik plots and cumulatives to snapshot
saraloo Jun 8, 2023
2594444
add preOm no delta init
shauntruelove Jun 8, 2023
d48d42c
Merge branch 'init_files' of https://github.com/HopkinsIDD/flepiMoP i…
shauntruelove Jun 8, 2023
61d5c16
cleaned for sara
jcblemai Jun 8, 2023
256b6b5
merge
jcblemai Jun 8, 2023
8b8af8f
remove dependency
jcblemai Jun 8, 2023
f60a5dd
make snpi plots bigger in snapshot
saraloo Jun 9, 2023
babaf3c
Merge branch 'init_files' of https://github.com/HopkinsIDD/flepiMoP i…
saraloo Jun 9, 2023
3c8e999
TO CHECK: does not perturb the first iteration so as to not lose good r…
jcblemai Jun 9, 2023
624e8db
merge
jcblemai Jun 9, 2023
b97516d
fix post-processing uploaded
jcblemai Jun 9, 2023
174e6b4
add back hpar to snapshot
saraloo Jun 9, 2023
0c9477c
Merge branch 'init_files' of https://github.com/HopkinsIDD/flepiMoP i…
saraloo Jun 9, 2023
45ea23f
removed the init file script
jcblemai Jun 9, 2023
51b18d5
comment out extra data pull
shauntruelove Jun 13, 2023
6d9e18b
Merge branch 'init_files' of https://github.com/HopkinsIDD/flepiMoP i…
shauntruelove Jun 13, 2023
8c3ba6b
fix config.writer::seir_chunk glue::collapse back to paste
saraloo Jun 21, 2023
a528c5c
better functions for debug
jcblemai Jun 27, 2023
dc84998
fix variant error in buildcoviddata
shauntruelove Jul 10, 2023
f981cef
fixing Delphi API key. adding input options
shauntruelove Jul 13, 2023
5b8aec1
add Delphi messages
shauntruelove Jul 13, 2023
7884e2d
deleting problem line
shauntruelove Jul 13, 2023
74ed274
remove error line
shauntruelove Jul 13, 2023
514ef1d
Merge branch 'main-fixvarerr' into init_files
shauntruelove Jul 17, 2023
70 changes: 47 additions & 23 deletions batch/SLURM_inference_job.run
@@ -28,34 +28,37 @@ which Rscript
export PATH=~/aws-cli/bin:$PATH
echo "***************** DONE LOADING ENVIRONMENT *****************"

# If running from zsh, this ensures compatibility of using space-separated words as a bash array
setopt shwordsplit


echo "***************** FETCHING RESUME FILES *****************"
### In case of resume, download or move the right files
export LAST_JOB_OUTPUT=$(echo $LAST_JOB_OUTPUT | sed 's/\/$//')
if [ -n "$LAST_JOB_OUTPUT" ]; then # -n Checks if the length of a string is nonzero --> if LAST_JOB_OUTPUT is not empty, then we download the output from the last job
if [ $FLEPI_BLOCK_INDEX -eq 1 ]; then # always true for slurm submissions
if [[ -n "$LAST_JOB_OUTPUT" ]]; then # -n Checks if the length of a string is nonzero --> if LAST_JOB_OUTPUT is not empty, then we download the output from the last job
if [[ $FLEPI_BLOCK_INDEX -eq 1 ]]; then # always true for slurm submissions
export RESUME_RUN_INDEX=$OLD_FLEPI_RUN_INDEX
echo "RESUME_DISCARD_SEEDING is set to $RESUME_DISCARD_SEEDING"
if [ $RESUME_DISCARD_SEEDING == "true" ]; then
export PARQUET_TYPES="spar snpi hpar hnpi"
if [[ $RESUME_DISCARD_SEEDING == "true" ]]; then
export PARQUET_TYPES="spar snpi hpar hnpi init"
else
export PARQUET_TYPES="seed spar snpi hpar hnpi"
export PARQUET_TYPES="seed spar snpi hpar hnpi init"
fi
else # if we are not in the first block, we need to resume from the last job, with seeding and all.
export RESUME_RUN_INDEX=$FLEPI_RUN_INDEX
export PARQUET_TYPES="seed spar snpi seir hpar hnpi hosp llik"
export PARQUET_TYPES="seed spar snpi seir hpar hnpi hosp llik init"
fi
for filetype in $PARQUET_TYPES
do
if [ $filetype == "seed" ]; then
if [[ $filetype == "seed" ]]; then
export extension="csv"
else
export extension="parquet"
fi
for liketype in "global" "chimeric"
do
export OUT_FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/$liketype/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX-1,'$filetype','$extension'))")
if [ $FLEPI_BLOCK_INDEX -eq 1 ]; then
if [[ $FLEPI_BLOCK_INDEX -eq 1 ]]; then
export IN_FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$RESUME_RUN_INDEX','$FLEPI_PREFIX/$RESUME_RUN_INDEX/$liketype/final/',$FLEPI_SLOT_INDEX,'$filetype','$extension'))")
else
export IN_FILENAME=$OUT_FILENAME
@@ -69,18 +72,38 @@ if [ -n "$LAST_JOB_OUTPUT" ]; then # -n Checks if the length of a string is non
mkdir -p $OUT_FILENAME_DIR
cp $LAST_JOB_OUTPUT/$IN_FILENAME $OUT_FILENAME
fi
if [ -f $OUT_FILENAME ]; then
if [[ -f $OUT_FILENAME ]]; then
echo "Copy successful for file of type $filetype ($IN_FILENAME -> $OUT_FILENAME)"
else
echo "Could not copy file of type $filetype ($IN_FILENAME -> $OUT_FILENAME)"
if [[ $liketype == "global" ]]; then
exit 2
fi
fi
done
done
ls -ltr model_output
fi

if [[ $FLEPI_CONTINUATION == "TRUE" ]]; then
echo "We are doing a continuation"
export INIT_FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX-1,'$FLEPI_CONTINUATION_FTYPE','$extension'))")
# in filename is always a seir file
export IN_FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_CONTINUATION_RUN_ID','$FLEPI_PREFIX/$FLEPI_CONTINUATION_RUN_ID/global/final/',$FLEPI_SLOT_INDEX,'seir','$extension'))")
if [[ $FLEPI_CONTINUATION_LOCATION == *"s3://"* ]]; then
aws s3 cp --quiet $FLEPI_CONTINUATION_LOCATION/$IN_FILENAME $INIT_FILENAME
else
# cp does not create directories, so we make the directories first
export OUT_FILENAME_DIR="$(dirname "${INIT_FILENAME}")"
mkdir -p $OUT_FILENAME_DIR
cp $FLEPI_CONTINUATION_LOCATION/$IN_FILENAME $INIT_FILENAME
fi
if [[ -f $INIT_FILENAME ]]; then
echo "CONTINUATION: Copy successful for file of type $filetype ($IN_FILENAME -> $INIT_FILENAME)"
else
echo "CONTINUATION: Could not copy file of type $filetype ($IN_FILENAME -> $INIT_FILENAME)"
fi
#Rscript $FLEPI_PATH/flepimop/main_scripts/seir_init_immuneladder.R --res_config config_SMH_R17_noBoo_lowIE_phase2_blk2.yml
#Rscript $FLEPI_PATH/preprocessing/seir_init_immuneladder_r17phase3_preOm.R --res_config config_SMH_R17_noBoo_lowIE_phase2_blk2.yml
fi

ls -ltr model_output
echo "***************** DONE FETCHING RESUME FILES *****************"

echo "***************** RUNNING INFERENCE_MAIN.R *****************"
@@ -100,20 +123,20 @@ echo "Rscript $FLEPI_PATH/flepimop/main_scripts/inference_slot.R --config $CONFI
--python python
--rpath Rscript
--is-resume $RESUME_RUN # Is this run a resume
--is-interactive FALSE # Is this run an interactive run" > $LOG_FILE 2>&1 &
--is-interactive FALSE # Is this run an interactive run" #> $LOG_FILE 2>&1 &

Rscript $FLEPI_PATH/flepimop/main_scripts/inference_slot.R -p $FLEPI_PATH --config $CONFIG_PATH --run_id $FLEPI_RUN_INDEX --npi_scenarios $FLEPI_NPI_SCENARIOS --outcome_scenarios $FLEPI_OUTCOME_SCENARIOS --jobs 1 --iterations_per_slot $FLEPI_ITERATIONS_PER_SLOT --this_slot $FLEPI_SLOT_INDEX --this_block 1 --stoch_traj_flag $FLEPI_STOCHASTIC_RUN --is-resume $RESUME_RUN --is-interactive FALSE > $LOG_FILE 2>&1
Rscript $FLEPI_PATH/flepimop/main_scripts/inference_slot.R -p $FLEPI_PATH --config $CONFIG_PATH --run_id $FLEPI_RUN_INDEX --npi_scenarios $FLEPI_NPI_SCENARIOS --outcome_scenarios $FLEPI_OUTCOME_SCENARIOS --jobs 1 --iterations_per_slot $FLEPI_ITERATIONS_PER_SLOT --this_slot $FLEPI_SLOT_INDEX --this_block 1 --stoch_traj_flag $FLEPI_STOCHASTIC_RUN --is-resume $RESUME_RUN --is-interactive FALSE #> $LOG_FILE 2>&1
dvc_ret=$?
if [ $dvc_ret -ne 0 ]; then
if [[ $dvc_ret -ne 0 ]]; then
echo "Error code returned from inference_slot.R: $dvc_ret"
fi
echo "***************** DONE RUNNING INFERENCE_SLOT.R *****************"


echo "***************** UPLOADING RESULT TO S3 (OR NOT) *****************"
## copy to s3 if necessary:
if [ $S3_UPLOAD == "true" ]; then
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar"
if [[ $S3_UPLOAD == "true" ]]; then
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "init"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/chimeric/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX,'$type','parquet'))")
aws s3 cp --quiet $FILENAME $S3_RESULTS_PATH/$FILENAME
@@ -128,12 +151,12 @@ if [ $S3_UPLOAD == "true" ]; then
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX,'$type','csv'))")
aws s3 cp --quiet $FILENAME $S3_RESULTS_PATH/$FILENAME
done
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof" "init"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX,'$type','parquet'))")
aws s3 cp --quiet $FILENAME $S3_RESULTS_PATH/$FILENAME
done
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof" "init"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/final/', $FLEPI_SLOT_INDEX,'$type','parquet'))")
aws s3 cp --quiet $FILENAME $S3_RESULTS_PATH/$FILENAME
Expand All @@ -149,7 +172,7 @@ echo "***************** DONE UPLOADING RESULT TO S3 (OR NOT) *****************"

# TODO: MV here ? what to do about integration_dump.pkl e.g ?
echo "***************** COPYING RESULTS TO RESULT DIRECTORY *****************"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "init"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/chimeric/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX,'$type','parquet'))")
export OUT_FILENAME_DIR="$(dirname "${FS_RESULTS_PATH}/${FILENAME}")"
@@ -170,14 +193,14 @@ do
mkdir -p $OUT_FILENAME_DIR
cp --parents $FILENAME $FS_RESULTS_PATH
done
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof" "init"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX,'$type','parquet'))")
export OUT_FILENAME_DIR="$(dirname "${FS_RESULTS_PATH}/${FILENAME}")"
mkdir -p $OUT_FILENAME_DIR
cp --parents $FILENAME $FS_RESULTS_PATH
done
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof" "init"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/final/', $FLEPI_SLOT_INDEX,'$type','parquet'))")
export OUT_FILENAME_DIR="$(dirname "${FS_RESULTS_PATH}/${FILENAME}")"
@@ -204,5 +227,6 @@ echo "DONE EVERYTHING."
# --> THIS DOES NOT WORK
#mv slurm-$SLURM_ARRAY_JOB_ID_${SLURM_ARRAY_TASK_ID}.out $FS_RESULTS_PATH/slurm-$SLURM_ARRAY_JOB_ID_${SLURM_ARRAY_TASK_ID}.out

unsetopt shwordsplit

wait
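The resume-fetch loop in `SLURM_inference_job.run` above picks a file extension per output type: seeding tables are CSVs, every other intermediate output is parquet. A standalone sketch of just that selection logic (the type list is the one from the script; the `echo` stands in for the real gempyor filename construction and copy):

```shell
# Illustrative sketch of the extension-selection logic in the resume loop.
# Seeding ("seed") files are CSVs; all other output types are parquet.
PARQUET_TYPES="seed spar snpi hpar hnpi init"
for filetype in $PARQUET_TYPES; do
  if [[ $filetype == "seed" ]]; then
    extension="csv"
  else
    extension="parquet"
  fi
  # In the real script, this is where the gempyor-built IN/OUT paths are copied.
  echo "$filetype.$extension"
done
```

Running this prints `seed.csv` followed by the remaining types with a `.parquet` extension, matching what the copy loop expects to find in `$LAST_JOB_OUTPUT`.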
8 changes: 4 additions & 4 deletions batch/SLURM_inference_runner.sh
@@ -128,12 +128,12 @@ if [ $S3_UPLOAD == "true" ]; then
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX,'$type','csv'))")
aws s3 cp --quiet $FILENAME $S3_RESULTS_PATH/$FILENAME
done
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX,'$type','parquet'))")
aws s3 cp --quiet $FILENAME $S3_RESULTS_PATH/$FILENAME
done
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/final/', $FLEPI_SLOT_INDEX,'$type','parquet'))")
aws s3 cp --quiet $FILENAME $S3_RESULTS_PATH/$FILENAME
@@ -170,14 +170,14 @@
mkdir -p $OUT_FILENAME_DIR
cp --parents $FILENAME $FS_RESULTS_PATH
done
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/intermediate/%09d.'% $FLEPI_SLOT_INDEX,$FLEPI_BLOCK_INDEX,'$type','parquet'))")
export OUT_FILENAME_DIR="$(dirname "${FS_RESULTS_PATH}/${FILENAME}")"
mkdir -p $OUT_FILENAME_DIR
cp --parents $FILENAME $FS_RESULTS_PATH
done
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar"
for type in "seir" "hosp" "llik" "spar" "snpi" "hnpi" "hpar" "memprof"
do
export FILENAME=$(python -c "from gempyor import file_paths; print(file_paths.create_file_name('$FLEPI_RUN_INDEX','$FLEPI_PREFIX/$FLEPI_RUN_INDEX/global/final/', $FLEPI_SLOT_INDEX,'$type','parquet'))")
export OUT_FILENAME_DIR="$(dirname "${FS_RESULTS_PATH}/${FILENAME}")"
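The results-copy loops in both job scripts use `mkdir -p` plus `cp --parents` to mirror the `model_output` tree into `$FS_RESULTS_PATH`. Note that `export $OUT_FILENAME_DIR=...` (with a `$` before the name) assigns to whatever name the variable last expanded to, not to `OUT_FILENAME_DIR`; the intended bash form drops the `$`. A minimal self-contained sketch of the pattern, using throwaway temp paths rather than real run output:

```shell
# Sketch of the directory-then-copy pattern from the results-copy loops.
# Paths are illustrative only.
FS_RESULTS_PATH=$(mktemp -d)
FILENAME="model_output/global/final/000000001.seir.parquet"
mkdir -p "$(dirname "$FILENAME")"
touch "$FILENAME"

# Correct assignment: no `$` before the variable name being exported.
export OUT_FILENAME_DIR="$(dirname "${FS_RESULTS_PATH}/${FILENAME}")"
mkdir -p "$OUT_FILENAME_DIR"
cp --parents "$FILENAME" "$FS_RESULTS_PATH"  # GNU cp: recreates the source path under the target
ls "$FS_RESULTS_PATH/$FILENAME"
```

`cp --parents` is a GNU coreutils extension, which is fine on the SLURM clusters these scripts target but is not portable to BSD `cp`.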
10 changes: 9 additions & 1 deletion batch/SLURM_postprocess_runner.run
@@ -19,14 +19,17 @@ conda activate flepimop-env
which python
which Rscript

# aws cli to export plots (location according to instruction)
export PATH=~/aws-cli/bin:$PATH

# move all the slurm logs into the right folder:
mv slurm-$SLURM_ARRAY_JOB_ID_${SLURM_ARRAY_TASK_ID}.out $FS_RESULTS_PATH/slurm-$SLURM_ARRAY_JOB_ID_${SLURM_ARRAY_TASK_ID}.out

curl \
-H "Title: $FLEPI_RUN_INDEX Done ✅" \
-H "Priority: urgent" \
-H "Tags: warning,snail" \
-d "TODO say how many failure and stuff" \
-d "Hopefully the results look alright" \
ntfy.sh/flepimop_alerts

# get the slack credentials
@@ -37,3 +40,8 @@ mkdir pplot

source $FLEPI_PATH/batch/postprocessing-scripts.sh

cp -R pplot $FS_RESULTS_PATH
if [[ $S3_UPLOAD == "true" ]]; then
aws s3 cp --quiet pplot $S3_RESULTS_PATH/pplot
fi
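The Python one-liners throughout these scripts build intermediate file names with a zero-padded slot index (`'%09d.' % $FLEPI_SLOT_INDEX`) via `gempyor.file_paths`. A plain-bash approximation of that naming scheme using `printf` — the directory layout and example values here are illustrative, not the exact `create_file_name` output:

```shell
# Illustrative reconstruction of the zero-padded intermediate file naming.
FLEPI_RUN_INDEX="R17"   # example values, not from a real run
FLEPI_SLOT_INDEX=3
FLEPI_BLOCK_INDEX=1
printf -v SLOT_PREFIX '%09d.' "$FLEPI_SLOT_INDEX"   # 3 -> "000000003."
FILENAME="model_output/${FLEPI_RUN_INDEX}/global/intermediate/${SLOT_PREFIX}${FLEPI_BLOCK_INDEX}.seir.parquet"
echo "$FILENAME"
# model_output/R17/global/intermediate/000000003.1.seir.parquet
```

The nine-digit padding keeps intermediate files for up to a billion slots lexicographically sorted, which is why `ls -ltr model_output` in the job script lists them in slot order.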
