Install a markdown reader browser extension, enable access to file URLs, and open this file for a nicer rendering.
HYFAA is a python scheduler for operational hydrological forecasting. To achieve this, it combines:
- a hydrological simulation model: MGB-IPH
- retrieval routines that gather the necessary forcing and assimilation data into organised databases
- a main operational scheduler that configures MGB-IPH simulations and handles assimilation routines
- a post-processing routine that produces easy-to-use files containing the main variables
HYFAA is composed of:
- the python HYFAA code, provided in the `src/scheduler_python` folder
- the fortran MGB-IPH code, provided in the `src/MGB-IPH` folder and called by the python HYFAA code
Hardware requirements:
- CPU: no minimum requirement (at least 1 ^^)
- RAM: 8GB minimum, or more for large ensemble computations (the post-processing step requires it)
- Disk space: it is recommended to have 1GB + 1MB * (ensemble_size * number_of_days), i.e. ~185GB for a 10-year simulation with an ensemble size of 50
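The disk-space rule of thumb above can be checked with a quick shell calculation (the ensemble size and duration below are example values, not defaults):

```shell
# Disk-space estimate: 1GB base + 1MB per ensemble member per day.
# ensemble_size and number_of_days are example values for a 10-year run.
ensemble_size=50
number_of_days=3650
required_mb=$(( 1000 + ensemble_size * number_of_days ))
echo "${required_mb} MB required"  # 183500 MB, i.e. roughly the ~185GB quoted above
```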
Build the docker image with `make_docker.sh`.
- Install system dependencies: `apt-get install build-essential gfortran cmake libnetcdf-dev libnetcdff-dev`
- Install python dependencies: `pip install numpy numba scipy netCDF4 pyyaml progress pandas geopandas pytest requests SALib ftputil`
- Run `./install.sh`
NB:
- This will store the `hyfaa` python modules in your default python site-packages directory, which is already in your import paths, so no action is required. The `mgb_iph` script will be stored in `/usr/local/bin` (requires root privileges), which should be in your $PATH, so no action should be necessary.
Use this method, for instance, if you do not have root privileges on your machine:
- Run `./install.sh ${mgb_iph_install_dir}`, with `${mgb_iph_install_dir}` being any directory (it must not exist prior to installation).
- You will need to add the directories within `${mgb_iph_install_dir}` that contain the `hyfaa` and `mgb_iph` executables to your $PATH and $PYTHONPATH:
export PATH=${mgb_iph_install_dir}:${mgb_iph_install_dir}/bin:$PATH
export PYTHONPATH=$(find ${mgb_iph_install_dir} -type d -iname 'site-packages'):$PYTHONPATH
NB:
- The necessary export commands will be shown at the end of the install script.
- To avoid entering those lines every time you open a new terminal, simply add them to your ~/.bashrc.
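For example, the exports can be appended to ~/.bashrc like this (a sketch: `$HOME/hyfaa_install` is a placeholder, use the actual directory you passed to `install.sh` and the exact lines printed by the install script):

```shell
# Example only: mgb_iph_install_dir must match the directory passed to install.sh.
mgb_iph_install_dir=$HOME/hyfaa_install
# Append the export lines to ~/.bashrc so every new shell picks them up.
# \$ defers expansion of PATH/PYTHONPATH/find to when the shell starts.
cat >> ~/.bashrc <<EOF
export PATH=${mgb_iph_install_dir}:${mgb_iph_install_dir}/bin:\$PATH
export PYTHONPATH=\$(find ${mgb_iph_install_dir} -type d -iname 'site-packages'):\$PYTHONPATH
EOF
```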
This is the preferred method, as it installs an independent python environment: this allows more flexibility to add other libraries in the future and prevents problems caused by changes to HAL python modules.
- Run `module purge` to avoid module conflicts.
- Run `module load cmake netcdf/4.4.1 conda`.
- Create and activate a conda environment:
conda create -n hyfaa_env python=3.7
conda activate hyfaa_env
- Follow the "Local install from linux PC without docker" instructions from step 2.
This relies solely on HAL modules for the python environment, which makes installation easier; however, changes to HAL python modules may cause installation to fail in the future, and the SALib library is not available.
- Run `module purge` to avoid module conflicts.
- Run `module load cmake netcdf/4.4.1 python`.
- Follow the "Local install from linux PC without docker" instructions from step 3.
Use docker (the best option), or use a Unix virtual machine.
- choose a configuration folder in `work_configurations`
- follow the `README.md` inside the configuration folder to download the input_data folder (hydrological static data configuration) and initialized databases (may be optional)
- edit `run_docker.sh` to:
  - mount the chosen configuration folder as `/work`
  - adjust the hydroweb credentials
- launch `run_docker.sh`
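For reference, the mount and credential settings inside `run_docker.sh` typically look something like the sketch below; the image name, environment variable names, and configuration path are assumptions, so check the actual script before editing:

```shell
# Hypothetical excerpt of run_docker.sh: mount the chosen configuration
# folder as /work and pass hydroweb credentials to the container.
# "my_configuration", HYDROWEB_USER/HYDROWEB_PASSWORD and the "hyfaa"
# image name are examples, not confirmed names.
docker run -it --rm \
  -v "$PWD/work_configurations/my_configuration:/work" \
  -e HYDROWEB_USER="your_login" \
  -e HYDROWEB_PASSWORD="your_password" \
  hyfaa
```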
- go to the chosen configuration folder and launch `run.sh`
- go to the chosen configuration folder and launch `./run_pbs.py`
WARNING: So that modules, paths and pythonpaths are set on the node, you must either add them to your ~/.bashrc or to the `run.sh` script in the configuration folder.
NB: You can use the `--pbs_name` option to set your job name; it is `hyfaa` by default.