diff --git a/docs/Users_Guide/aggregation.rst b/docs/Users_Guide/aggregation.rst new file mode 100644 index 00000000..ea847e42 --- /dev/null +++ b/docs/Users_Guide/aggregation.rst @@ -0,0 +1,198 @@ +*********** +Aggregation +*********** + +Aggregation is an option that can be applied to MET stat output (in +the appropriate format) to calculate aggregation statistics and confidence intervals. +Input data must first be reformatted using the METdataio METreformat module to +label all the columns with the corresponding statistic name specified in the +`MET User's Guide `_ +for `point-stat `_, +`grid-stat `_, or +`ensemble-stat `_ .stat output data. + +Python Requirements +=================== + +The third-party Python packages and the corresponding version numbers are found +in the requirements.txt and nco_requirements.txt files: + +**For Non-NCO systems**: + +* `requirements.txt `_ + +**For NCO systems**: + +* `nco_requirements.txt `_ + + +Retrieve Code +============= + +Refer to the `Installation Guide `_ +for instructions. + + +Retrieve Sample Data +==================== + +The sample data used for this example is located in the $METCALCPY_BASE/test directory, +where **$METCALCPY_BASE** is the full path to the location of the METcalcpy source code +(e.g. /User/my_dir/METcalcpy). +The example data file used for this example is **rrfs_ecnt_for_agg.data**. +This data was reformatted from the MET .stat output using the METdataio METreformat module. +The reformatting step labels the columns with the corresponding statistics, based on the MET tool (point-stat, +grid-stat, or ensemble-stat). The ECNT linetype of +the MET grid-stat output has been reformatted to include the statistics names for all +`ECNT `_ specific columns. + + +Input data **must** be in this format prior to using the aggregation +module, agg_stat.py. + +The example data can be copied to a working directory, or left in this directory. 
The location
of the data will be specified in the YAML configuration file.

Please refer to the METdataio User's Guide for instructions for reformatting MET .stat files:
https://metdataio.readthedocs.io/en/develop/Users_Guide/reformat_stat_data.html


Aggregation
===========

The agg_stat module, **agg_stat.py**, is used to calculate aggregated statistics and confidence intervals.
This module can be run as a script at the command-line, or imported in another Python script.

A required YAML configuration file, **config_agg_stat.yaml**, is used to define the location of
the input data and the name and location of the output file.

The agg_stat module supports the ECNT linetype that is output from the MET
**ensemble-stat** tool.

The input to the agg_stat module must have the appropriate format. The ECNT linetype must first be
`reformatted via the METdataio METreformat module `_
by following the instructions under the **Reformatting for computing aggregation statistics with METcalcpy agg_stat**
header.

Modify the YAML configuration file
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The config_agg_stat.yaml file is required to perform aggregation statistics calculations. This
configuration file is located in the $METCALCPY_BASE/metcalcpy/pre_processing/aggregation/config
directory, where $METCALCPY_BASE is the directory where the METcalcpy source code is
saved (e.g. /Users/my_acct/METcalcpy). Change directory to $METCALCPY_BASE/metcalcpy/pre_processing/aggregation/config
and modify the config_agg_stat.yaml file.

1. Specify the input and output files

.. code-block:: yaml

   agg_stat_input: /path-to/test/data/rrfs_ecnt_for_agg.data
   agg_stat_output: /path-to/ecnt_aggregated.data

Replace *path-to* in the above two settings with the location where the input data
is stored (either a working directory or the $METCALCPY_BASE/test directory). **NOTE**:
Use the **full path** to the input and output directories (no environment variables). 
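

Since environment variables are not expanded in these settings, it can help to sanity-check the two paths before running. A minimal sketch (the placeholder paths are hypothetical, and the *cfg* dictionary stands in for the parsed YAML file):

```python
import os

# Hypothetical settings, as they would appear after parsing config_agg_stat.yaml
cfg = {
    "agg_stat_input": "/path-to/test/data/rrfs_ecnt_for_agg.data",
    "agg_stat_output": "/path-to/ecnt_aggregated.data",
}

for key in ("agg_stat_input", "agg_stat_output"):
    path = cfg[key]
    # Full paths only: no relative paths, no unexpanded environment variables
    assert os.path.isabs(path) and "$" not in path, f"{key} must be a full path: {path}"
```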


2. Specify the meteorological variables and statistics:

.. code-block:: yaml

   fcst_var_val_1:
     TMP:
       - ECNT_RMSE
       - ECNT_SPREAD_PLUS_OERR

3. Specify the selected models/members:

.. code-block:: yaml

   series_val_1:
     model:
       - RRFS_GEFS_GF.SPP.SPPT

4. Specify the statistics to be aggregated. In this case, the RMSE and SPREAD_PLUS_OERR
   statistics from the ECNT ensemble-stat tool output are to be calculated. The aggregated statistics
   are named ECNT_RMSE and ECNT_SPREAD_PLUS_OERR (the linetype name prepended to the original
   statistic name):

.. code-block:: yaml

   list_stat_1:
   - ECNT_RMSE
   - ECNT_SPREAD_PLUS_OERR

The full **config_agg_stat.yaml** file is shown below:


.. literalinclude:: ../../metcalcpy/pre_processing/aggregation/config/config_agg_stat.yaml



**NOTE**: Use full directory paths when specifying the location of the input file and output
file.


Set the Environment and PYTHONPATH
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

bash shell:

.. code-block:: ini

   export METCALCPY_BASE=/path-to-METcalcpy

csh shell:

.. code-block:: ini

   setenv METCALCPY_BASE /path-to-METcalcpy


where *path-to-METcalcpy* is the full path to where the METcalcpy source code is located
(e.g. /User/my_dir/METcalcpy).

bash shell:

.. code-block:: ini

   export PYTHONPATH=$METCALCPY_BASE/:$METCALCPY_BASE/metcalcpy

csh shell:

.. code-block:: ini

   setenv PYTHONPATH $METCALCPY_BASE/:$METCALCPY_BASE/metcalcpy


where $METCALCPY_BASE is the full path to where the METcalcpy code resides
(e.g. /User/my_dir/METcalcpy).

Run the Python script:
^^^^^^^^^^^^^^^^^^^^^^

The following are instructions for performing aggregation from the command-line:

.. 
code-block:: ini


   python $METCALCPY_BASE/metcalcpy/agg_stat.py $METCALCPY_BASE/metcalcpy/pre_processing/aggregation/config/config_agg_stat.yaml


This will generate the file **ecnt_aggregated.data** (from the agg_stat_output setting), which contains the
aggregated statistics data.


Additionally, the agg_stat.py module can be invoked by another script or module
by importing the package:

.. code-block:: python

    from metcalcpy.agg_stat import AggStat

    AGG_STAT = AggStat(PARAMS)
    AGG_STAT.calculate_stats_and_ci()

where PARAMS is a dictionary containing the parameters indicating the
location of the input and output data. The structure is similar to the
original Rscript template from which this Python implementation was derived.

**NOTE**: Remember to use the same PYTHONPATH defined above to ensure that the agg_stat module is found by
the Python import process.
\ No newline at end of file
diff --git a/docs/Users_Guide/index.rst b/docs/Users_Guide/index.rst
index 4c723235..66a60f65 100644
--- a/docs/Users_Guide/index.rst
+++ b/docs/Users_Guide/index.rst
@@ -65,6 +65,8 @@ National Center for Atmospheric Research (NCAR) is sponsored by NSF.
    installation
    vertical_interpolation
    difficulty_index
+   aggregation
+   write_mpr
    release-notes
 
 **Indices and tables**
diff --git a/docs/Users_Guide/write_mpr.rst b/docs/Users_Guide/write_mpr.rst
new file mode 100644
index 00000000..e6c52bcc
--- /dev/null
+++ b/docs/Users_Guide/write_mpr.rst
@@ -0,0 +1,98 @@
**********************
Write MPR
**********************

Description
===========

This program writes data to an output file in MET’s Matched Pair (MPR) format. It
takes several inputs, which are described in the list below. The script computes the
index and total number of observations from the observation input. It will also check to see if
the output directory is present and will create that directory if it does not exist.
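

The index and total bookkeeping just described can be illustrated on its own. A minimal sketch with hypothetical values (plain lists stand in for the 1D array inputs):

```python
# Hypothetical 1D observation input; the total is its length and the
# index values simply run from 1 to the total, as described above
data_obs = [273.1, 274.8, 275.2]

total = len(data_obs)
index_num = list(range(1, total + 1))

print(total)      # 3
print(index_num)  # [1, 2, 3]
```

These become the TOTAL and INDEX columns of each output MPR line.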


Example
=======

Examples of how to use this script can be found in the driver scripts of the use cases
listed below.

* `Stratosphere Polar `_
* `Blocking `_
* `Weather Regime `_

Information about Input Data
============================

At this time, all input arrays must be one-dimensional and should be the same size.
The script does not attempt to check that the input arrays are the same size. If any of
your input arrays are larger than the observation input array, the data will be truncated at
the length of the observation input. If an array is shorter than the observation input, the
program will exit with an error.

Currently, the following variables cannot be set and will be output as NA: FCST_THRESH,
OBS_THRESH, COV_THRESH, ALPHA, OBS_QC, CLIMO_MEAN, CLIMO_STDEV, CLIMO_CDF. Additionally, the
following variables cannot be set and have default values: INTERP_MTHD = NEAREST,
INTERP_PNTS = 1, and OBTYPE = ADPUPA.

   data_fcst: 1D array float
      forecast data to write to MPR file
   data_obs: 1D array float
      observation data to write to MPR file
   lats_in: 1D array float
      data latitudes
   lons_in: 1D array float
      data longitudes
   fcst_lead: 1D array string of format HHMMSS
      forecast lead time
   fcst_valid: 1D array string of format YYYYmmdd_HHMMSS
      forecast valid time
   obs_lead: 1D array string of format HHMMSS
      observation lead time
   obs_valid: 1D array string of format YYYYmmdd_HHMMSS
      observation valid time
   mod_name: string
      output model name (the MODEL column in MET)
   desc: string
      output description (the DESC column in MET)
   fcst_var: 1D array string
      forecast variable name
   fcst_unit: 1D array string
      forecast variable units
   fcst_lev: 1D array string
      forecast variable level
   obs_var: 1D array string
      observation variable name
   obs_unit: 1D array string
      observation variable units
   obs_lev: 1D array string
      observation variable level
   maskname: string
      name of the verification masking region
   
obsslev: 1D array string
      Pressure level of the observation in hPa or accumulation
      interval in hours
   outdir: string
      Full path to where the output data should go
   outfile_prefix: string
      Prefix to use for the output filename. The time stamp will
      be added in MET's format based on the first forecast time.


Run from a Python script
=========================

* Make sure you have these required Python packages:

  * Python 3.7

  * metcalcpy

  * numpy

  * os (part of the Python standard library)

.. code-block:: python

   write_mpr_file(data_fcst,data_obs,lats_in,lons_in,fcst_lead,fcst_valid,obs_lead,obs_valid,mod_name,desc,fcst_var,fcst_unit,fcst_lev,obs_var,obs_unit,obs_lev,maskname,obsslev,outdir,outfile_prefix)

The output will be a .stat file located in outdir with data in `MET's Matched Pair Format `_. The file will be labeled with outfile_prefix and then have the lead time, valid YYYYMMDD, and valid HHMMSS stamped onto the file name.
diff --git a/metcalcpy/contributed/blocking_weather_regime/Blocking_WeatherRegime_util.py b/metcalcpy/contributed/blocking_weather_regime/Blocking_WeatherRegime_util.py
index 1c306c6e..ca494682 100644
--- a/metcalcpy/contributed/blocking_weather_regime/Blocking_WeatherRegime_util.py
+++ b/metcalcpy/contributed/blocking_weather_regime/Blocking_WeatherRegime_util.py
@@ -17,6 +17,12 @@ def parse_steps():
+    """
+    Function to parse the steps for the Blocking and Weather Regime calculations
+    and then return them to the driver.
+    :return: Lists containing the forecast and observation steps
+    :rtype: List of strings
+    """
 
     steps_param_fcst = os.environ.get('FCST_STEPS','')
     steps_list_fcst = steps_param_fcst.split("+")
@@ -27,54 +33,46 @@
     return steps_list_fcst, steps_list_obs
 
-def write_mpr_file(data_obs,data_fcst,lats_in,lons_in,time_obs,time_fcst,mname,desc,fvar,funit,flev,ovar,ounit,olev,maskname,obslev,outfile):
-
-    dlength = len(lons_in)
-    bdims = data_obs.shape
-
-    index_num = np.arange(0,dlength,1)+1
-
-    # Get the length of 
the model, FCST_VAR, FCST_LEV, OBS_VAR, OBS_LEV, VX_MASK - mname_len = str(max([5,len(mname)])+3) - desc_len = str(max([4,len(mname)])+1) - mask_len = str(max([7,len(maskname)])+3) - fvar_len = str(max([8,len(fvar)])+3) - funit_len = str(max([8,len(funit)])+3) - flev_len = str(max([8,len(flev)])+3) - ovar_len = str(max([7,len(ovar)])+3) - ounit_len = str(max([8,len(ounit)])+3) - olev_len = str(max([7,len(olev)])+3) - - format_string = '%-7s %-'+mname_len+'s %-'+desc_len+'s %-12s %-18s %-18s %-12s %-17s %-17s %-'+fvar_len+'s ' \ - '%-'+funit_len+'s %-'+flev_len+'s %-'+ovar_len+'s %-'+ounit_len+'s %-'+olev_len+'s %-10s %-'+mask_len+'s ' \ - '%-13s %-13s %-13s %-13s %-13s %-13s %-9s\n' - format_string2 = '%-7s %-'+mname_len+'s %-'+desc_len+'s %-12s %-18s %-18s %-12s %-17s %-17s %-'+fvar_len+'s ' \ - '%-'+funit_len+'s %-'+flev_len+'s %-'+ovar_len+'s %-'+ounit_len+'s %-'+olev_len+'s %-10s %-'+mask_len+'s ' \ - '%-13s %-13s %-13s %-13s %-13s %-13s %-9s %-10s %-10s %-10s %-12.4f %-12.4f %-10s %-10s %-12.4f %-12.4f ' \ - '%-10s %-10s %-10s %-10s\n' - - # Write the file - for y in range(bdims[0]): - for dd in range(bdims[1]): - if time_fcst['valid'][y][dd]: - ft_stamp = time_fcst['lead'][y][dd]+'L_'+time_fcst['valid'][y][dd][0:8]+'_' \ - +time_fcst['valid'][y][dd][9:15]+'V' - mpr_outfile_name = outfile+'_'+ft_stamp+'.stat' - with open(mpr_outfile_name, 'w') as mf: - mf.write(format_string % ('VERSION', 'MODEL', 'DESC', 'FCST_LEAD', 'FCST_VALID_BEG', 'FCST_VALID_END', - 'OBS_LEAD', 'OBS_VALID_BEG', 'OBS_VALID_END', 'FCST_VAR', 'FCST_UNITS', 'FCST_LEV', 'OBS_VAR', - 'OBS_UNITS', 'OBS_LEV', 'OBTYPE', 'VX_MASK', 'INTERP_MTHD', 'INTERP_PNTS', 'FCST_THRESH', - 'OBS_THRESH', 'COV_THRESH', 'ALPHA', 'LINE_TYPE')) - for dpt in range(dlength): - mf.write(format_string2 % ('V9.1',mname,desc,time_fcst['lead'][y][dd],time_fcst['valid'][y][dd], - time_fcst['valid'][y][dd],time_obs['lead'][y][dd],time_obs['valid'][y][dd], - 
time_obs['valid'][y][dd],fvar,funit,flev,ovar,ounit,olev,'ADPUPA',maskname, - 'NEAREST','1','NA','NA','NA','NA','MPR',str(dlength),str(index_num[dpt]),'NA', - lats_in[dpt],lons_in[dpt],obslev,'NA',data_fcst[y,dd,dpt],data_obs[y,dd,dpt],'NA','NA', - 'NA','NA')) +def get_filenames_list(filetxt): + """ + Function that opens a text file containing the listing of input files, remove the header + which may say file_list and then return them for reading + :param filetxt: + input filename that contains the listing of input files + :type filetxt: string + :return: List containing the names of the input files + :rtype: List of strings + """ + + # Read input file + with open(filetxt) as ft: + data_infiles = ft.read().splitlines() + # Remove the first line if it's there + if (data_infiles[0] == 'file_list'): + data_infiles = data_infiles[1:] + + return data_infiles def read_nc_met(infiles,invar,nseasons,dseasons): + """ + Function to read in MET version netCDF data specifically for the blocking and weather regime + calculations. The output array needs to be in a specific format. 
+ :param infiles: + List of full paths to filenames of the data to read in + :param invar: + Variable name in the file of the data to read in + :param nseasons: + The number of years the input data contains + :param dseasons: + The number of days in each year (must be equal for all input years) + :type infiles: List of strings + :type invar: String + :type nseasons: Integer + :type dseasons: Integer + return: 4D array of data [year, day, lat, lon], latitude array, longitude array, and a time dictionary + rtype: numpy array, numpy array, numpy array, dictionary + """ print("Reading in Data") @@ -129,38 +127,3 @@ def read_nc_met(infiles,invar,nseasons,dseasons): time_dict = {'init':init_list_2d,'valid':valid_list_2d,'lead':lead_list_2d} return var_4d,lats,lons,time_dict - - -def reorder_fcst_regimes(kmeans_fcst,perc_fcst,wrc_fcst,wrnum_fcst,fcst_order): - - # Reorder the data so that the weather regime patterns match between - # the forecast and observations, is needed - kmeans_fcst_new = np.empty(kmeans_fcst.shape,dtype=float) - perc_fcst_new = np.empty(perc_fcst.shape,dtype=float) - wrc_fcst_new = np.empty(wrc_fcst.shape,dtype=float) - for wrn in np.arange(wrnum_fcst): - perc_fcst_new[wrn] = perc_fcst[fcst_order[wrn]-1] - kmeans_fcst_new[wrn,:,:] = kmeans_fcst[fcst_order[wrn]-1,:,:] - wrc_cur = np.where(wrc_fcst == fcst_order[wrn]) - wrc_fcst_new[wrc_cur] = wrn + 1 - - return kmeans_fcst_new,perc_fcst_new,wrc_fcst_new - - -def reorder_fcst_regimes_correlate(kmeans_obs,kmeans_fcst,perc_fcst,wrc_fcst,wrnum_fcst): - - # Correlate between the forecast and obs weather regimes to find the max correlation - # Use the max correlation in reordering the forecast to match the observations - matching_order = np.empty(wrnum_fcst,dtype=int) - for wrr in np.arange(wrnum_fcst): - corr_arr = np.empty(wrnum_fcst,dtype=float) - for owr in np.arange(wrnum_fcst): - curcorr = stats.pearsonr(kmeans_obs[wrr,:,:].flatten(),kmeans_fcst[owr,:,:].flatten()) - corr_arr[owr] = curcorr[0] - 
matching_order[wrr] = corr_arr.argmax()+1 - - # Call the reorder_fcst_regimes to reorder - matching_order = list(matching_order) - kmeans_fcst_new,perc_fcst_new,wrc_fcst_new = reorder_fcst_regimes(kmeans_fcst,perc_fcst,wrc_fcst,wrnum_fcst,matching_order) - - return kmeans_fcst_new,perc_fcst_new,wrc_fcst_new diff --git a/metcalcpy/contributed/blocking_weather_regime/WeatherRegime.py b/metcalcpy/contributed/blocking_weather_regime/WeatherRegime.py index cdab30f6..68c8f608 100644 --- a/metcalcpy/contributed/blocking_weather_regime/WeatherRegime.py +++ b/metcalcpy/contributed/blocking_weather_regime/WeatherRegime.py @@ -21,6 +21,41 @@ from eofs.standard import Eof +def reorder_fcst_regimes(kmeans_fcst,perc_fcst,wrc_fcst,wrnum_fcst,fcst_order): + + # Reorder the data so that the weather regime patterns match between + # the forecast and observations, is needed + kmeans_fcst_new = np.empty(kmeans_fcst.shape,dtype=float) + perc_fcst_new = np.empty(perc_fcst.shape,dtype=float) + wrc_fcst_new = np.empty(wrc_fcst.shape,dtype=float) + for wrn in np.arange(wrnum_fcst): + perc_fcst_new[wrn] = perc_fcst[fcst_order[wrn]-1] + kmeans_fcst_new[wrn,:,:] = kmeans_fcst[fcst_order[wrn]-1,:,:] + wrc_cur = np.where(wrc_fcst == fcst_order[wrn]) + wrc_fcst_new[wrc_cur] = wrn + 1 + + return kmeans_fcst_new,perc_fcst_new,wrc_fcst_new + + +def reorder_fcst_regimes_correlate(kmeans_obs,kmeans_fcst,perc_fcst,wrc_fcst,wrnum_fcst): + + # Correlate between the forecast and obs weather regimes to find the max correlation + # Use the max correlation in reordering the forecast to match the observations + matching_order = np.empty(wrnum_fcst,dtype=int) + for wrr in np.arange(wrnum_fcst): + corr_arr = np.empty(wrnum_fcst,dtype=float) + for owr in np.arange(wrnum_fcst): + curcorr = stats.pearsonr(kmeans_obs[wrr,:,:].flatten(),kmeans_fcst[owr,:,:].flatten()) + corr_arr[owr] = curcorr[0] + matching_order[wrr] = corr_arr.argmax()+1 + + # Call the reorder_fcst_regimes to reorder + matching_order = 
list(matching_order) + kmeans_fcst_new,perc_fcst_new,wrc_fcst_new = reorder_fcst_regimes(kmeans_fcst,perc_fcst,wrc_fcst,wrnum_fcst,matching_order) + + return kmeans_fcst_new,perc_fcst_new,wrc_fcst_new + + class WeatherRegimeCalculation(): """Contains the programs to perform a Weather Regime Analysis """ diff --git a/metcalcpy/util/write_mpr.py b/metcalcpy/util/write_mpr.py new file mode 100644 index 00000000..a515c40d --- /dev/null +++ b/metcalcpy/util/write_mpr.py @@ -0,0 +1,122 @@ +# ============================* + # ** Copyright UCAR (c) 2020 + # ** University Corporation for Atmospheric Research (UCAR) + # ** National Center for Atmospheric Research (NCAR) + # ** Research Applications Lab (RAL) + # ** P.O.Box 3000, Boulder, Colorado, 80307-3000, USA + # ============================* +import os +import numpy as np + + +def write_mpr_file(data_fcst,data_obs,lats_in,lons_in,fcst_lead,fcst_valid,obs_lead,obs_valid,mod_name,desc,fcst_var,fcst_unit,fcst_lev,obs_var,obs_unit,obs_lev,maskname,obsslev,outdir,outfile_prefix): + + """ + Function to write an output mpr file given a 1d array of observation and forecast data + Parameters: + ---------- + data_fcst: 1D array float + forecast data to write to MPR file + data_obs: 1D array float + observation data to write to MPR file + lats_in: 1D array float + data latitudes + lons_in: 1D array float + data longitudes + fcst_lead: 1D array string of format HHMMSS + forecast lead time + fcst_valid: 1D array string of format YYYYmmdd_HHMMSS + forecast valid time + obs_lead: 1D array string of format HHMMSS + observation lead time + obs_valid: 1D array string of format YYYYmmdd_HHMMSS + observation valid time + mod_name: string + output model name (the MODEL column in MET) + desc: string + output description (the DESC column in MET) + fcst_var: 1D array string + forecast variable name + fcst_unit: 1D array string + forecast variable units + fcst_lev: 1D array string + forecast variable level + obs_var: 1D array string + 
observation variable name + obs_unit: 1D array string + observation variable units + obs_lev: 1D array string + observation variable level + maskname: string + name of the verification masking region + obsslev: 1D array string + Pressure level of the observation in hPA or accumulation + interval in hours + outdir: string + Full path including where the output data should go + outfile_prefix: string + Prefix to use for the output filename. The time stamp will + be added in MET's format based off the first forecast time + """ + + """ + Get the data length to create the INDEX and TOTAL variables in the MPR line + """ + dlength = len(data_obs) + index_num = np.arange(0,dlength,1)+1 + + """ + Get the length of the model, FCST_VAR, FCST_LEV, OBS_VAR, OBS_LEV, VX_MASK, etc for formatting + """ + mname_len = str(max([5,len(mod_name)])+3) + desc_len = str(max([4,len(desc)])+3) + mask_len = str(max([7,len(maskname)])+3) + fvar_len = str(max([8,max([len(l) for l in fcst_var])])+3) + funit_len = str(max([8,max([len(l) for l in fcst_unit])])+3) + flev_len = str(max([8,max([len(l) for l in fcst_lev])])+3) + ovar_len = str(max([7,max([len(l) for l in obs_var])])+3) + ounit_len = str(max([8,max([len(l) for l in obs_unit])])+3) + olev_len = str(max([7,max([len(l) for l in obs_lev])])+3) + + """ + Set up format strings for the header (format_string) and data (format_string2) + """ + format_string = '%-7s %-'+mname_len+'s %-'+desc_len+'s %-12s %-18s %-18s %-12s %-17s %-17s %-'+fvar_len+'s ' \ + '%-'+funit_len+'s %-'+flev_len+'s %-'+ovar_len+'s %-'+ounit_len+'s %-'+olev_len+'s %-10s %-'+mask_len+'s ' \ + '%-13s %-13s %-13s %-13s %-13s %-13s %-9s\n' + format_string2 = '%-7s %-'+mname_len+'s %-'+desc_len+'s %-12s %-18s %-18s %-12s %-17s %-17s %-'+fvar_len+'s ' \ + '%-'+funit_len+'s %-'+flev_len+'s %-'+ovar_len+'s %-'+ounit_len+'s %-'+olev_len+'s %-10s %-'+mask_len+'s ' \ + '%-13s %-13s %-13s %-13s %-13s %-13s %-9s %-10s %-10s %-10s %-12.4f %-12.4f %-10s %-10s %-12.4f %-12.4f ' \ + 
'%-10s %-10s %-10s %-10s\n'
+
+    """
+    Create the output directory if it doesn't exist
+    """
+    if not os.path.exists(outdir):
+        os.makedirs(outdir)
+
+    """
+    Put the timestamp on the output file
+    """
+    fcst_valid_str = fcst_valid[0]
+    ft_stamp = fcst_lead[0]+'L_'+fcst_valid_str[0:8]+'_'+fcst_valid_str[9:15]+'V'
+    full_outfile = os.path.join(outdir,outfile_prefix+'_'+ft_stamp+'.stat')
+
+    """
+    Write the file
+    """
+    #print('Writing output MPR file: '+full_outfile)
+    with open(full_outfile, 'w') as mf:
+        # Write the header
+        mf.write(format_string % ('VERSION', 'MODEL', 'DESC', 'FCST_LEAD', 'FCST_VALID_BEG', 'FCST_VALID_END',
+                 'OBS_LEAD', 'OBS_VALID_BEG', 'OBS_VALID_END', 'FCST_VAR', 'FCST_UNITS', 'FCST_LEV', 'OBS_VAR',
+                 'OBS_UNITS', 'OBS_LEV', 'OBTYPE', 'VX_MASK', 'INTERP_MTHD', 'INTERP_PNTS', 'FCST_THRESH',
+                 'OBS_THRESH', 'COV_THRESH', 'ALPHA', 'LINE_TYPE'))
+        for dpt in range(dlength):
+            # Write the data
+            mf.write(format_string2 % ('V9.1',mod_name,desc,fcst_lead[dpt],fcst_valid[dpt],fcst_valid[dpt],
+                     obs_lead[dpt],obs_valid[dpt],obs_valid[dpt],fcst_var[dpt],fcst_unit[dpt],fcst_lev[dpt],
+                     obs_var[dpt],obs_unit[dpt],obs_lev[dpt],'ADPUPA',maskname,'NEAREST','1','NA','NA','NA','NA','MPR',
+                     str(dlength),str(index_num[dpt]),'NA',lats_in[dpt],lons_in[dpt],obsslev[dpt],'NA',data_fcst[dpt],
+                     data_obs[dpt],'NA','NA','NA','NA'))
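

The output filename stamping used by write_mpr.py above can be exercised on its own. A minimal sketch with hypothetical lead and valid times (the prefix and output directory are placeholders):

```python
import os

# Hypothetical first forecast lead (HHMMSS) and valid time (YYYYmmdd_HHMMSS)
fcst_lead = ["120000"]
fcst_valid = ["20190101_120000"]
outfile_prefix = "polar_cap_T"  # hypothetical prefix
outdir = "/path-to/out"         # hypothetical output directory

# Same stamping scheme as the module: <lead>L_<YYYYmmdd>_<HHMMSS>V
fcst_valid_str = fcst_valid[0]
ft_stamp = fcst_lead[0] + 'L_' + fcst_valid_str[0:8] + '_' + fcst_valid_str[9:15] + 'V'
full_outfile = os.path.join(outdir, outfile_prefix + '_' + ft_stamp + '.stat')

print(full_outfile)  # /path-to/out/polar_cap_T_120000L_20190101_120000V.stat
```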