Prepare initial conditions

There are two types of initial conditions for the global-workflow:

  1. Warm start: these ICs are taken directly from either the GFS in production or an experiment "warmed" up (at least one cycle in).
  2. Cold start: any ICs converted to a new resolution or grid (e.g. GSM-GFS -> FV3GFS). These ICs are often prepared by chgres_cube (change resolution utility).

Most users will initiate their experiments with cold start ICs unless running high resolution (C768 deterministic with C384 EnKF) for a date with warm starts available. It is not recommended to run high resolution unless required or as part of final testing.

Resolutions:

  • C48 = 2 degrees ≈ 200 km
  • C96 = 1 degree ≈ 100 km
  • C192 = 1/2 degree ≈ 50 km
  • C384 = 1/4 degree ≈ 25 km
  • C768 = 1/8 degree ≈ 13 km
  • C1152 ≈ 9 km
  • C3072 ≈ 3 km

Supported resolutions in global-workflow: C48, C96, C192, C384, C768
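
A rough rule of thumb relating a cubed-sphere resolution (CRES) to nominal grid spacing is Earth's circumference divided by 4 x CRES. A minimal shell sketch of that arithmetic (the awk one-liner and the 40075 km circumference value are illustrative, not part of the workflow):

# Approximate grid spacing in km for a given cubed-sphere resolution
CRES=384
awk -v c=$CRES 'BEGIN {printf "C%d ~ %.0f km\n", c, 40075.0/(4*c)}'   # prints "C384 ~ 26 km"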

Automated Generation

Cycled mode

Not yet supported. See the Manual Generation section below for how to create your ICs yourself (outside of the workflow).

Free-forecast mode

Free-forecast mode in global-workflow includes getic and init jobs for the gfs suite. The getic job pulls inputs for chgres_cube (init job) or warm start ICs into your ROTDIR/COMROT. The init job then ingests those files to produce initial conditions for your experiment.

Users on machines without HPSS access (e.g. Orion) need to perform the getic step manually and stage inputs for the init job. The table below lists the needed files for init and where to place them in your ROTDIR.

Note for table: yy=year; mm=month; dd=day; hh=cycle

Operations/production output location on HPSS: /NCEPPROD/hpssprod/runhistory/rh${yy}/${yy}${mm}/${yy}${mm}${dd}/

Source | Files | Tarball name | Where in ROTDIR
v12 ops | gfs.t${hh}z.sanl, gfs.t${hh}z.sfcanl | com_gfs_prod_gfs.${yy}${mm}${dd}${hh}.anl.tar | gfs.${yy}${mm}${dd}/${hh}
v13 ops | gfs.t${hh}z.sanl, gfs.t${hh}z.sfcanl | com2_gfs_prod_gfs.${yy}${mm}${dd}${hh}.anl.tar | gfs.${yy}${mm}${dd}/${hh}
v14 ops | gfs.t${hh}z.atmanl.nemsio, gfs.t${hh}z.sfcanl.nemsio | gpfs_hps_nco_ops_com_gfs_prod_gfs.${yy}${mm}${dd}${hh}.anl.tar | gfs.${yy}${mm}${dd}/${hh}
v15 ops (pre-2020022600) | gfs.t${hh}z.atmanl.nemsio, gfs.t${hh}z.sfcanl.nemsio | gpfs_dell1_nco_ops_com_gfs_prod_gfs.${yy}${mm}${dd}_${hh}.gfs_nemsioa.tar | gfs.${yy}${mm}${dd}/${hh}
v15 ops | gfs.t${hh}z.atmanl.nemsio, gfs.t${hh}z.sfcanl.nemsio | com_gfs_prod_gfs.${yy}${mm}${dd}_${hh}.gfs_nemsioa.tar | gfs.${yy}${mm}${dd}/${hh}
v16 ops | gfs.t${hh}z.atmanl.nc, gfs.t${hh}z.sfcanl.nc | com_gfs_prod_gfs.${yy}${mm}${dd}_${hh}.gfs_nca.tar | gfs.${yy}${mm}${dd}/${hh}/atmos
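
For example, on a machine with HPSS access you could stage the v16 analysis files for a hypothetical 2021032500 cycle as shown below, then copy them (e.g. with scp or rsync) to the machine without HPSS access. The cycle date, ROTDIR path, and extracted member names are illustrative; list the tarball contents first with "htar -tvf" since the internal paths may differ:

# Hypothetical v16 example for cycle 2021032500
ROTDIR=/path/to/my/comrot/mytest
mkdir -p $ROTDIR/gfs.20210325/00/atmos
cd $ROTDIR/gfs.20210325/00/atmos
htar -tvf /NCEPPROD/hpssprod/runhistory/rh2021/202103/20210325/com_gfs_prod_gfs.20210325_00.gfs_nca.tar
htar -xvf /NCEPPROD/hpssprod/runhistory/rh2021/202103/20210325/com_gfs_prod_gfs.20210325_00.gfs_nca.tar ./gfs.t00z.atmanl.nc ./gfs.t00z.sfcanl.nc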

Manual Generation

Cold starts

The following information is for users needing to generate initial conditions for a cycled experiment that will run at a different resolution or number of vertical layers than the operational GFS (C768 deterministic / C384 EnKF with 64 layers).

The new chgres_cube code is available from the UFS_UTILS repository on GitHub (maintained by George Gayno) and can be used to convert GFS ICs to a different resolution or number of layers. The chgres_cube code/scripts currently support the following GFS inputs:

  • pre-GFSv14 (GFS-GSM)
  • GFSv14 (GFS-GSM)
  • GFSv15 (FV3GFS)

Clone UFS_UTILS:

git clone --recursive https://github.com/NOAA-EMC/UFS_UTILS.git

Build UFS_UTILS:

sh build_all.sh
cd fix
sh link_fixdirs.sh emc $MACHINE

...where $MACHINE is "cray", "dell", "hera", or "jet".

Configure your conversion:

cd util/gdas_init
vi config

Read the doc block at the top of the config and adjust the variables to meet your needs (e.g. yy, mm, dd, hh for SDATE).
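
A minimal sketch of the sort of settings involved, here for converting the 2019090900 cycle (the variable names and values below are illustrative assumptions; the doc block in the config file is the authoritative reference):

yy=2019                                 # year of the cycle to convert
mm=09                                   # month
dd=09                                   # day
hh=00                                   # cycle hour
OUTDIR=/path/to/scratch/chgres_output   # where converted ICs are written (assumed variable name)
CRES_HIRES=C384                         # target deterministic resolution (assumed variable name)
CRES_ENKF=C192                          # target EnKF resolution (assumed variable name)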

Submit conversion script:

./driver.$MACHINE.sh

...where $MACHINE is currently "dell" or "cray" or "hera". Additional options will be available as support for other machines expands.

90 small jobs will be submitted:

  • 9 jobs to pull inputs off HPSS (1 for deterministic and 8 for the EnKF ensemble members)
  • 81 jobs to run chgres (1 for deterministic/hires and 80 for each EnKF ensemble member)

The chgres jobs will have a dependency on the data-pull jobs and will wait to run until all data-pull jobs have completed.
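
You can monitor the submitted jobs with the scheduler's queue listing; for example, on a Slurm machine such as Hera (a generic Slurm command, not part of the driver scripts):

squeue -u $USER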

Check output:

In the config you will have defined an output folder called $OUTDIR. The converted/chgres'd output will be found there, including the needed abias and radstat initial condition files. The files are already in the directory structure that the global-workflow system expects, so you can move the contents of $OUTDIR directly into your $ROTDIR/$COMROT.
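
A minimal sketch of that final move (both paths are placeholders for your own configured locations):

OUTDIR=/path/to/scratch/chgres_output
ROTDIR=/path/to/my/comrot/mytest
cp -r $OUTDIR/* $ROTDIR/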

Report bugs:

This is a preliminary version of the new chgres_cube code/scripts. Please report bugs to George Gayno (george.gayno@noaa.gov) and Kate Friedman (kate.friedman@noaa.gov).

Warm starts (from production)

The GFSv15 was implemented into production on June 12th, 2019 at 12z. The GFS was spun up ahead of that cycle, so production output for the system is available from the 00z cycle (2019061200) onward. Production output tarballs from the prior GFSv14 system are located in the same place on HPSS but have "hps" in the name, indicating that it ran on the Cray, whereas the GFS now runs in production on the Dell and has "dell1" in the tarball name.

See production output in the following location on HPSS:

/NCEPPROD/hpssprod/runhistory/rhYYYY/YYYYMM/YYYYMMDD

Example location:

/NCEPPROD/hpssprod/runhistory/rh2019/201907/20190704

Example listing for 2019070400 production tarballs:

[Kate.Friedman@m72a2 ~]$ hpsstar dir /NCEPPROD/hpssprod/runhistory/rh2019/201907/20190704 | grep gfs | grep 20190704_00
/NCEPPROD/hpssprod/runhistory/rh2019/201907:
drwxr-xr-x    2 nwprod    prod           12800 Jul 10 07:39 20190704
[connecting to hpsscore1.fairmont.rdhpcs.noaa.gov/1217]
-rw-r-----    1 nwprod    rstprod  24201632768 Jul  6 10:39 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas.tar
-rw-r--r--    1 nwprod    prod           11040 Jul  6 10:39 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas.tar.idx
-rw-r-----    1 nwprod    rstprod  104316883456 Jul  6 15:20 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp1.tar
-rw-r--r--    1 nwprod    prod          246560 Jul  6 15:20 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp1.tar.idx
-rw-r-----    1 nwprod    rstprod  104316883456 Jul  6 15:39 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp2.tar
-rw-r--r--    1 nwprod    prod          246560 Jul  6 15:39 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp2.tar.idx
-rw-r-----    1 nwprod    rstprod  104316883456 Jul  6 15:57 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp3.tar
-rw-r--r--    1 nwprod    prod          246560 Jul  6 15:57 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp3.tar.idx
-rw-r-----    1 nwprod    rstprod  104316883456 Jul  6 16:17 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp4.tar
-rw-r--r--    1 nwprod    prod          246560 Jul  6 16:17 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp4.tar.idx
-rw-r-----    1 nwprod    rstprod  104316883456 Jul  6 16:38 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp5.tar
-rw-r--r--    1 nwprod    prod          246560 Jul  6 16:38 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp5.tar.idx
-rw-r-----    1 nwprod    rstprod  104316883456 Jul  6 16:58 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp6.tar
-rw-r--r--    1 nwprod    prod          246560 Jul  6 16:58 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp6.tar.idx
-rw-r-----    1 nwprod    rstprod  104316883456 Jul  6 17:17 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp7.tar
-rw-r--r--    1 nwprod    prod          246560 Jul  6 17:17 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp7.tar.idx
-rw-r-----    1 nwprod    rstprod  104316883456 Jul  6 17:36 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp8.tar
-rw-r--r--    1 nwprod    prod          246560 Jul  6 17:36 gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190704_00.enkfgdas_restart_grp8.tar.idx
-rw-r-----    1 nwprod    rstprod   8213389824 Jul  6 04:57 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas.tar
-rw-r--r--    1 nwprod    prod          305440 Jul  6 04:57 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas.tar.idx
-rw-r--r--    1 nwprod    prod       760274432 Jul  6 04:57 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas_flux.tar
-rw-r--r--    1 nwprod    prod            4896 Jul  6 04:57 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas_flux.tar.idx
-rw-r--r--    1 nwprod    prod     95334748160 Jul  6 05:22 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas_nemsio.tar
-rw-r--r--    1 nwprod    prod            8480 Jul  6 05:22 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas_nemsio.tar.idx
-rw-r--r--    1 nwprod    prod      3623646720 Jul  6 04:57 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas_pgrb2.tar
-rw-r--r--    1 nwprod    prod           31520 Jul  6 04:57 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas_pgrb2.tar.idx
-rw-r-----    1 nwprod    rstprod  40406691840 Jul  6 05:04 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas_restart.tar
-rw-r--r--    1 nwprod    prod           26400 Jul  6 05:04 gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190704_00.gdas_restart.tar.idx
-rw-r-----    1 nwprod    rstprod  21489377280 Jul  6 05:26 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs.tar
-rw-r--r--    1 nwprod    prod         2031392 Jul  6 05:26 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs.tar.idx
-rw-r--r--    1 nwprod    prod     46592740864 Jul  6 05:34 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_flux.tar
-rw-r--r--    1 nwprod    prod          214816 Jul  6 05:34 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_flux.tar.idx
-rw-r--r--    1 nwprod    prod     294403269120 Jul  6 07:01 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_nemsioa.tar
-rw-r--r--    1 nwprod    prod           23328 Jul  6 07:01 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_nemsioa.tar.idx
-rw-r--r--    1 nwprod    prod     336908471296 Jul  6 08:05 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_nemsiob.tar
-rw-r--r--    1 nwprod    prod           26912 Jul  6 08:05 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_nemsiob.tar.idx
-rw-r--r--    1 nwprod    prod     63337960960 Jul  6 05:44 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_pgrb2.tar
-rw-r--r--    1 nwprod    prod          400672 Jul  6 05:44 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_pgrb2.tar.idx
-rw-r--r--    1 nwprod    prod     43709473792 Jul  6 05:52 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_pgrb2b.tar
-rw-r--r--    1 nwprod    prod          400160 Jul  6 05:52 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_pgrb2b.tar.idx
-rw-r--r--    1 nwprod    prod     12637940736 Jul  6 05:55 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_restart.tar
-rw-r--r--    1 nwprod    prod            5408 Jul  6 05:55 gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190704_00.gfs_restart.tar.idx

The warm starts and other output from production are at C768 (deterministic) and C384 (EnKF) resolution. If you wish to run at a different resolution, the warm start files must first be converted to your desired resolution(s) using global_chgres; if you are running a C768/C384 experiment you can use them as-is.

What files should you pull for starting a new experiment with warm starts from production?

That depends on the mode you want to run: free-forecast or cycled. Whichever mode you choose, navigate to the top of your COMROT and pull the entirety of the tarball(s) listed below for that mode. The files within the tarballs are already in the $CDUMP.$PDY/$CYC folder structure expected by the system.

free-forecast

Two tarballs to pull:

File #1 (for starting cycle SDATE):

/NCEPPROD/hpssprod/runhistory/rhYYYY/YYYYMM/YYYYMMDD/gpfs_dell1_nco_ops_com_gfs_prod_gfs.YYYYMMDD_CC.gfs_restart.tar

File #2 (for prior cycle GDATE=SDATE-06):

/NCEPPROD/hpssprod/runhistory/rhYYYY/YYYYMM/YYYYMMDD/gpfs_dell1_nco_ops_com_gfs_prod_gdas.YYYYMMDD_CC.gdas_restart.tar
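
For example, for a hypothetical SDATE of 2019090900 (GDATE 2019090818), using the hpsstar utility from the top of your COMROT:

cd /scratch1/NCEPDEV/stmp4/Joe.Schmo/comrot/mytest
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_gfs.20190909_00.gfs_restart.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190908_18.gdas_restart.tar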

cycled

There are 18 tarballs to pull (9 for SDATE and 9 for GDATE (SDATE-06)):

HPSS path: /NCEPPROD/hpssprod/runhistory/rhYYYY/YYYYMM/YYYYMMDD/

Tarballs per cycle:

gpfs_dell1_nco_ops_com_gfs_prod_gdas.YYYYMMDD_CC.gdas_restart.tar
gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp1.tar
gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp2.tar
gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp3.tar
gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp4.tar
gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp5.tar
gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp6.tar
gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp7.tar
gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.YYYYMMDD_CC.enkfgdas_restart_grp8.tar

Go to the top of your COMROT/ROTDIR and pull the contents of all tarballs there. The tarballs already contain the needed directory structure.

Example for SDATE 2019090900 using the hpsstar utility:

cd /scratch1/NCEPDEV/stmp4/Joe.Schmo/comrot/mytest
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190909_00.gdas_restart.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190909_00.enkfgdas_restart_grp1.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190909_00.enkfgdas_restart_grp2.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190909_00.enkfgdas_restart_grp3.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190909_00.enkfgdas_restart_grp4.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190909_00.enkfgdas_restart_grp5.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190909_00.enkfgdas_restart_grp6.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190909_00.enkfgdas_restart_grp7.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190909/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190909_00.enkfgdas_restart_grp8.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_gdas.20190908_18.gdas_restart.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190908_18.enkfgdas_restart_grp1.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190908_18.enkfgdas_restart_grp2.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190908_18.enkfgdas_restart_grp3.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190908_18.enkfgdas_restart_grp4.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190908_18.enkfgdas_restart_grp5.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190908_18.enkfgdas_restart_grp6.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190908_18.enkfgdas_restart_grp7.tar
hpsstar get /NCEPPROD/hpssprod/runhistory/rh2019/201909/20190908/gpfs_dell1_nco_ops_com_gfs_prod_enkfgdas.20190908_18.enkfgdas_restart_grp8.tar

Warm starts (from pre-production parallels)

The most recent pre-implementation parallel series was for GFS v15 (Q2FY19):

  • What resolution are warm-starts available for? Warm-start ICs are saved at the resolution the model was run at (C768/C384) and can only be used to run at the same resolution combination. If you need to run a different resolution you will need to make your own cold-start ICs. See cold start section above.
  • What dates have warm-start files saved? Unfortunately, the save frequency changed enough during the runs that it is not easy to provide a definitive list.
  • What files? All warm starts are saved in separate tarballs which include “restart” in the name. You need to pull the entirety of each tarball; all files included in the restart tarballs are needed.
  • Where are these tarballs? See below for the location on HPSS for each Q2FY19 pre-implementation parallel.
  • What tarballs do I need to grab for my experiment? Tarballs from two cycles are required. The tarballs are listed below, where $CDATE is your starting cycle and $GDATE is one cycle prior.
    • Free-forecast:
      • ../$CDATE/gfs_restarta.tar
      • ../$GDATE/gdas_restartb.tar
    • Cycled w/EnKF:
      • ../$CDATE/gdas_restarta.tar
      • ../$CDATE/enkfgdas_restarta_grp##.tar (where ## is 01 through 08) (note, older tarballs may include a period between enkf and gdas: "enkf.gdas")
      • ../$GDATE/gdas_restartb.tar
      • ../$GDATE/enkfgdas_restartb_grp##.tar (where ## is 01 through 08) (note, older tarballs may include a period between enkf and gdas: "enkf.gdas")
  • Where do I put the warm-start initial conditions? Extraction should occur right inside your COMROT. You may need to rename the enkf folder (enkf.gdas.$PDY -> enkfgdas.$PDY); see the sketch after the table below.
Time Period | Parallel Name | Archive Location on HPSS (under /NCEPDEV/emc-global/5year/)
Real-time (05/25/2018 ~ 06/12/2019) | prfv3rt1 | .../emc.glopara/WCOSS_C/Q2FY19/prfv3rt1
2017/2018 Winter/Spring (11/25/2017 ~ 05/31/2018) | fv3q2fy19retro1 | .../Fanglin.Yang/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro1
2017 Summer/Fall Part 1 (05/25/2017 ~ 08/31/2017) | fv3q2fy19retro2 | .../emc.glopara/WCOSS_C/Q2FY19/fv3q2fy19retro2
2017 Summer/Fall Part 2 (08/02/2017 ~ 11/30/2017) | fv3q2fy19retro2 | .../Fanglin.Yang/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro2
2016/2017 Winter/Spring (11/25/2016 ~ 05/31/2017) | fv3q2fy19retro3 | .../Fanglin.Yang/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro3
2016 Summer/Fall Part 1 (05/22/2016 ~ 08/25/2016) | fv3q2fy19retro4 | .../emc.glopara/WCOSS_C/Q2FY19/fv3q2fy19retro4
2016 Summer/Fall Part 2 (08/17/2016 ~ 11/30/2016) | fv3q2fy19retro4 | .../emc.glopara/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro4
2015/2016 Winter/Spring (11/25/2015 ~ 05/31/2016) | fv3q2fy19retro5 | .../emc.glopara/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro5
2015 Summer/Fall (05/03/2015 ~ 11/30/2015) | fv3q2fy19retro6 | .../emc.glopara/WCOSS_DELL_P3/Q2FY19/fv3q2fy19retro6
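
A minimal sketch of the enkf folder rename mentioned above (the COMROT path and the $PDY value are placeholders):

cd /path/to/my/comrot/mytest
mv enkf.gdas.20180525 enkfgdas.20180525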