
This issue was moved to a discussion. You can continue the conversation there.


Regarding Preprocessing using create_uv.perl file #25

Closed · vinni94 opened this issue Sep 26, 2022 · 10 comments
Labels: help wanted (Extra attention is needed)

Comments


vinni94 commented Sep 26, 2022

Hi,
I am raising this issue because, while compiling the Fortran code, the Perl script create_UV.perl uses a linked library path like -L/home/user/Documents/NCEPLIBS-bacio-develop/NCEPLIBS-w3emc-develop/w3emc/lib -lw, and this throws an 'undefined -lw' error. I cannot figure out which library it is trying to use.
After some research I found that the NCEP libraries used with these Fortran programs are called bacio and w3emc. I have installed both libraries; however, even when I link the lib directories of these installations, the error remains the same. Please help.

CharlesZheZhang self-assigned this Sep 27, 2022

CharlesZheZhang (Collaborator) commented:

Hi Vinni,

It looks like you are using the GLDAS forcing. Please note that the GLDAS forcing data switched from grib to netcdf format two years ago. For this change, now there is a separate script, create_UV_netcdf.perl, to deal with the wind component. Can you try this Perl script and see if it works?

The details about pre-processing GLDAS in netcdf format are documented in this file:
https://github.com/NCAR/hrldas/blob/master/hrldas/docs/README.GLDAS


vinni94 commented Sep 27, 2022 via email


vinni94 commented Sep 27, 2022

Here are the last few lines of the error:

flnm = geo_em.d01.nc
 Done with subroutine read_geo_em_file
 Date = 2018-01-03_00  ihour =            0
             :  Checking for file '/media/user/Elements/GLDAS_data/2018/noahmp_extracted/Tair/GLDAS_Tair_f_inst_2018010300'
             :  Found file /media/user/Elements/GLDAS_data/2018/noahmp_extracted/Tair/GLDAS_Tair_f_inst_2018010300
 Succesfully read the file:
 /media/user/Elements/GLDAS_data/2018/noahmp_extracted/Tair/GLDAS_Tair_f_inst_2018010300

 Can't read variable:
 Tair_f_inst

 Returning error flag from get_single_datastruct_from_netcdf (1)
 flnm =
 /media/user/Elements/GLDAS_data/2018/noahmp_extracted/Tair/GLDAS_Tair_f_inst_2018010300
 ierr =          -57
Field label= T2D
No previous data.

CharlesZheZhang (Collaborator) commented:

Hi Vinni,
This error message means the code hit an error when reading that variable from the netcdf file, so it returned ierr /= 0.
Can you check that all variables were extracted properly from GLDAS in the previous steps? You can also attach the files and the namelist to this issue, and I will see if I can help.

Regards,
Zhe


vinni94 commented Sep 27, 2022

Dear Zhe, thanks for the help.
I am enclosing the following files:

  1. Sample GLDAS files produced after running all the new Perl scripts for netcdf processing.
  2. The namelist file along with the geo_em.d01 file for my domain. I have also attached a small test Fortran file (test_nc.F, compiled and linked against netcdf 4.8.x) to check whether the variable is being read properly. It reads correctly there, but not in the create_forcing_netcdf.F file.

GLDAS_sample_data.zip
namelist_geo_em.zip


vinni94 commented Sep 27, 2022

PS: Does the global GLDAS dataset need to be downloaded for forcing? I have only used GLDAS data subsetted to my region.

CharlesZheZhang (Collaborator) commented:

Hi Vinni,

Yes, that is the case: the read_netcdf_unit subroutine in the create_forcing_netcdf.F code requires exactly the global-coverage dimensions (600x1440).
Please try with the global data and see if it works.

Zhe
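For reference, the grid requirement Zhe describes can be expressed as a simple pre-flight check before running the preprocessing. This is a minimal Python sketch, not HRLDAS code; the 600x1440 size comes from the comment above, and the function name is hypothetical:

```python
# Hypothetical pre-flight check mirroring the requirement in
# create_forcing_netcdf.F: forcing fields must be on the full
# global GLDAS grid (600 latitudes x 1440 longitudes).
EXPECTED_GLDAS_DIMS = (600, 1440)

def is_global_gldas_grid(dims):
    """Return True if (nlat, nlon) matches the global GLDAS grid."""
    return tuple(dims) == EXPECTED_GLDAS_DIMS

print(is_global_gldas_grid((600, 1440)))  # global file -> True
print(is_global_gldas_grid((120, 200)))   # regional subset -> False
```

A regionally subsetted file fails this check, which is why the read step above returned an error.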


vinni94 commented Sep 27, 2022

That's great. Thank you Zhe. 😄
I will try doing the same with global data.


vinni94 commented Oct 6, 2022

Dear CharlesZheZhang,
Thanks for your help so far. I was able to run HRLDAS successfully after downloading the global GLDAS files. I have a small query regarding the HRLDAS namelist: I can see a variable named SPINUP_LOOPS, which I guess is used for model spinup, but I am unsure whether its units are hours or days.

  1. Let's say I want to coldstart/spin up my model for 2 years * 20 times. What value of SPINUP_LOOPS should I give? After spinup, should I use the last restart file generated for my analysis runs?
  2. When I tested spinup for, let's say, 16 days, the model ran each day 30 times. How can I achieve what I described in question 1?

cenlinhe (Collaborator) commented:

Hi Vinni, to answer your questions:

  1. Set SPINUP_LOOPS=20 and KDAYS=730 (i.e., 365*2, assuming no leap year). After spinup, you need to use the latest restart file generated (check the timestamp of each restart file and use the most recent one).
  2. The spinup capability repeats the run for the KDAYS you specify. So if you want to run each day 30 times, set SPINUP_LOOPS=30 and KDAYS=1 for spinup.
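The arithmetic behind point 1 can be sketched as a small helper. This is a minimal Python sketch; the namelist variable names SPINUP_LOOPS and KDAYS are taken from the comments above, and the helper itself is hypothetical:

```python
# Translate a desired spinup plan (repeat a period of N years, M times)
# into the HRLDAS namelist values discussed above.
def spinup_settings(years, loops, leap_days=0):
    """Return SPINUP_LOOPS/KDAYS for looping a multi-year period."""
    kdays = years * 365 + leap_days
    return {"SPINUP_LOOPS": loops, "KDAYS": kdays}

# 2 years repeated 20 times, no leap year in the period:
print(spinup_settings(2, 20))  # {'SPINUP_LOOPS': 20, 'KDAYS': 730}
```

With this framing, the 16-day test in question 2 corresponds to KDAYS=16, and "each day 30 times" to KDAYS=1 with SPINUP_LOOPS=30.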

cenlinhe added the "help wanted" label Jul 1, 2023
NCAR locked and limited conversation to collaborators Jul 1, 2023
cenlinhe converted this issue into discussion #91 Jul 1, 2023

