
Updating boundary condition data sources #6

Open
durack1 opened this issue Jun 29, 2015 · 0 comments

durack1 commented Jun 29, 2015

There is a need to reconsider how these datasets are generated.

The aims are twofold:

  • Update the current dataset using the latest versions of HadISST and NCEP-OI2 to ensure that all known problems are resolved, particularly in the historical coverage of the data (issues have been documented on each of the pages)
  • Investigate the latest data products (particularly more modern satellite SST datasets) to generate boundary conditions at higher spatial (and, for a subset in time, e.g. 1980–present, temporal) resolution

Some discussions have already started with other MIP contributors, and the dialogue is included below - tagging @taylor13 and @gleckler1 for reference:

Subject:    RE: PRIMAVERA / HighResMIP boundary conditions
Date:   Fri, 24 Apr 2015 08:33:24 +0000
From:   Roberts, Malcolm
To: Karl Taylor
CC: Gleckler, Peter, Rein Haarsma

Dear Karl, Peter (cc Rein),
Thanks for your email. I could not agree more that we do not want to duplicate our efforts, especially since we (HighResMIP/PRIMAVERA) have a very tight deadline to have the protocol set up, ideally for testing later this summer, and for real at the latest by the end of the year.

Yes, certainly we have used the PCMDI boundary condition extensively in the past and know that it is the standard for CMIP AMIP-style runs. However, I think we have several extra demands for the forcing dataset for HighResMIP:

1. We want high resolution as standard (that is, we want to provide the high resolution data to groups, and allow groups to interpolate to lower resolutions as necessary). From personal experience, interpolating the 1 degree dataset to a higher resolution (as well as of course not being able to add detail) needs to be done in a careful way so as not to introduce anomalous features (specifically in the gradients of the field). 

2. Additionally, SST gradients such as those around boundary currents become very important as model resolution increases and the model is able to "feel" such gradients.

3. As you say, we want to produce a smooth dataset into the future to 2050. This is a new and interesting challenge that we're only beginning to get our heads around, but we have some ideas for how to do it.

4. We would like daily data, since we expect that, as resolution increases, the air-sea interaction at shorter timescales may well become more important. Of course we can only have "simulated" daily variability in the dataset.

So given these requirements, I talked with Nick Rayner and John Kennedy here at the Met Office. Specifically this is what John said about the process for HadISST:
"
The conversion from 1 degree to 0.25 degree and from monthly to daily happens when we still have anomalies. The interpolated anomalies are then added to the high resolution climatology.

We didn’t use linear interpolation of the monthly data because it introduces a monthly cycle in the variance of the anomalies. The mid month points have higher variance than say the first and last days of the month. It’s easy to see why because the first day of the month will be an average of the two mid-month points either side of it, so it will have a lower variance. We used a cubic interpolation instead which can be tuned to give a more consistent variance, though it still isn’t perfect.
"

From this I understand that I can attempt to use the anomaly field to construct the full 1950-2050 dataset (by stitching an earlier period onto the end of the present-day period, matching up phases of global modes as far as possible), and then we can use the HadISST machinery to construct the daily, 1/4-degree dataset for the whole period.

We have a couple of ideas for the sea-ice forcing too for the future period, but that is a little less developed.

I think the good part of this is that we will be able to have a comparable overlap period. The idea is that the HighResMIP low resolution simulation will use basically the same model as used in DECK simulations (though with the different HighResMIP forcing datasets, configuration, etc), so that we will have a common period (1978-2008 I guess) to compare our simulations to DECK, potentially learning about the impact of our protocol too.

I hope that this makes sense, and I hope we can keep in touch as we develop our datasets.

Thanks,
Malcolm & Rein

Malcolm Roberts    Manager, High resolution global climate modelling
Met Office Hadley Centre
FitzRoy Rd, Exeter, Devon EX1 3PB, UK
Tel: +44 1392 884537     Fax: +44 1392 885681
email: malcolm.roberts http://www.metoffice.gov.uk/research/people/malcolm-roberts

Further ideas/comments can be captured in this issue.

@durack1 durack1 added this to the 2.0.0 milestone Apr 19, 2017
@durack1 durack1 changed the title Updating boundary condition data sources and method Updating boundary condition data sources May 2, 2019