[META] Comparison of NWP Providers #31
https://research.asrc.albany.edu/people/faculty/perez/2013/forecast-se.pdf compares a GFS-based WRF model to ECMWF, with ECMWF being the winner for single forecasts, but the best results come from averaging the outputs of both models.
https://journals.ametsoc.org/view/journals/bams/103/9/BAMS-D-21-0234.1.xml This compares quite a few operational forecasts, all initialized with the ECMWF data to reduce differences from that source. Overall, the paper says ICON and IFS are the most similar forecasts globally, as they share a lot of the same parameterization schemes. So far:
Average root-mean-square difference between all pairs of models for 3-day forecasts, over the whole globe.
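For reference, the metric behind that figure can be sketched in a few lines. This is a minimal toy example (the `icon`/`ifs` arrays are placeholders, not real data, and the paper's latitude area-weighting over the globe is omitted):

```python
import numpy as np

def rmsd(field_a: np.ndarray, field_b: np.ndarray) -> float:
    """Root-mean-square difference between two gridded forecast fields."""
    return float(np.sqrt(np.mean((field_a - field_b) ** 2)))

# Toy example: two 2D "forecast" grids differing by a constant offset of 2.
icon = np.zeros((10, 10))
ifs = np.full((10, 10), 2.0)
print(rmsd(icon, ifs))  # → 2.0
```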
https://www.mdpi.com/2073-4433/10/9/503/htm This paper looks at a very high resolution ICON model over Western Africa and compares it to IFS and ICON-global, especially on rainfall and wind speed. Most of the comparison is between the hi-res model and IFS, but they have some comparison of ICON-global and IFS; primarily, panels b, d, and f compare ICON-global and IFS RMSD for the wind fields.
https://mdpi-res.com/d_attachment/remotesensing/remotesensing-12-03672/article_deploy/remotesensing-12-03672-v2.pdf?version=1604985994 covers blending ICON and IFS for solar forecasting, and weighting them. The best result comes from weighting them along with an optical flow model working on satellite cloud imagery. This is for solar forecasts over central Europe in the 1-5 hour range.
Just to keep these comparisons in one place, adding some more papers here:
- Focused on fog forecasting: a comparison of various NWP models on variables related to fog forecasting.
- This one is on rainfall in southern West Africa. The model with the best overall agreement with observations is ECMWF IFS, although ICON also gives similarly good results; both outperform the UKMO and COSMO models. It also has a section on low-cloud biases in various models over the area of interest: IFS showed realistic cloud patterns but underestimated cloud cover by 17%. The best overall bias came from ICON and COSMO, with slight overestimations of 2-3%, although ICON showed a large west-east gradient in cloudiness compared to observations and overestimated clouds along the coast.
- It also looks at radiation forecasts. IFS had a clear imprint of the low-level cloud distribution, but with too little cloud coverage and too low cloud optical thickness: an area average of 201 W m^-2 and a bias of 51 W m^-2. ICON had an unrealistic east-west gradient and a bias of 43 W m^-2, pointing to problems with clouds at other levels or with cloud optical thickness in ICON. UKMO had the largest overestimation, with 213.7 W m^-2 in the area average, and very little structure.
- Temperature forecasts for various models.
- Comparison with station observations for rainfall.
@dantravers @peterdudfield @JackKelly Here are some initial looks at papers that compare ICON and ECMWF in some way. The last one does so for 1-5 hour solar forecasts, while the others are broader. ICON seems to be one of the forecasts most similar to IFS, but does seem to do worse for solar forecasts than IFS. Another paper that included some radiation comparison in southern West Africa also gives IFS the better radiation forecast, with ICON doing worse than IFS but better than the UKMO and COSMO models.
Great, thanks @jacobbieker - that's really useful.
Perfect - thank you Jacob!
Thanks, I've asked @devsjc to add some cost analysis in here too.
**ECMWF**

**Archive data**
It seems the archive data itself is accessible for research purposes for free; following the order through, it notes an ordinary subscription cost of $3000 per year but gives a tick-box option for a research license. This research license cannot be used to make a product, and isn't a given. If we get it, the cost is $0; if we don't, $3000/year.

**Real-time data**
There is a subset of real-time data that is openly available, released with an hour's delay, which covers some parameters at a 0.4 degree resolution: https://www.ecmwf.int/en/forecasts/datasets/open-data. It does not include any radiation or cloud parameters. Creating an order of our usual set of 10 parameters covering the EU would cost $28000/year (not including handling costs or discounts).

**Other notes**
Also worth noting a recent change to the storage location and compression of some of ECMWF's datasets. We might be able to get a reduced fee for small businesses (<10 headcount), which halves the price.
Thanks for looking into it! For this:
It's good to note that these parameters don't include any radiation or cloud parameters.
One plus point for
Should we try to fill in something like this?
I think it only covers the ECMWF open data though, from looking at it, although we could probably extend it easily enough.
From my understanding then:
Great. Thanks all!
Answers to questions raised in the NWP meeting:
- Is UK-only cheaper than full EU?
- How much would India cost?
- How much for a single site, e.g. Rajasthan?
- How much to include model output statistics - ensemble ones?

See below for screenshots.
**Realtime Global/European cut-out costs**
Cost of EU + ensemble parameters for hcc, lcc, mcc, dswrf, dlwrf: $28000 + fees (the same as the EU without ensemble parameters). Interestingly, the cost for ~300 GB per year is the same as the cost for ~30 TB per year!
Great work @devsjc!
That is interesting! But 30 TB per year sounds like the full ensemble to me (i.e. downloading all 100 IFS ensemble members: 300 GB per member per year x 100 members = 30 TB). Instead, I'm 80% sure that ECMWF pre-computes some basic summary stats across all 100 ensemble members (mean & std, perhaps) and allows users to "just" download the mean & std (instead of downloading every one of the 100 ensemble members). So hopefully we'd end up "just" having to handle 600 GB per year (because, for each NWP variable, we'd have a mean and a std). |
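Jack's back-of-envelope arithmetic, plus how the mean and spread would be computed locally if we did download all members, can be sketched as follows (the 300 GB/member/year figure and the 100-member count are the assumptions from the comment above, and the ensemble array here is random toy data):

```python
import numpy as np

# Back-of-envelope data volumes (assumption: ~300 GB per member per year).
gb_per_member_per_year = 300
n_members = 100
full_ensemble_gb = gb_per_member_per_year * n_members  # 30000 GB ≈ 30 TB
mean_and_std_gb = gb_per_member_per_year * 2           # mean + std ≈ 600 GB

# If we did pull all members, the summary stats reduce over the member axis.
# Axis 0 is the ensemble member; remaining axes stand in for (lat, lon) etc.
ensemble = np.random.default_rng(42).normal(size=(n_members, 4, 4))
ens_mean = ensemble.mean(axis=0)    # ensemble mean, one field
ens_spread = ensemble.std(axis=0)   # ensemble spread, one field
print(full_ensemble_gb, mean_and_std_gb)  # → 30000 600
```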
Gotcha @JackKelly - so which of these options would you pick when trying to access those? This is what I get to choose from. |
The short answer is: I don't see the "ensemble mean and spread" in that list. It's possible that ECMWF don't provide summary stats (which would be odd). If ECMWF do provide them, then we'd be looking for the ensemble mean and spread for the atmospheric ENS (i.e. the mean and spread across all 100 ENS members). Here are some detailed docs from ECMWF, which strongly suggest that ECMWF do compute the "ENS mean and spread": https://confluence.ecmwf.int/display/FUG/Section+8.1.2+ENS+Mean+and+Spread We definitely don't want SEAS.
Actually, this ECMWF page is probably the right place to start. It gives an overview of all of the "basic ensemble products". And gives some good, concise advice. For example, on the topic of "ensemble means and spread" it says:
@dantravers @jacobbieker & @dfulu it might be worth quickly skim-reading ECMWF's overview of Basic ensemble products so we can make the most informed decision about which ensemble products we want to consume (if any!) |
This issue collates papers and other resources on different NWP forecasts, ideally related to solar forecasting, and primarily with the goal of comparing ECMWF and ICON forecasts. I'll also add other relevant papers as I go through them that might help in deciding which NWP models to use.
Context
We are trying to determine which NWP to use next, either as an additional one to MetOffice, or for areas that MetOffice UKV does not cover. The two primary contenders are ECMWF, as it's generally regarded as the best forecast system, and ICON, which has the advantages of being free, having an archive we already hold back to 2020, and having higher spatial and temporal resolution than GFS.