World Heritage Analyses

The World Heritage Analyses is a digital lab (hereafter referred to as the Lab) created by the IUCN World Heritage Programme to harness the power of the web to effectively deliver data, information and knowledge on natural World Heritage sites.

Index of prototypes


We care deeply about the well-being of our rich natural heritage, which is under increasing and accelerating threat. More rapid, accessible and easily digestible digital information, in particular through GIS and remote sensing, is urgently needed to broaden our understanding and to inform better decision making.

Digital products are increasingly becoming the de facto means through which the public accesses information. Rather than producing papers and publications that are difficult to understand and time consuming to read, we want to connect better with our audience by improving the user experience.

With limited resources, we need to keep things simple and focus on the most important part, one thing at a time. We try our best to make one thing work, and hopefully to make it work well, without investing much time. This is only achievable because we stand on the shoulders of giants and benefit from those who have already done the hard work. For example, our near real-time satellite images for natural sites rely on esri's Landsat Image Service, and the analyses of forest loss and human footprint are derivative works based on the Hansen forest loss data and the human footprint data produced by the University of Queensland and the Wildlife Conservation Society.

We believe in the principle of open data and an open source approach, so everything we do can be easily reproduced, scrutinised and extended with absolute freedom. This is the future, and we owe much to those who inspire us to head this way.

More specifically the Lab aims to be:

  • A central hub where all proof-of-concept ideas and work-in-progress projects, small and big alike, are tried, tested and presented.

  • An entry point to access data, analyses, findings and prototype tools.

  • A mechanism to gather feedback and engage users, whom we rely on to guide us in the right direction so that we deliver data, information and knowledge that are useful and fit for purpose.

The Lab itself

This page also hosts the source code behind the World Heritage Analyses itself, with a design inspired by the Start Bootstrap theme Stylish Portfolio.


Here we present a collection of studies, analyses and prototype tools that have been, or are being, developed using affordable technology, existing scientific research and data products to provide perspectives for natural World Heritage sites. By sharing these works in progress through the web as soon as possible, we hope to listen to you, accommodate your thoughts, mobilise resources and ensure effort is only spent on things that are useful.

1. Land cover change


The first comprehensive land cover mapping exercise for all natural World Heritage sites and their change over time. It is also the first time the web was used to deliver the findings instead of a text-heavy report. The result has the potential to identify threats manifested as changes in land cover.


Accurate global land cover mapping remains a well-known challenge, let alone comparable land cover change over time. Notable global land cover data products include the Global Land Cover for the year 2000 (GLC2000) at 1 km spatial resolution and the European Space Agency's (ESA) GlobCover 2009 at 300 m resolution. More recently, the GlobeLand30 land cover datasets have made it possible to compare natural World Heritage sites at the site level globally, between 2000 and 2010.

We analysed each land cover pixel in 2000 and 2010 for each natural World Heritage site and tracked its transition: for example, a pixel identified as forest in 2000 may have been classified as grassland in 2010. By aggregating the number of these pixels and the area they represent, we were able to derive not only the total area of each land cover class, but also an estimate of the conversion between land cover classes.
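The pixel-pair counting described above can be sketched with numpy; the class codes and tiny arrays below are illustrative stand-ins, not the actual GlobeLand30 data, which would be read from the 2000 and 2010 rasters with GDAL and masked to one site boundary:

```python
import numpy as np

# Hypothetical class codes for two GlobeLand30-style rasters,
# flattened to 1-D and restricted to a single site.
lc2000 = np.array([10, 20, 20, 30, 30, 30, 40, 40, 50])
lc2010 = np.array([10, 20, 30, 30, 30, 40, 40, 50, 50])

classes = np.union1d(lc2000, lc2010)
index = {code: i for i, code in enumerate(classes)}
n = len(classes)

# Encode each (from, to) pixel pair as a single integer, then count.
pairs = np.array([index[a] for a in lc2000]) * n + np.array([index[b] for b in lc2010])
matrix = np.bincount(pairs, minlength=n * n).reshape(n, n)

# matrix[i, j] = number of pixels that moved from classes[i] to classes[j].
```

Multiplying the counts by the pixel area (30 m x 30 m for GlobeLand30) turns the matrix into areas per transition, which is the kind of table a Sankey diagram can then visualise.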

As with any result derived from a global analysis, despite its consistency, the finding may not capture every detailed difference. The result may also reflect the varied quality and accuracy of the underlying land cover data, which depend on the characteristics of the samples and the classification methodology; these are beyond the control of our work here. One further caveat is that although it is possible to quantify the extent and change of land cover, the reason behind the observed change requires further interrogation, ideally with the support of auxiliary information or expert systems. For example, the change from forest to grassland may be anthropogenic, such as a result of clearing for grazing, or it could be due to fire, a natural disturbance in ecological succession. While it is not advisable to use the findings of this study to directly inform monitoring per se, they could be used as an independent source of information to identify areas that may merit further investigation, as part of an early warning system.

  • Result: GlobeLand30 data 2000-2010, for all natural sites up to 2015; a ten-class change matrix (Sankey diagram visualisation) on the web portal to easily visualise the trend of change.
  • Issues: results not validated, with unquantified quality and accuracy across the world.
  • Tech stack: Python (arcpy and numpy), GDAL, Web2py, Bootstrap, jQuery, D3js, D3js-Sankey plugin, Leaflet. Data courtesy of the National Geomatics Centre of China.

Land cover change website | Source on Github

2. Forest Loss and Human Footprint


The first exercise in amplifying the impact of findings from peer-reviewed journal papers by complementing them with web based portals for site specific information. We hope that by doing so we overcome the barrier of inaccessible science (be it too difficult to understand or simply unwieldy in real life use) and put useful information directly in the hands of those who may actually use it. We also attempted for the first time a comment system as a way of channelling user feedback and encouraging discussion and debate, to further enhance our own understanding, for example of the reasons for the observed changes in forest and human footprint.


Accompanying the forest loss and human footprint papers on natural World Heritage, these portals hold site specific information that can be of use to practitioners on the ground, as well as a means to verify results and gain site level insight beyond a global data driven approach. The site specific information is not published in the papers, in which only global statistics are given.

  • Result: derived work from the Hansen forest loss data and the human footprint data produced by the University of Queensland and the Wildlife Conservation Society; quantitative breakdowns with maps to track change over time.
  • Issues: only forest loss, not deforestation; no distinction between primary forest and plantation; change may be a result of natural processes (for example, habitat succession); and it is not easy to interpret the result even with contextual information and on-the-ground knowledge.
  • Tech stack: Pelican, Bootstrap, Disqus comments.

Forest Loss | Source on Github

Human Footprint Change | Source on Github

3. Climate change and species vulnerability


The first data analytics exercise involving version control and open, accessible, reproducible methods. It represents a brand new way of analysing data, delivering results and communicating knowledge products throughout the life cycle, from inception to deployment, embodying a good example of the digital-by-default approach. The result can be replicated step by step (see the links below for detail), provided the same data is used. This open approach enables interested users both to critique the methodology, as it is completely transparent in every detail, and to improve the analysis through further collaborative work.

The extensive use of charts and graphs in place of narrative explanations makes it easier for users to access the findings.


This work is based on a global study of species climate change vulnerability (Foden 2012; see the link below).

Species vulnerability results were transferred to sites based on the spatial relationship between Red List species ranges and natural World Heritage boundaries, using a sensitivity-tested overlap threshold. As a well-known caveat, the Extent of Occurrence (EOO) range polygons in the IUCN Red List database are poor estimates of true species distributions, with unaccounted omission and commission errors; nevertheless, they remain the only globally consistent data from comprehensive assessments.

By summing the number of climate change vulnerable species by their Sensitivity, Low Adaptability and Exposure scores, we are able to obtain a climate vulnerability assessment, from the point of view of species, for each natural World Heritage site. Since the scores are relative within each comprehensively assessed taxon, we conducted the analysis by treating amphibians, birds and warm water reef building corals separately. Furthermore, the aggregate statistics uncover trait and exposure details, and such information may be most relevant and useful to specific climate change adaptation monitoring requirements and mitigation policies. As a result, users are able to derive information such as whether the prevalent cases of low adaptability are due to long generation length for a number of bird species.
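The per-site, per-taxon aggregation can be sketched in Python; the records, field layout and flag values below are hypothetical stand-ins, not the actual Red List or Foden 2012 data:

```python
from collections import Counter

# Hypothetical records: (site, taxon, species, sensitive, low_adaptability, exposed).
records = [
    ("Site A", "birds", "sp1", True, True, True),
    ("Site A", "birds", "sp2", False, True, True),
    ("Site A", "corals", "sp3", True, False, True),
    ("Site B", "amphibians", "sp4", True, True, False),
]

# A species counts as climate change vulnerable when it is sensitive,
# poorly adaptable AND exposed; scores are only comparable within a taxon,
# so counts are kept separate per (site, taxon) pair.
vulnerable = Counter(
    (site, taxon)
    for site, taxon, _, sens, low_adapt, exposed in records
    if sens and low_adapt and exposed
)
```

Keeping the taxa separate in the key mirrors the point above that amphibians, birds and corals must not be compared against each other directly.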

  • Result: derived work from Foden 2012. Entirely open, including data, methods, results and reports, all of which are available online.
  • Issues: the review process is complicated for those not familiar with a version control system such as Github.
  • Tech stack: web2py, JSON service-based for future extensibility, D3js.

Climate change vulnerability | methodology and report | Source on Github

4. Near real-time Landsat 8 imagery


This is the first time that remote sensing images (Landsat 8) are 'collated' and displayed interactively for all natural World Heritage sites in near real time.

It is also the first application built on external web services (esri's Image Services). This is a significant step forward: by doing so, we free ourselves from the laborious, difficult, expensive and non-essential tasks of tracking Landsat 8 imagery as it becomes available, downloading and processing the images, and uploading them to a web based portal for viewing. The end product can then focus on presenting Landsat 8 images, in near real time, for natural World Heritage sites; it self-updates and requires no maintenance and little computing power.


This application evolved from a pilot study implementing a custom built system that tracked an online catalogue of Landsat 8 images on Amazon for a small number of natural World Heritage sites. The system automatically downloaded new images when they became available and when they met certain quality criteria. Scheduled tasks then processed the images, including pre-processing, band combinations and pan-sharpening, and finally updated an internal database and archive for future use. The vision of using this archived imagery, for example visualising it on the web, remained a remote goal, as significant resources and infrastructure would be required.
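The tracking step of that pilot might be sketched as a simple filter over a public scene catalogue; the file layout, column names, scene IDs and path/row values below are illustrative assumptions, not the real Amazon catalogue:

```python
import csv
import io

# A tiny stand-in for the Amazon-hosted Landsat 8 scene catalogue;
# the real scene list is far larger and has more columns.
catalogue = io.StringIO(
    "entityId,acquisitionDate,cloudCover,path,row\n"
    "LC81970242024120LGN00,2024-04-29,8.1,197,24\n"
    "LC81970242024136LGN00,2024-05-15,67.3,197,24\n"
)

MAX_CLOUD = 20.0          # quality criterion: reject overly cloudy scenes
WANTED = {(197, 24)}      # hypothetical path/row pairs covering one site

# Keep only new scenes over the site that pass the cloud-cover test.
new_scenes = [
    row["entityId"]
    for row in csv.DictReader(catalogue)
    if (int(row["path"]), int(row["row"])) in WANTED
    and float(row["cloudCover"]) <= MAX_CLOUD
]
```

Scenes surviving this filter would then be queued for the scheduled download and processing tasks described above.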

As a proof of concept, time series Landsat 8 images are now displayed in juxtaposition to allow visual interpretation of change over time. This is aided by the functionality of dynamically changing band combinations on the fly. The flexibility of the image services means that any time sequence can be selected and visualised easily - a sustainable way to expand functionality without major re-engineering.
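Changing band combinations on the fly can be illustrated by composing an Image Service request; the service URL below is hypothetical, and the parameter and raster function names follow ArcGIS REST API conventions for the `exportImage` operation:

```python
from urllib.parse import urlencode

# Hypothetical Image Service endpoint; a real deployment would point at
# esri's Landsat 8 service.
service = "https://example.org/arcgis/rest/services/Landsat8/ImageServer"

# A rendering rule selects the band combination server-side, so no image
# is downloaded or reprocessed locally.
params = {
    "f": "image",
    "bbox": "12.0,41.0,13.0,42.0",   # site extent in lon/lat
    "bboxSR": 4326,
    "renderingRule": '{"rasterFunction":"ExtractBand",'
                     '"rasterFunctionArguments":{"BandIds":[4,3,2]}}',
}
url = service + "/exportImage?" + urlencode(params)
```

Swapping the band list in the rendering rule (for example to a false-colour combination) is all it takes to re-render the same scene, which is why the portal can offer dynamic band combinations without any server maintenance of its own.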

The benefits of near real time satellite images are manifold. An obvious one is the possibility of 'seeing' World Heritage sites from space, both back in time and at present; some sites remain inaccessible due to their remote location and vast size. This could help experts with significant experience on the ground to scale up their knowledge and apply it across the site. It also enables large scale analysis, previously unavailable or expensive, to generate new insights, for example in identifying possible threats such as long term land cover conversion trends, or in detecting disturbances such as fire.

It is worth noting that, despite their vast potential, satellite images do not replace the need for the ground based surveys and monitoring missions well established in statutory processes, due to limitations such as resolution and the consequent difficulty of high fidelity analysis. As with land cover change, it may eventually be possible to detect every change with pinpoint accuracy; however, that alone will not explain the reason behind the change.

  • Result: time series data for natural World Heritage sites with near real-time monitoring, in a more accessible and timely way.
  • Issue: relies on external Landsat services; currently for viewing only, with very limited analytical functions (for example, dynamic band combination).
  • Tech stack: esri web services and the esri Calcite framework.

Landsat 8 imagery for natural World Heritage | Source on Github

5. World Heritage Boundary


The natural World Heritage boundary is now a data service that everyone can download, visualise and integrate into their own work.


The natural World Heritage boundary dataset evolved from a KML file of patchy points and polygons (the very original came from the UNESCO World Heritage Centre). In 2011-2012, a major overhaul was commissioned that significantly improved data quality: more than three quarters of the boundaries were corrected, replaced, updated or completely re-mapped, in some cases reverse-engineered from large scale paper maps. Since then, IUCN has continued to invest in the dataset, by incorporating newly inscribed natural sites and by ensuring the best available boundary is included in the database. This data is also part of the World Database on Protected Areas (WDPA).

A spatially explicit boundary is a fundamental requirement for protected area evaluation and effective monitoring, especially in the context of the World Heritage Convention. The Outstanding Universal Value that underpins World Heritage status requires a well defined geographical boundary, without which no assessment could be made on value justification, protection and management, or integrity requirements. A comprehensive and accurate database of these boundaries allows scientifically robust scrutiny of the distribution of natural sites and of their comparative representation and irreplaceability, which is crucially important for the credibility of the Convention. Unfortunately the Convention does not require the submission of GIS data as part of a nomination, even though the technology has been widely adopted and used.

In terms of monitoring and policy guidance, most large extractive companies have pledged that no active operation or exploration should take place within natural World Heritage sites. Having a GIS boundary facilitates that understanding and leaves no room for ambiguity. World Heritage boundaries have supported many scientific and research initiatives, including the World Heritage Outlook, Thematic Studies, and World Heritage and benefits, to name a few.
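To illustrate how an unambiguous GIS boundary supports such checks, here is a minimal even-odd point-in-polygon test; the boundary coordinates are a hypothetical simplification, and real World Heritage boundaries are complex multipolygons best handled with a proper GIS library:

```python
def point_in_polygon(x, y, polygon):
    """Even-odd ray casting: cast a ray to the right of (x, y) and count
    how many polygon edges it crosses; an odd count means inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x coordinate where this edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical rectangular site boundary in degrees (lon, lat).
boundary = [(12.0, 41.0), (13.0, 41.0), (13.0, 42.0), (12.0, 42.0)]
```

A concession or operation coordinate can then be screened against the boundary, leaving no room for the ambiguity a paper map would introduce.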

Please see below for various links to the boundary (no commercial use and no re-distribution).

Natural World Heritage Viewer | esri Feature Service | REST end point

6. Global spatial comparative analysis


The first attempt to replicate a well established analysis online, with the aim of enabling those who wish to undertake the analysis themselves. This web based prototype application is built on open source GIS.


The spatial comparative analysis online prototype is a proof-of-concept web application that provides a first screening of comparable sites and identifies broad scale gaps, according to datasets of widely agreed global biogeographical classifications and biodiversity conservation priorities. The tool enables any interested party to carry out an initial evaluation of their proposed biodiversity site on their own. It empowers non-specialists to understand the logical comparison framework and gives them the tools to test their own draft nominations; we hope that by doing so it reduces the risk of unsuitable sites being submitted and saves valuable resources. At the moment, this proof-of-concept product only allows comparison against samples of global biogeography, broad conservation priorities, and important sites of great biodiversity value.

  • Result: a prototype proof-of-concept application that replicates the desktop version of the global comparative analysis for biodiversity nominations at UNEP-WCMC.
  • Issue: only three example datasets are included, and no proper spatial analysis is undertaken.
  • Tech stack: Postgres, PostGIS, Leaflet, Bootstrap.

Comparative analysis | Source on Github

7. World Heritage information sheet


This initiative represents a major overhaul to modernise the World Heritage information sheets by making them easily accessible online. It uses Markdown, a web standard format, converted from the original Word documents.


The World Heritage information sheets have fallen out of favour, yet their content is a surviving snapshot of each site at the time of inscription and/or latest modification (such as a major boundary change or changes introduced by re-nomination). Despite years of discussion about scrapping them for good, significant use cases have emerged, with strong support for making them more useful.

Their value is obvious and widely acknowledged. They provide a concise, consistent record of relevant synthesised information since the birth of the Convention and are used in a variety of ways, for example in the global comparative analysis, as quick references by interested parties, and even in World Heritage Outlook assessments. Despite this, these datasheets have been in an awkward position as alternative sources of information become increasingly available in digital form via the UNESCO World Heritage Centre website. Setting aside the contentious topic of updates and maintenance, which demands a proper debate on usefulness and subsequently funding, the focus should be on how such records can be made easily accessible. The web, in particular via mobile devices, is the most popular channel.
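A minimal sketch of the kind of `re`-based clean-up that might follow the Pandoc conversion from Word; the sample input and the clean-up rules are assumptions for illustration, not the actual pipeline:

```python
import re

# Stand-in for one datasheet converted from .docx: Pandoc-style empty
# span anchors (from Word bookmarks), excess blank lines, trailing spaces.
converted = (
    "# Site name []{#_Toc123 .anchor}\n\n\n\n"
    "Some **bold** description text.\n"
)

text = re.sub(r"\[\]\{[^}]*\}", "", converted)    # drop empty span anchors
text = re.sub(r"\n{3,}", "\n\n", text)            # collapse runs of blank lines
text = re.sub(r"[ \t]+$", "", text, flags=re.M)   # trim trailing whitespace
```

Cleaned Markdown of this kind is then ready for a static site generator such as Pelican to render.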

  • Result: a web based interface for viewing information sheets.
  • Issue: awaiting user feedback and deployment.
  • Tech stack: Pelican, Bootstrap, Pandoc (for conversion), Python re library.

Information sheet | Source on Github

8. Global surface water transition on Google Earth Engine


A test of analysing surface water transitions using the power of Google's geospatial computing platform, Google Earth Engine. All calculations are done only at the time of request (when you visit the site).


The power of Google Earth Engine has enabled several high impact papers, including global forest loss and global surface water, highlighting its potential use in the conservation community at the planetary scale.

This offers exciting opportunities for natural World Heritage. Although our scope is global, with limited resources we are unable to undertake primary research, i.e. research based on original raw data (for example, satellite imagery), for each and every site. What we have done so far is to make use of analyses carried out by research institutes and offer a picture specific to World Heritage sites. The main obstacle is the global nature of these sites: it is possible to do case studies for some, but applying the same methodology across vastly different geographies across the globe would quickly outrun our capacity to deliver.

This is where Google Earth Engine shines - the ability to scale up without compromise and with considerably fewer resources.

In our test analysis, we looked at the transition of surface water, one of the many datasets of the Global Surface Water product, for every natural World Heritage site, documenting changes in water state between the first and last year of observation.
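The per-site summary can be sketched as a simple transition count; the pixel states below are made-up stand-ins for the JRC Global Surface Water transition band, which the real analysis reads on Google Earth Engine:

```python
from collections import Counter

# Hypothetical per-pixel water states inside one site for the first and
# last observation years (0 = not water, 1 = seasonal, 2 = permanent).
first = [0, 0, 1, 2, 2, 2]
last  = [2, 0, 2, 2, 1, 0]

LABELS = {0: "not water", 1: "seasonal", 2: "permanent"}

# Count how many pixels moved between each pair of water states.
transitions = Counter(
    (LABELS[a], LABELS[b]) for a, b in zip(first, last)
)
```

On Earth Engine the equivalent per-site histogram is computed server-side at request time, which is what keeps the prototype free of any local processing.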

  • Result: a mock-up web based interface showing the capability of Google Earth Engine.
  • Issue: first draft.
  • Tech stack: Google Earth Engine, Google App Engine.

Global surface water | Source on Github

