FullForestNet: Deep learning for deforestation & forest degradation monitoring and land use intelligence

This project is a result of the DigitalGlobe GBDX for Sustainability Challenge. As one of the winners, 20tree.ai has been working on new methods to monitor large areas of land to:

  • Detect early signs of deforestation & forest degradation
  • Extract actionable insights for sustainable land use management

The main focus of development has been on building an agile new deep learning framework for earth observation (EO) at the pixel level that includes the following attributes (a minimal model sketch follows the list):

  • Flexible input: any satellite image with spatial resolution between 0.30m and 10m and with 3 or more bands can be used.
  • Accuracy over speed: for most EO applications real-time processing is not of the essence, and higher accuracy takes priority.
  • Include temporal aspect: ability to monitor areas over time (with frequent updates) to detect patterns and increase accuracy.
  • Spatial layer: when monitoring natural resources, different zoom levels can bring different insights regarding the surrounding area.
  • Scalability: ability to use the framework to monitor the Cerrado, an area of 2 million km², on a daily basis.
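As a rough, non-authoritative illustration of how these attributes (temporal input, pixel-level output, an arbitrary number of bands) can be combined, a minimal ConvLSTM-based sketch in Keras could look like the following. This is not the FullForestNet architecture; all layer choices, names and hyperparameters are placeholder assumptions.

```python
# Minimal, illustrative sketch of a recurrent pixel-level classifier for
# multi-temporal satellite imagery. This is NOT the FullForestNet model;
# layer choices and hyperparameters are placeholder assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_temporal_segmentation_model(timesteps, height, width, n_bands, n_classes):
    """Input: a time series of co-registered tiles, shape (T, H, W, bands).
    Output: a per-pixel class probability map, shape (H, W, n_classes)."""
    inputs = layers.Input(shape=(timesteps, height, width, n_bands))

    # Per-timestep feature extraction, shared across the sequence.
    x = layers.TimeDistributed(
        layers.Conv2D(32, 3, padding="same", activation="relu"))(inputs)

    # Recurrent layer that aggregates the temporal dimension.
    x = layers.ConvLSTM2D(64, 3, padding="same", return_sequences=False)(x)

    # Pixel-level classification head.
    outputs = layers.Conv2D(n_classes, 1, activation="softmax")(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model

# Example: 6 acquisitions of a 128x128 tile with 4 bands, 8 land-use classes.
model = build_temporal_segmentation_model(6, 128, 128, 4, 8)
model.summary()
```

Because the head is fully convolutional, the same network can be applied to tiles of different sizes and band counts, which loosely matches the "flexible input" attribute above.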

Using Recurrent Neural Networks to monitor forests over time

Background

The Cerrado in Brazil has been chosen for the pilot. While the Amazon has received a lot of attention, its neighbour, the Cerrado, has been overlooked by many. The pressure of expanding agriculture eats away at the Cerrado, piece by piece. The Cerrado's biodiversity is unique, it is an extremely important source of water, and large amounts of carbon are captured in the deep root systems of its trees.

Fig 1. Cerrado - source: DigitalGlobe GBDX - https://api.discover.digitalglobe.com/show?id=1050410001849800

Goal

The goal of this project is to kickstart open innovation at the intersection of EO and AI. In recent years a lot of progress has been made in applying deep learning to computer vision tasks, resulting in state-of-the-art models. These developments have also been translated to remote sensing, with open competitions on Kaggle, for example to detect deforestation, and the newly launched xView 2018 Detection Challenge. These competitions are great for machine learning practitioners to get hands-on experience with high-resolution satellite imagery. However, the results and models developed are not always open for the community to use, and the satellite imagery provided is not publicly accessible. Moreover, we think that additional insights can be extracted when labeled data at the most granular level is provided.

By publishing our work, we want to contribute to open innovation for sustainable land use in support of the United Nations' Sustainable Development Goals (SDGs). We invite developers, data scientists, machine learning engineers, and anyone else interested to build on this work and collaborate to create actionable insights. Not only for the Cerrado, but across the planet.

Data

Satellite imagery

  • Copernicus Sentinel data 2015-2018, Sentinel-2
  • DigitalGlobe GBDX
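The imagery sources above differ in resolution (Sentinel-2 at 10 m, DigitalGlobe/GBDX down to 0.30 m) and band count. As a hedged sketch of how a scene could be read and resampled onto a common grid before being fed to the network, using rasterio; the file name and the 10 m target resolution are assumptions for illustration only:

```python
# Illustrative sketch: read a GeoTIFF band stack and resample it to a target
# resolution with rasterio. File name and 10 m target resolution are assumptions.
import numpy as np
import rasterio
from rasterio.enums import Resampling

TARGET_RES = 10.0  # metres per pixel

with rasterio.open("sentinel2_tile.tif") as src:
    scale = src.res[0] / TARGET_RES          # ratio of native to target resolution
    out_height = int(src.height * scale)
    out_width = int(src.width * scale)

    # Read all bands at once, resampled on the fly to the target grid.
    stack = src.read(
        out_shape=(src.count, out_height, out_width),
        resampling=Resampling.bilinear,
    ).astype(np.float32)

# Reorder to (H, W, bands) and roughly normalise values for the network.
stack = np.transpose(stack, (1, 2, 0))
stack = stack / np.percentile(stack, 99)
print(stack.shape)
```

For area-wide processing (the Cerrado covers roughly 2 million km²), the same read call can be applied per rasterio window so that scenes are streamed tile by tile instead of being loaded whole.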

Tree level insights

Fig 2. Very high-resolution satellite or UAV imagery can be used to retrieve insights at tree level.

Labels

For the Cerrado, a lot of great work has been done by different organisations to monitor land use. One of these organisations is MapBiomas, which publishes yearly updates on land use and statistics. However, because we had access to very high-resolution satellite imagery, we wanted to use more detailed labels.

Fig 3. Extended labels for the Cerrado

By creating labels at the most granular level, the insights are made actionable. For example, in Fig. 4 we can see how the cattle (in blue) on the pasture are labeled as well. This tells us that the land is actively being used. If we monitor this over time and combine it with insights into the productivity of the land, we can learn how to get the most out of this specific piece of land. As a result, local insights can be generated on how to optimize land use between cattle and crops without clearing more forests or native vegetation. In addition, areas of forest (in olive) that are degraded are marked separately (in teal).

Fig 4. Pasture in the Cerrado with cattle on top.
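Before training, a color-coded label raster such as the one shown above needs to be converted into per-pixel class indices. The sketch below shows one way to do that; the class list and RGB values are placeholders guessed from the figure descriptions (forest in olive, degraded forest in teal, cattle in blue), not the actual palette used for the dataset.

```python
# Illustrative conversion of a color-coded label image into a class-index mask.
# Class names and RGB values are placeholder assumptions, not the actual palette.
import numpy as np
from PIL import Image

CLASS_PALETTE = {
    (128, 128, 0): 1,  # forest (olive)
    (0, 128, 128): 2,  # degraded forest (teal)
    (255, 255, 0): 3,  # pasture (placeholder yellow)
    (0, 0, 255):   4,  # cattle (blue)
    (255, 165, 0): 5,  # cropland (placeholder orange)
}  # everything else falls back to class 0 (background / unlabeled)

def labels_to_mask(path):
    rgb = np.array(Image.open(path).convert("RGB"))
    mask = np.zeros(rgb.shape[:2], dtype=np.uint8)
    for color, class_id in CLASS_PALETTE.items():
        mask[np.all(rgb == color, axis=-1)] = class_id
    return mask

mask = labels_to_mask("fullforestnet_labels.png")
print(np.unique(mask, return_counts=True))
```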

These labels give additional insights into:

  • the quality of the forest: degradation and selective logging

Fig 5. Selective logging - source: DigitalGlobe Discover API (https://api.discover.digitalglobe.com/show?id=105001000AA89900)

  • the active use of pasture

Fig 6. Cattle captured with 30cm resolution - source: DigitalGlobe GBDX

  • the method of deforestation

Fig 7. Slash and burn is one of the most commonly used methods to clear large areas of land in the Cerrado, leaving large burned spots - source: Copernicus data (2017)/ESA

  • crop type

Results

Final results will be published in the coming weeks.

Discussion & next steps

The code release is the first phase of this project and will be extended gradually over the coming period. The following additions and improvements can be expected:

  • Analysis on accuracy
  • Analysis on performance
  • Experiment with other data sources
  • Publishing trained weights

Getting started

Run the FullForestNet-example.ipynb Jupyter notebook for an example.

Requirements

pip install -r requirements.txt

License

  • The code (scripts and Jupyter notebooks) is released under the GPLv3 license for non-commercial and research purposes only. For commercial purposes, please contact the authors.
  • The trained weights will be released under Creative-Commons BY-NC-SA. For commercial purposes, please contact the authors.

https://creativecommons.org/licenses/by-nc-sa/3.0/

Acknowledgements

This work has been conducted by 20tree.ai and partners with the support of DigitalGlobe via the GBDX for Sustainability Challenge.