loleg edited this page Apr 28, 2012 · 20 revisions

This is a project created at Space Center EPFL, Lausanne, Switzerland during the International Space Apps Challenge 2012 and submitted as a solution proposal for the Planetary Data System Challenge:

Develop a tool for citizen scientists, educators, and students to access NASA’s Planetary Data System data sets, which are available at http://pds.nasa.gov/

A presentation by Dr. Anton Ivanov in Lausanne provided our team with the basic background (page 7) on PDS and how to access the HiRISE Mars imaging project (also see: Wikipedia article) through it. We set a goal to create a mobile web app that allows users on modern smartphones and tablets to browse the very high resolution map images of Mars with a Web Standards-based platform on top of which it would be possible to develop creative educational/citizen science applications.

Demo showing HiRISE Mars terrain overlaid with satellite imagery of the EPFL: http://spacecenter.utou.ch/hirise/

This is an open source project published under the MIT License.



  • Dr. Anton Ivanov, EPFL
  • Dr. Prasenjit Saha, University of Zurich
  • Dr. Franziska Oeschger, Oxford University


  1. Get the list of all images from http://hirise-pds.lpl.arizona.edu/PDS/RDR
  2. Process the JP2 files and programmatically extract a tileset from each image
  3. Host the tiles on the web in a mobile HTML5 application for browsing the map
  4. Extract relevant Mars metadata (geo-coordinates, features, etc.) from PDS
  5. Visualize terrestrial objects as an overlay on the map for educational purposes
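As an illustration of step 1: the RDR archive is served as plain HTTP directory listings, so the image list can be collected by scraping the index pages. The sketch below is in Python (our actual tooling differed), and the HTML snippet and observation IDs are illustrative stand-ins for a real listing:

```python
import re

# Illustrative stand-in for a fragment of an Apache directory index page
# from the RDR archive; not a real server response.
SAMPLE_LISTING = """\
<a href="PSP_001414_1780/">PSP_001414_1780/</a>
<a href="PSP_001415_1780/">PSP_001415_1780/</a>
<a href="?C=M;O=A">Last modified</a>
"""

def observation_dirs(listing_html: str) -> list[str]:
    """Extract HiRISE observation directory names (PSP_/ESP_ IDs)
    from an index page, ignoring sort links and other anchors."""
    return re.findall(r'href="((?:PSP|ESP)_\d+_\d+)/"', listing_html)

print(observation_dirs(SAMPLE_LISTING))
```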

Our initial goal was to access the open data sets: a combination of metadata labels in structured text format, and highly detailed images at various resolutions - from sub-1 MB JPEG thumbnails to 1900 MB full-size JPEG2000 maps. These maps are so large because the HiRISE camera orbiting Mars photographs the surface at a resolution of 25 cm per pixel. Large boulders are clearly visible on these maps, and the imagery has even been used to find the Mars landers and to detect broken solar panels on one of them (citation needed).
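A back-of-the-envelope calculation shows why the files get this big. The swath dimensions below are illustrative assumptions, not metadata from any particular observation:

```python
# Why a single HiRISE RDR image can approach 2 GB: at 25 cm per pixel,
# even a modest patch of surface is billions of pixels.
resolution_m = 0.25          # HiRISE ground resolution: 25 cm per pixel
swath_width_m = 6_000        # assumed swath width (~6 km, illustrative)
swath_length_m = 20_000      # assumed swath length (~20 km, illustrative)

width_px = int(swath_width_m / resolution_m)    # 24,000 px
length_px = int(swath_length_m / resolution_m)  # 80,000 px

# At one byte per pixel (8-bit grayscale), uncompressed size in MB:
size_mb = width_px * length_px / 1024 / 1024
print(f"{width_px} x {length_px} px, ~{size_mb:.0f} MB uncompressed")
```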

Working with the image data proved to be a real challenge: despite working on current MacBook Pros with access to a high-bandwidth network at the EPFL, we struggled to find a way to process the full-size images in a reasonable time frame. David's machine regularly crashed as he tried one tool after another to read the files. Sometimes the applications would hang after 15-30 minutes of processing, and we quickly exhausted the generous VPS compute instance provided to us by Evolucix. In the end we could not come up with a system that worked reliably with images larger than 300 MB. Lessons learned: have compute clusters and graphics workstations available before attempting a project like this. Also, scientific data and commercial-grade processing tools don't always make the best match.

On the mapping side, Steven and Oleg deployed a standard solution based on the OpenStreetMap community's OpenLayers libraries and tools. The challenge here was understanding how to work with space data using Geographical Information System (GIS) tools that are usually Earth-bound. We got some help from experts who tried to explain how the planetary radii and centroids should let us map between the coordinate systems, but this was well above our expertise. In the end we relied on guesstimates and experimented with the map boundaries until we got some kind of overlap - admittedly an unscientific approach. Given our concerns about the enormous amount of disk space needed to store and process the images, Oleg set up a TileCache server to distribute and manage the tile processing workload.
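To give a flavour of the estimation involved: one degree of longitude on Mars can be approximated from the planetary radius. This sketch assumes a spherical Mars with the IAU equatorial radius; a real PDS product uses a proper map projection, so this is only the kind of rough approximation we were fumbling with:

```python
import math

MARS_EQUATORIAL_RADIUS_M = 3_396_200  # IAU equatorial radius, ~3396.2 km

def metres_per_degree(latitude_deg: float = 0.0) -> float:
    """Ground distance covered by one degree of longitude at a given
    latitude, on a spherical-Mars approximation."""
    circumference = 2 * math.pi * MARS_EQUATORIAL_RADIUS_M
    return (circumference / 360.0) * math.cos(math.radians(latitude_deg))

# A 25 cm/px image 24,000 px wide spans roughly this many degrees of
# longitude at the equator:
width_deg = (24_000 * 0.25) / metres_per_degree(0.0)
print(f"{metres_per_degree(0.0):.0f} m/deg at the equator; "
      f"image width ~{width_deg:.3f} deg")
```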

To parse the metadata, Onja wrote a PHP script that made remote calls to the Planetary Data System servers, parsed the structured text, and output XML, with RDF/XML as the eventual goal. This was a more straightforward part of the project, but one we gave too low a priority. In retrospect it would have been better to start here and use programmatic (API / semantic web) access to the data on the images and objects we were working with, which would have boosted the educational aspect of our solution significantly. A good reference on how such metadata is used today is the ENVI User Guide; ENVI is image-analysis software capable of reading HiRISE data.
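For illustration, here is a minimal sketch of that metadata step in Python rather than the original PHP. PDS3 labels are structured text of the form KEY = VALUE terminated by an END line; the sample label, regex, and element names below are our own illustration, not the project's exact script:

```python
import re
import xml.etree.ElementTree as ET

# Illustrative fragment of a PDS3-style label (KEY = VALUE lines,
# terminated by END); not copied from a real product.
SAMPLE_LABEL = """\
TARGET_NAME            = MARS
INSTRUMENT_ID          = HIRISE
CENTER_LATITUDE        = -4.5895 <DEG>
END
"""

def pds_label_to_xml(label_text: str) -> str:
    """Turn flat KEY = VALUE label lines into a simple XML document."""
    root = ET.Element("pds_label")
    for line in label_text.splitlines():
        if line.strip() == "END":
            break
        match = re.match(r"\s*(\w+)\s*=\s*(.+?)\s*$", line)
        if match:
            key, value = match.groups()
            ET.SubElement(root, key.lower()).text = value
    return ET.tostring(root, encoding="unicode")

xml_out = pds_label_to_xml(SAMPLE_LABEL)
print(xml_out)
```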

The easiest part of the solution was the mobile web app itself. Oleg quickly whipped up a jQuery Mobile shell and tweaked OpenLayers to work well inside it. OpenLayers' already excellent support for taps, swipes and pinches allowed the mobile maps to look and feel as good as a native maps app.

We had lots of fun working on this Space Apps Challenge, and though the summit remains well above the clouds, we all learned a great deal. We are thankful to have had a taste of real space science - both the incredible opportunities it creates and the frustratingly difficult challenges involved in breaking new ground. We are endeavoring to complete the five goals outlined in the Process, and we wish the HiRISE team and the researchers around the world working with this exciting project the best of success in their effort to become more familiar with the surface of another planet than we are with our own.



  • Building the tile images (offline processing): a script in Scala, using the Java ImageIO library to crop and scale the tiles, and the jai-imageio extension to handle JPEG2000 files
    • still had problems with image files larger than 300 MB
    • tried ImageMagick, JasPer, jai-imageio, openmlib and several desktop tools
    • the Stack Overflow community was really helpful
  • Map visualization: OpenLayers workshop
  • Metadata extraction: regular expressions on remote HTTP requests in PHP
  • Mobile app: jQuery Mobile
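The tile-building step above can be sketched as follows, in Python rather than the project's Scala: compute the crop box of every 256x256 tile so that an image library (e.g. Pillow's Image.crop, or Java ImageIO in the original script) can process one tile at a time, keeping memory bounded even for very large sources. The tile size and function name are illustrative:

```python
TILE = 256  # standard slippy-map tile size in pixels

def tile_boxes(width: int, height: int) -> list[tuple[int, int, int, int]]:
    """(left, upper, right, lower) crop boxes covering a width x height
    image, in row-major order; edge tiles may be smaller than TILE."""
    boxes = []
    for y in range(0, height, TILE):
        for x in range(0, width, TILE):
            boxes.append((x, y, min(x + TILE, width), min(y + TILE, height)))
    return boxes

# A HiRISE-sized 24,000 x 80,000 px image yields this many tiles:
boxes = tile_boxes(24_000, 80_000)
print(len(boxes), "tiles")
```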