This repository has been archived by the owner on Nov 1, 2020. It is now read-only.

Commit

initial posts import
Matthew Perry committed Apr 24, 2012
1 parent 9c00269 commit 04f83ce
Showing 96 changed files with 4,746 additions and 411 deletions.
95 changes: 95 additions & 0 deletions _posts/2005-12-03-hello-world.markdown
---
date: '2005-12-03 14:25:41'
layout: post
slug: hello-world
status: publish
title: Processing S57 soundings
wpid: '1'
---

NOAA Electronic Navigational Charts (ENC) contain (among many other things) depth soundings that can be processed into raster bathymetry grids. The ENC files are available as a huge torrent from geotorrent.org ([http://geotorrent.org/details.php?id=58](http://geotorrent.org/details.php?id=58)).

Download this torrent and check readme.txt to find the chart of interest:



> Port Hueneme to Santa Barbara|5|2005-10-03|2005-10-03|US5CA65M


First, check out the GDAL documentation for S-57 files at [http://www.gdal.org/ogr/drv_s57.html](http://www.gdal.org/ogr/drv_s57.html).

Change to the US5CA65M directory and you'll see a .000 file (and maybe .001, .002, etc.). Run ogrinfo on the .000 file and you'll see ~61 layers, one of which ("SOUNDG") represents the soundings. Let's start by examining the soundings layer:




> ogrinfo -summary US5CA65M.000 SOUNDG

We see that there are 43 "features", but since the features are multipoints, there are actually thousands of soundings. The multipoints are 3D, so if we convert to a shapefile with ogr2ogr's default settings we lose the 3rd dimension. To solve this, we need to append "25D" to the layer type. Furthermore, the multipoint geometry confuses some applications, so we want to split it into a layer with simple 3D point geometries. Luckily there is a SPLIT_MULTIPOINT option that must be specified as an environment variable:




> export OGR_S57_OPTIONS="RETURN_PRIMITIVES=ON,RETURN_LINKAGES=ON,LNAM_REFS=ON,SPLIT_MULTIPOINT=ON,ADD_SOUNDG_DEPTH=ON"
> ogr2ogr -nlt POINT25D test3.shp US5CA65M.000 SOUNDG

Now we get ~ 3000 3D points with the depth added as an attribute for good measure.
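
To see what the SPLIT_MULTIPOINT step does conceptually, here is a rough pure-Python sketch (an illustration only, not the actual OGR internals; the field names and sample coordinates are made up): each multipoint feature is exploded into one record per vertex, with the z value also copied into a depth attribute.

```python
# Conceptual sketch of SPLIT_MULTIPOINT + ADD_SOUNDG_DEPTH: explode each
# multipoint feature into single 3D points, copying z into a depth field.
# (Illustrative only -- the field names and coordinates are made up.)
def split_multipoints(features):
    points = []
    for multipoint in features:
        for (x, y, z) in multipoint:
            points.append({"x": x, "y": y, "z": z, "depth": z})
    return points

# Two multipoint "features" containing five soundings total:
soundings = split_multipoints([
    [(-119.70, 34.40, 12.2), (-119.68, 34.39, 8.5), (-119.65, 34.38, 15.0)],
    [(-119.60, 34.35, 20.1), (-119.55, 34.33, 31.7)],
])
print(len(soundings))          # 5
print(soundings[0]["depth"])   # 12.2
```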

Now bring these into GRASS and create a raster:




> v.in.ogr -zo dsn=test3.shp output=soundg layer=test3
> v.info soundg
> g.region vect=soundg nsres=0.001 ewres=0.001
> v.surf.rst input=soundg elev=bathy layer=0
> r.info bathy






Since depths actually show up as positive elevations, we want to multiply the grid by -1:




> r.mapcalc sb_bathy=bathy*-1






And of course we want to make some nice shaded relief and contour maps for viewing with QGIS:





> r.shaded.relief map=sb_bathy shadedmap=sb_shade altitude=45 azimuth=315
> r.contour input=sb_bathy output=sb_contour step=5
> qgis &



![s57 results](http://perrygeo.net/img/s57.png)


From the screenshot, we can see pits and spikes from potential outliers, so we might want to go back and adjust the tension and smoothing on the raster creation (the v.surf.rst command).
10 changes: 10 additions & 0 deletions _posts/2005-12-03-the-new-blog.markdown
---
date: '2005-12-03 15:16:08'
layout: post
slug: the-new-blog
status: publish
title: The new blog
wpid: '2'
---

Well I finally got around to installing some real blogging software. SimplePHP Blog was just not cutting it and WordPress looks like a healthy option. So far I've been really impressed! Let me know if you have any troubles accessing it...
68 changes: 68 additions & 0 deletions _posts/2005-12-11-kml-to-shapefile-scripting.markdown
---
date: '2005-12-11 22:26:56'
layout: post
slug: kml-to-shapefile-scripting
status: publish
title: KML to Shapefile Scripting
wpid: '3'
---

Christian Spanring has been doing some great work with Google Earth's KML data format. The latest offering is a fairly robust [XSLT stylesheet for transforming KML into GML](http://spanring.name/blog/2005/12/11/kml2gml/).

In the article, he mentions ogr2ogr as a method to convert GML to shapefiles so I immediately had to try it out! I came up with a simple bash script, **kml2shp.sh**, that provides a quick command-line interface:



> kml2shp.sh input.kml output.shp


Here's the step-by-step:





1. Make sure you have xsltproc (the command-line xslt processor) and OGR installed.



2. Copy the [XSLT stylesheet](http://spanring.name/blog/wp-content/files/kml2gml.xsl) to /usr/local/share/kml2gml/



3. Create the kml2shp.sh script below (make sure to change the paths to reflect your system, chmod +x it, etc)







> #!/bin/bash
> if [ $# -ne 2 ]; then
>     echo "usage: kml2shp.sh input.kml output.shp"
>     exit 1
> fi
>
> echo "Processing KML file"
> sed 's/ xmlns=\"http\:\/\/earth.google.com\/kml\/2.0\"//' $1 > /tmp/temp.kml
> xsltproc -o /tmp/temp.gml /usr/local/share/kml2gml/kml2gml.xsl /tmp/temp.kml
>
> echo "Creating new Shapefile"
> ogr2ogr $2 /tmp/temp.gml myFeature
>
> echo "Cleaning up temp files"
> rm /tmp/temp.gml
> rm /tmp/temp.kml
>
> echo "New shapefile has been created:"
> echo $2




Now, as far as I can tell, the XSLT is fairly robust, although I've only tested it on a few datasets. The wrapper script, however, could use a lot of work. Type and error checking would be nice for starters, and a better method to remove the XML namespace might be necessary. This is really meant as a starting point.
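
For instance, instead of the sed one-liner, the namespace could be stripped in Python with a slightly more forgiving regular expression (a sketch; the function name is mine, and it only handles the default Google Earth namespace):

```python
import re

# Strip the default Google Earth KML namespace declaration, whatever the
# version number, so the XSLT can match elements without a namespace.
def strip_kml_namespace(text):
    return re.sub(r'\s+xmlns="http://earth\.google\.com/kml/[^"]*"', '', text)

print(strip_kml_namespace('<kml xmlns="http://earth.google.com/kml/2.0">'))  # <kml>
```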

One potential problem with this technique is that you will most likely get a 3D shapefile (x, y AND z coordinates). Many applications can handle 3D shapefiles but some (QGIS, others?) cannot at the present time. Once the geometry type is known, one could always specify the ogr2ogr "-nlt" parameter to force 2D output. But that's all for now... let me know if anyone has any suggestions on improving this technique.
---
date: '2005-12-11 23:53:56'
layout: post
slug: tissot-indicatrix-examining-the-distortion-of-2d-maps
status: publish
title: Tissot Indicatrix - Examining the distortion of map projections
wpid: '4'
---

The Tissot Indicatrix is a valuable tool for showing the distortions caused by map projections. It is essentially a series of imaginary polygons that represent perfect circles of equal area on a 3D globe. When projected onto a 2D map, their shape, size and/or angles will be distorted accordingly allowing you to quickly assess the projection's accuracy for a given part of the globe.

I've seen great Tissot diagrams in text books, but I wanted to create the indicatrix as a polygon dataset so that I could project and overlay it with other data in a GIS. To do this I wrote a python script using the OGR libraries, which I will revisit in a minute. But first, the visually interesting part:

Here is a world countries shapefile overlaid with the Tissot circles in geographic (unprojected lat-long) coordinates:

![Latlong tissot](/img/latlong.png)

Next I reprojected the datasets to the Mercator projection using ogr2ogr:



> ogr2ogr -t_srs "+proj=merc" countries_merc.shp countries_simpl.shp countries_simpl
> ogr2ogr -t_srs "+proj=merc" tissot_merc.shp tissot.shp tissot



Note that the angles are perfectly preserved (the trademark feature of the Mercator projection) but the size is badly distorted.

![Mercator tissot](/img/mercator.png)

Now let's try Lambert Azimuthal Equal Area (in this case the US National Atlas standard projection - EPSG code 2163).



> ogr2ogr -t_srs "epsg:2163" countries_lambert.shp countries_simpl.shp countries_simpl
> ogr2ogr -t_srs "epsg:2163" tissot_lambert.shp tissot.shp tissot



This is a great projection for preserving area, but move outside the center and shapes become badly distorted:

![LAEA tissot](/img/lambert.png)

The best way to experiment with this is to bring the tissot.shp file into ArcMap (or another program that supports on-the-fly projection) and play with it in real time. The distortions of every projection just leap off the screen...

OK, now for the geeky part. Here's the python/OGR script used to create the tissot shapefile. The basic process is to lay out a grid of points across the globe in latlong, loop through the points and reproject each one to an orthographic projection centered directly on the point, buffer it, then reproject to latlong. The end result is a latlong shapefile representing circles of equal area on a globe.





> #!/usr/bin/env python
> # Tissot Circles
> # Represent perfect circles of equal area on a globe
> # but will appear distorted in ANY 2d projection.
> # Used to show the size, shape and directional distortion
> # by Matthew T. Perry
> # 12/10/2005
>
> import os
> import ogr
> import osr
>
> output = 'tissot.shp'
>
> # Create the Shapefile
> driver = ogr.GetDriverByName('ESRI Shapefile')
> if os.path.exists(output):
>     driver.DeleteDataSource(output)
> ds = driver.CreateDataSource(output)
> layer = ds.CreateLayer(output, geom_type=ogr.wkbPolygon)
>
> # Set up spatial reference systems
> latlong = osr.SpatialReference()
> ortho = osr.SpatialReference()
> latlong.ImportFromProj4('+proj=latlong')
>
> # For each grid point, reproject to ortho centered on itself,
> # buffer by 640,000 meters, reproject back to latlong,
> # and output the latlong ellipse to shapefile
> for x in range(-165, 180, 30):
>     for y in range(-60, 90, 30):
>         f = ogr.Feature(feature_def=layer.GetLayerDefn())
>         wkt = 'POINT(%f %f)' % (x, y)
>         p = ogr.CreateGeometryFromWkt(wkt)
>         p.AssignSpatialReference(latlong)
>         proj = '+proj=ortho +lon_0=%f +lat_0=%f' % (x, y)
>         ortho.ImportFromProj4(proj)
>         p.TransformTo(ortho)
>         b = p.Buffer(640000)
>         b.AssignSpatialReference(ortho)
>         b.TransformTo(latlong)
>         f.SetGeometryDirectly(b)
>         layer.CreateFeature(f)
>         f.Destroy()
>
> ds.Destroy()
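
The ortho-and-buffer trick can also be approximated with plain spherical trigonometry: place n points at a fixed great-circle distance from the center. The sketch below assumes a spherical Earth of radius 6371 km and is my own stand-in for the script's projection-based approach, not its actual method:

```python
import math

# Sketch: n points at a fixed great-circle distance (radius_m) from
# (lon0, lat0) on a spherical Earth -- roughly what buffering in an
# orthographic projection centered on the point achieves.
def tissot_circle(lon0, lat0, radius_m, n=36, earth_r=6371000.0):
    lat0r, lon0r = math.radians(lat0), math.radians(lon0)
    d = radius_m / earth_r  # angular radius in radians
    points = []
    for i in range(n):
        bearing = 2.0 * math.pi * i / n
        lat = math.asin(math.sin(lat0r) * math.cos(d) +
                        math.cos(lat0r) * math.sin(d) * math.cos(bearing))
        lon = lon0r + math.atan2(math.sin(bearing) * math.sin(d) * math.cos(lat0r),
                                 math.cos(d) - math.sin(lat0r) * math.sin(lat))
        points.append((math.degrees(lon), math.degrees(lat)))
    return points

circle = tissot_circle(-120.0, 35.0, 640000)
print(len(circle))  # 36
```

Every vertex comes out the same great-circle distance from the center, which is exactly the property the buffered ortho circles have.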




82 changes: 82 additions & 0 deletions _posts/2006-01-20-geocoding-an-address-list-to-shapefile.markdown
---
date: '2006-01-20 18:50:39'
layout: post
slug: geocoding-an-address-list-to-shapefile
status: publish
title: Geocoding an address list to shapefile
wpid: '5'
---

Most commercial software comes with fairly elaborate geocoding engines, and there are nice geocoding services on the web that can do one-at-a-time geocoding. But a [recent post](http://www.spatiallyadjusted.com/2006/01/20/batch-geocode-tabular-address-data-via-your-web-browser/) at Spatially Adjusted pointed out a great free resource for batch geocoding named, conveniently enough, [Batch Geocode](http://www.batchgeocode.com/). Just give it a list of tab- or pipe-delimited addresses and it outputs a table with your original data plus a lat/long for every row.

I have been working on a python script to convert text files into point shapefiles and thought this would be a great chance to put it to work. The only dependency is a recent version of python with the ogr module (see [FWTools](http://fwtools.maptools.org) for an easy-to-install package for Windows or Linux).

First, I take a list of cities and feed it to batchgeocode.com (a very nice feature is that the yahoo geocoder, on which batchgeocode is based, does not _require_ street level addresses):



> City|State
> Santa Barbara|CA
> Arcata|CA
> New Milford|CT
> Blacksburg|VA


After running the geocoder, I get back a table with lat/longs:



>
> City|State|lat|long|precision
> Santa Barbara|CA|34.419769|-119.696747|city
> Arcata|CA|40.866261|-124.081673|city
> New Milford|CT|41.576599|-73.408821|city
> Blacksburg|VA|37.229359|-80.413963|city


Copy and paste that into a text file and add a second header row that defines the data type for each column. It would be possible to autodetect the column types, but there are cases where a string of numeric digits should be kept as a string (for instance, the zipcode _06776_ would become _6776_ if it was read as an integer). The possible column types are _string_, _integer_, _real_, _x_ and _y_, with x and y representing the coordinates.



>
> City|State|lat|long|precision
> string|string|y|x|string
> Santa Barbara|CA|34.419769|-119.696747|city
> Arcata|CA|40.866261|-124.081673|city
> New Milford|CT|41.576599|-73.408821|city
> Blacksburg|VA|37.229359|-80.413963|city
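
The parsing idea behind this two-header-row format can be sketched in a few lines of plain Python (my illustration, not the actual txt2shp.py script):

```python
# Parse a delimited table whose second header row gives each column's type.
# "x" and "y" columns are read as floats and mark the coordinate fields.
def parse_typed_table(lines, delim='|'):
    names = lines[0].split(delim)
    types = lines[1].split(delim)
    cast = {'string': str, 'integer': int, 'real': float, 'x': float, 'y': float}
    return [dict((name, cast[ctype](value))
                 for name, ctype, value in zip(names, types, row.split(delim)))
            for row in lines[2:]]

rows = parse_typed_table([
    "City|State|lat|long|precision",
    "string|string|y|x|string",
    "Santa Barbara|CA|34.419769|-119.696747|city",
    "Blacksburg|VA|37.229359|-80.413963|city",
])
print(rows[0]['long'])  # -119.696747 (a float, ready to use as the x coordinate)
```

Note that strings with leading zeros such as _06776_ pass through the string cast untouched.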


Now run the _txt2shp.py_ utility. The input and output parameters are self-explanatory and the d parameter defines the string used as a delimiter. Notice that the syntax follows the GRASS standard of _parameter=value_:



> txt2shp.py input=cities.txt output=cities.shp d='|'


And now you've got a shapefile of the geocoded cities!

![Cities Shapefile](http://perrygeo.net/img/cities.png)

The txt2shp.py script can be downloaded [ here](http://perrygeo.net/download/txt2shp.py). Try it out and let me know how it's working for you.

**Update:** In order to generate a .prj file for your output shapefile, you can use the epsg_tr.py utility if you know the EPSG code. Batch Geocoder returns everything in lat/long (presumably with a WGS84 datum?) so you can use EPSG code 4326:



> epsg_tr.py -wkt 4326 > cities.prj
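
If epsg_tr.py is not handy, the .prj can also be written directly. The WKT below is one common rendering of the WGS 84 geographic coordinate system (a sketch; verify it against what your software expects before relying on it):

```python
# Write a minimal .prj declaring WGS 84 geographic coordinates (EPSG:4326).
# This WKT string is one common form; tools vary slightly in what they emit.
WGS84_WKT = ('GEOGCS["WGS 84",DATUM["WGS_1984",'
             'SPHEROID["WGS 84",6378137,298.257223563]],'
             'PRIMEM["Greenwich",0],'
             'UNIT["degree",0.0174532925199433]]')

with open('cities.prj', 'w') as f:
    f.write(WGS84_WKT)
print(open('cities.prj').read().startswith('GEOGCS["WGS 84"'))  # True
```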









16 changes: 16 additions & 0 deletions _posts/2006-01-24-mexico-us-border-crossing-maps.markdown
---
date: '2006-01-24 16:42:10'
layout: post
slug: mexico-us-border-crossing-maps
status: publish
title: Mexico-US Border Crossing Maps
wpid: '25'
---

Thousands of Mexicans come to the United States every year and, besides the legal troubles of border crossing, they face a tough journey across the desert in order to reach their destination. They have very little information to go on, and many die from dehydration as they attempt to find their way through the vast deserts of the American Southwest.

A faith-based organization called [ Humane Borders](http://www.humaneborders.org/about/about_index.html) is trying to help the situation. They have produced [a number of maps](http://www.humaneborders.org/news/news4.html) documenting town locations, roads, water stations, walking distances, cell phone towers and even places where other immigrants have died along the way. This was made possible in part by GIS software donated by ESRI.

I heard today on CNN with Lou Dobbs that the Mexican government is now printing and distributing these maps to citizens. Though the maps will clearly state "Don't Do It! It's Hard! There's Not Enough Water!", critics are saying the maps aid criminals and will encourage illegal aliens to cross the border. Others have pointed out that, from an economic standpoint, this may benefit the US border patrol since so much of their budget is devoted to aiding sick and injured immigrants and properly taking care of the dead. Humane Borders is hoping to make people aware of the risks so that they can either choose not to go or be better prepared should they decide to cross.

In any case, it is an interesting example of how geographic information is still so important (and controversial) in our society.