Updated README with useful info.
cgroskopf committed Jun 16, 2011
1 parent 40b642b commit 35c0dd7
Showing 2 changed files with 67 additions and 24 deletions.
24 changes: 0 additions & 24 deletions README

This file was deleted.

67 changes: 67 additions & 0 deletions README.rst
@@ -0,0 +1,67 @@
census.ire.org
==============

A nationwide census browser for 2000 and 2010 census data.

Dependencies
============

You will need Python 2.7, the PostGIS stack, virtualenv, and virtualenvwrapper. Mac installation instructions are available at http://blog.apps.chicagotribune.com/2010/02/17/quick-install-pythonpostgis-geo-stack-on-snow-leopard/.
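
Assuming Python 2.7 and pip are already installed, the virtualenv tooling can be pulled in with pip (a minimal sketch; the PostGIS stack itself is covered by the linked guide)::

    pip install virtualenv virtualenvwrapper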

Other required software:

* mongodb
* wget
* mdbtools

On a Mac you can install these with Homebrew::

    brew install mongodb
    brew install wget
    brew install mdbtools

Bootstrapping the webapp
========================

To get the web application running::

    cd censusweb
    mkvirtualenv --no-site-packages censusweb
    pip install -r requirements.txt
    cd ../censusweb
    ./manage.py runserver
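
If everything worked, the site will be available at Django's default development address, http://127.0.0.1:8000/.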

Configuring the webapp
======================

By default the webapp will use the data published to the IRE test site, which may not be accessible to you. To use your own data, open censusweb/config/settings.py and modify the following line::

    API_URL = 'http://s3.amazonaws.com/census-test'
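
For example, if you were publishing data to a hypothetical bucket named my-census-data, the line would become::

    API_URL = 'http://s3.amazonaws.com/my-census-data'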

See the next section to learn how to deploy data to your custom S3 bucket.

Loading data
============

Once you've set up the webapp you will have the requirements needed to load data. If you want to load embargoed data, you will need to define environment variables for your username and password::

    export CENSUS_USER=cgroskopf@tribune.com
    export CENSUS_PASS=NotMyRealPassword

You will also need to define your Amazon Web Services credentials so that you can upload the rendered data files to S3::

    export AWS_ACCESS_KEY_ID="foo"
    export AWS_SECRET_ACCESS_KEY="bar"

The load configuration must also point at the same S3 bucket you configured for the webapp. Open dataprocessing/config.py and modify the following lines::

    S3_BUCKETS = {
        'staging': 'census-test',
        'production': 'censusdata.ire.org',
    }
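
For example, to publish staging data to the same hypothetical my-census-data bucket used above::

    S3_BUCKETS = {
        'staging': 'my-census-data',
        'production': 'censusdata.ire.org',
    }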

To load SF1 data for Hawaii, make sure Mongo is running, then execute the following commands::

    cd dataprocessing
    ./batch_sf.sh Hawaii staging
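
The first argument names the state to load; the second selects which S3_BUCKETS entry from the configuration above ('staging' or 'production') to publish to.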
