Aggregates accident reports, planned construction, and lane closures into one feed.

Tulsa Road Issues Feed

The Tulsa Road Issues Feed (TRIF) aggregates road information from several sources into a single feed. This can be used by media outlets to report issues to their audience, or as the raw data for interactive web projects.

This project is part of Hackathon 2011. See our GitHub project for up-to-date information.

The code is up and running. Developers might be interested in the about page, which has some background info and describes the API for getting traffic data.

Get the code

Grab the source from GitHub

git clone git://

Install the Packages

Create a virtualenv and install required libraries

cd tulsa-road-issues-feed
python ./
python ./ .
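The bootstrap script names are truncated in this copy, but the step above amounts to the usual create-a-virtualenv-in-the-checkout flow. A minimal sketch using the stock venv module (the tool choice and the dependency list are assumptions for illustration; the real bootstrap script and requirements are in the repo):

```shell
# Hedged sketch of the bootstrap step, run from the checkout root.
# Creating the environment in the checkout itself is what makes the
# later "source bin/activate" command work.
python3 -m venv .        # modern stand-in for the 2011 virtualenv bootstrap
source bin/activate      # same activation command used in the steps below

# Then install the project's libraries; the exact list is in the repo.
# For a Django project of this era that would be roughly:
#   pip install Django South
```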

Set up local environment

Copy the distributed local settings

cp trif/ trif/

Activate the virtualenv

source bin/activate

Initialize the (SQLite) database

python trif/ syncdb
python trif/ migrate

Run it

Activate the virtualenv

source bin/activate

Fetch feed data

python trif/ fetch_feeds

Run the django dev server

python trif/ runserver


The code on the server runs from /home/trif/trif. Developers in the trif group can pull changes from the GitHub repo. Supervisor is used to run the project, with nginx as the HTTP server.

To fetch the feeds, a crontab task runs a script that invokes fetch_feeds (see scripts/ for the script). The script creates /tmp/trif_update_feeds containing the current date, and deletes it on completion. This file acts as a lock: if it already exists, a fetch is assumed to be in progress, and a second one is not started when something has gone wrong (usually a problem on the City of Tulsa sites). If something does go wrong, you may need to log in, delete the file manually, and kill the stuck processes. The contents of the file (and of /tmp/last_feed_run.txt) will tell you how long the process was stuck.
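The lock-file behavior described above can be sketched as a small wrapper script. This is illustrative only; the actual script lives in scripts/ in the repo, and everything here beyond the two /tmp paths named in the text is an assumption:

```shell
#!/bin/sh
# Sketch of the cron wrapper described above (not the repo's real script).
LOCKFILE=/tmp/trif_update_feeds

if [ -e "$LOCKFILE" ]; then
    # A previous run is still in flight (or stuck): refuse to start a
    # second fetch. An operator can delete the lock file to recover.
    echo "fetch_feeds appears to be running since: $(cat "$LOCKFILE")" >&2
    exit 1
fi

date > "$LOCKFILE"              # take the lock, recording the start time
date > /tmp/last_feed_run.txt   # record the most recent attempt
# ... run the fetch_feeds management command here ...
rm -f "$LOCKFILE"               # release the lock on completion
```

Comparing the date inside a stale lock file against /tmp/last_feed_run.txt is what tells you how long a run has been stuck.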
