Software to demonstrate collecting bike data from Metro buses


Low cost sensing of bike rack use on buses

King County Metro and Sound Transit buses carry bicycle racks on the front of each vehicle; each rack holds two or three bikes. At some stops the racks are so popular that they fill up and cyclists must wait for the next bus, hoping it has space.

This project solves two problems. First, the agencies have no quantitative data on how often, and how heavily, the bike racks are used. Second, citizens who want to use the racks benefit from advance warning when the rack on their soon-to-arrive bus is full. With that information, commuters can make better-informed decisions, such as going to another stop, waiting and getting coffee, or riding a different route.

To sense bike rack use, one could deploy sensors on every bus and relay the data back to a central system, but that would be very costly. Our solution instead places inexpensive hardware on top of bus stop signs and/or shelters. A sonar sensor detects an approaching bus, which triggers an Android phone to take a set of pictures that are relayed to the cloud. Shortly afterward, the pictures are sent to crowd workers, who are paid a few cents to label key information: route number, vehicle number, and how many bikes are on the bus. This schematized data is then relayed back to the cloud, where it can be combined with real-time arrival apps like OneBusAway or analyzed offline for agency reporting.
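The detection step can be sketched roughly as follows. This is an illustrative snippet, not the project's actual firmware: the function name, threshold, and debounce count are all assumptions. The idea is that a bus pulling in produces a run of short sonar ranges, so the camera fires only after several consecutive readings below a threshold, which keeps a passing pedestrian from triggering the shutter.

```python
# Hypothetical trigger logic for the stop-side sonar (illustrative only).
TRIGGER_CM = 300   # assumed: a stopped bus sits within ~3 m of the sensor
CONSECUTIVE = 3    # readings required before we trust the detection

def should_trigger(readings_cm, threshold_cm=TRIGGER_CM, needed=CONSECUTIVE):
    """Return True once `needed` consecutive readings fall below the threshold."""
    run = 0
    for r in readings_cm:
        run = run + 1 if r < threshold_cm else 0
        if run >= needed:
            return True
    return False
```

In a real deployment this would be fed by the rangefinder in a polling loop, with a cooldown after each trigger so one bus does not produce dozens of photo bursts.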

The crowd-work component of the system is also intended to generate labeled training data for machine learning (AI), so that the system can eventually process the images and extract the same data without human intervention.
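Microtasking platforms typically collect several judgments per image, so one plausible way to turn the crowd answers into clean training labels is a per-field majority vote. This is a sketch under that assumption; the function and field names are illustrative and not taken from the project's code.

```python
# Illustrative aggregation of multiple crowd judgments into one label per image.
from collections import Counter

def majority_label(judgments):
    """judgments: list of dicts like {"route": "4", "bikes": 2}, one per worker.
    Returns a single dict with the most common value for each field."""
    fields = {}
    for judgment in judgments:
        for key, value in judgment.items():
            fields.setdefault(key, []).append(value)
    return {key: Counter(values).most_common(1)[0][0]
            for key, values in fields.items()}
```

Images where the workers disagree badly (no clear majority) could be routed to extra workers or dropped, since noisy labels hurt the downstream model more than fewer labels do.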

Diagram showing parts of the project

You can see a video of us recording data on 3rd Avenue. The device worked, producing photos like this that were presented to crowd workers with an interface like the following:

Screenshot of crowd worker view

This application was developed during the Hack the Commute hackathon, March 20-22.

Team Members

Our team consists of:

Standing on the shoulders of giants

This hack rests on several "moving parts", the hardware and platforms that enable our solution:

  • Maxbotix MB7060 outdoor high precision ultrasonic rangefinder
  • Intel Edison integrated system used for prototyping electronics hardware
  • Microsoft Azure cloud platform, used to store images and host our Web site
  • Crowdflower microtasking platform, where crowd workers label the images taken by the system
  • OpenCV computer vision libraries used for feature recognition
  • AlchemyAPI AI resource used for image recognition