Tweet Location Demo

Stores incoming tweets from the Twitter Streaming API into MongoDB, lets you search them, and displays them on a map like so:


Fast Start with Tests

The tests should run and hopefully report (12 examples, 0 failures):

git clone git:// && cd rails-demo/ && bundle install && rake spec && rails server

Detailed Setup

  1. git clone git://
  2. bundle install
  3. replace the credentials in config/settings.yml with your Twitter API account (your actual Twitter username and password will also work)
  4. rails server
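The exact keys in config/settings.yml depend on the repo, but a credentials file for step 3 typically looks something like the following. The key names here are illustrative, not copied from the project; match whatever the checked-in settings.yml actually uses.

```yaml
# config/settings.yml -- illustrative sketch; key names may differ in the repo
username: your_twitter_username
password: your_twitter_password
```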

The daemon is started from the UI, but if it doesn't start for some reason, here is how you can start or stop it manually:

./lib/daemons/tweet_zilla_ctl start

./lib/daemons/tweet_zilla_ctl stop

Or to debug:

cd lib/daemons && ruby tweet_zilla.rb


  • I friggin love that you guys asked for a code sample!
  • I used a Rails template from: as a base with Mongoid, RSpec, Twitter Bootstrap, etc.
  • Guard is so friggin cool. Seriously, I love that it runs my specs after every change! It's like CI for the programmer. The .NET community sure could use this…
  • I built this from the programmer's perspective; being able to see the number of tweets being fed into the DB in real time is super nice. Had I spent more time, I would have displayed/logged more detailed information from the daemon.
  • In googling to learn more about using Ruby fibers / EventMachine with Mongo, I ran into a fellow applicant's codebase: He used TweetStream like I did, which uses EventMachine to handle incoming tweets. Unlike him, I wrapped it all in EM.synchrony, which is supposed to handle Mongoid connections synchronously. That really confused me; I would think it kind of defeats the purpose of having async calls, and I would like to spend more time to grasp it better.
  • The bottleneck for the daemon appears to be the streaming API itself limiting tweets because of result filtering by location:
The track parameter (keywords) and the location parameter (geo) on the statuses/filter method are rate-limited predicates.

From my findings, it would take even longer to get tweets using the sample method; even though it's not as rate-limited, it doesn't return what we need.
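The client-side geo filtering benchmarked below boils down to a point-in-bounding-box check on each tweet's coordinates. A minimal sketch of that check; the helper name and the rough USA box coordinates are my own, not the project's:

```ruby
# Rough point-in-bounding-box check for client-side filtering of the
# sample stream. Boxes are [sw_lng, sw_lat, ne_lng, ne_lat], the order
# the streaming API's location parameter uses.
USA_BBOX = [-124.85, 24.40, -66.89, 49.38] # rough continental USA

def in_bbox?(lng, lat, bbox)
  sw_lng, sw_lat, ne_lng, ne_lat = bbox
  lng.between?(sw_lng, ne_lng) && lat.between?(sw_lat, ne_lat)
end

# A tweet with coordinates is kept only if it falls inside the box:
in_bbox?(-87.63, 41.88, USA_BBOX)  # Chicago => true
in_bbox?(2.35, 48.86, USA_BBOX)    # Paris   => false
```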

Benchmarks for gathering 500 Tweets

Client side filtering for geo coordinates using the sample search method – writing to database:

real  17m21.290s
user  5m47.766s
sys 0m10.857s

(I ran this benchmark only once because it took so long to complete.)

Location filter of World Bounding Box – writing to database:

rate limit met
rate limit met
rate limit met
rate limit met
rate limit met

Location filter of USA Bounding Box – not writing to database:

real 1m7.954s
user 0m23.969s
sys 0m1.160s

real 0m58.481s
user 0m23.285s
sys 0m1.492s

Location filter of USA Bounding Box – writing to database:

real 1m0.732s
user 0m25.370s
sys 0m1.132s

real 0m58.039s
user 0m25.042s
sys 0m1.320s

Location filter of USA Bounding Box – writing to database – with EM.synchrony:

real 0m51.437s
user 0m25.074s
sys 0m1.344s

real 0m57.831s
user 0m25.578s
sys 0m1.312s


Obviously these benchmarks are just for fun; there are too many factors that skew them, the main ones being Ruby startup time, the internet, and the Twitter Streaming API's response time. Just like the page render time and tweets-per-second figures displayed on the site, they are not super accurate, yet.
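The timings above came from the shell's `time`, which includes Ruby startup. The same numbers could be captured in-process with Ruby's stdlib Benchmark module, which times only the block itself; a generic sketch, not the project's code:

```ruby
require 'benchmark'

# Benchmark.measure returns user/system/real times like `time` does,
# but only for the block itself -- Ruby startup is excluded.
result = Benchmark.measure do
  100_000.times { Math.sqrt(rand) } # stand-in for "gather 500 tweets"
end

puts result.real # wall-clock seconds for the block alone
```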


  • Finish test coverage.
  • Clean up/refactor the codebase to the team's coding standards. For example: we only use the new Ruby 1.9 hash syntax!!! okay: "okay", i_get_it: ";-)"
  • Tighten up site security
  • Move daemon control to, and create, an Admin Area
  • JavaScript testing via a framework like QUnit
  • Add support for "Place" on the Twitter status streaming API by geocoding the address or by grabbing a coordinate pair from the list.
  • Move capped collection to config
  • Better daemon monitoring, error handling, and logging
  • Real time tweet updates on map page
  • Performance enhancements – a blanket term, I know, but I am sure there are things to speed up.
  • AJAXIFY THE DANG MAP & SEARCH! I did ghetto inline JavaScript rendering… sad…

I can now recreate this project twice as fast, and know that:

  • Draper decorators don’t work well with mongoid documents, for some reason…
  • mongoid_spacial (which is the new mongoid-geo, and yes, the gem name is misspelled) returns distance!!! Model.geo_near().geo[:distance]!!!
  • daemon_generator for rails 3 works
  • The Twitter Streaming API is pretty sweet.
  • TweetStream gem is a wrapper over twitter-stream … and you can stop the machine by calling the TweetStream::Client#stop method.
  • Daemons are fun
  • Rails is sweet
  • Guard freaking rocks the house. I would love to speed up the test and Rails startup time though, maybe with Spork?
  • Now that I know how to write tests for Rails, I would start out that way on the next project.
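On the geo_near point above: whatever units mongoid_spacial reports distance in, the underlying computation is great-circle distance between coordinate pairs. A plain-Ruby haversine sketch of that calculation (my own helper, not the gem's code):

```ruby
# Haversine great-circle distance in kilometers between two lat/lng points.
EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lng1, lat2, lng2)
  to_rad = ->(deg) { deg * Math::PI / 180 }
  dlat = to_rad.(lat2 - lat1)
  dlng = to_rad.(lng2 - lng1)
  a = Math.sin(dlat / 2)**2 +
      Math.cos(to_rad.(lat1)) * Math.cos(to_rad.(lat2)) * Math.sin(dlng / 2)**2
  EARTH_RADIUS_KM * 2 * Math.asin(Math.sqrt(a))
end

haversine_km(40.71, -74.01, 34.05, -118.24) # NYC to LA, roughly 3,900 km
```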