
kirby

Kirby slurps up the firehose of logs from Fastly and calculates daily counts for various Ruby ecosystem statistics, pretty quickly.

How fast is pretty quickly?

For an 80MB gzipped log file containing 915,427 JSON event objects (1.02GB uncompressed):

  • 2.7 seconds total to read the entire file line by line
  • 5.0 seconds total to also parse every JSON object into a Rust struct
  • 7.8 seconds total to further parse every User Agent field for Bundler, RubyGems, and Ruby versions, among other metrics (see the sketch after this list)
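In code, that three-stage pipeline looks roughly like the sketch below. The struct fields, file name, and regex pattern are illustrative assumptions rather than kirby's actual definitions; it leans on the flate2, serde, serde_json, and regex crates.

```rust
use std::fs::File;
use std::io::{BufRead, BufReader};

use flate2::read::GzDecoder;
use regex::Regex;
use serde::Deserialize;

// Assumed shape of one Fastly log event; field names are illustrative.
#[derive(Deserialize)]
struct Event {
    timestamp: String,
    user_agent: String,
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Stage 1: stream the gzipped file line by line, decompressing on the fly.
    let reader = BufReader::new(GzDecoder::new(File::open("fastly.log.gz")?));

    // Stage 3's regex: pull a Bundler version out of a User Agent string.
    // (Illustrative pattern; the real one also captures RubyGems, Ruby, etc.)
    let bundler_re = Regex::new(r"bundler/(\d+\.\d+\.\d+)")?;

    for line in reader.lines() {
        let line = line?;
        // Stage 2: parse each JSON object into a Rust struct.
        let event: Event = serde_json::from_str(&line)?;
        // Stage 3: parse the User Agent field with the regex.
        if let Some(caps) = bundler_re.captures(&event.user_agent) {
            println!("{} bundler {}", event.timestamp, &caps[1]);
        }
    }
    Ok(())
}
```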

This is... very good. For comparison, a Python script that used AWS Glue to do something similar took about 30 minutes. My first approach, a nom parser-combinator for the User Agent field instead of a regex, took 18.7 seconds. Processing a gigabyte of logs, almost a million JSON objects, into useful histograms in less than 8 seconds just blows my mind. But then I figured out how to use Rayon, and now if you give it 8 gzipped log files on an 8-core MacBook Pro, it can parse 399,300 JSON objects per second.
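The Rayon step is essentially a parallel map over the input files, one task per core, followed by a merge of the per-file counts. A minimal sketch, where process_file is a hypothetical stand-in for the per-file pipeline above:

```rust
use std::collections::HashMap;

use rayon::prelude::*;

// Stand-in for the read/parse/count pipeline run over one gzipped log file.
fn process_file(_path: &str) -> HashMap<String, u64> {
    HashMap::new()
}

fn main() {
    // Ideally one file per core; the names here are illustrative.
    let files = vec!["2018-10-15.log.gz", "2018-10-16.log.gz"];

    // `par_iter` schedules `process_file` on Rayon's thread pool.
    let merged: HashMap<String, u64> = files
        .par_iter()
        .map(|path| process_file(path))
        .reduce(HashMap::new, |mut acc, counts| {
            // Merge each file's histogram into a single result.
            for (key, count) in counts {
                *acc.entry(key).or_insert(0) += count;
            }
            acc
        });

    println!("{} distinct keys", merged.len());
}
```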

Wait, how fast?

   ~525 records/second/cpu in Python on AWS Glue
 50,534 records/second/cpu in Rust with nom
121,153 records/second/cpu in Rust with regex

Are you kidding me?

No. It gets even better if you have multiple cores.

 ~4,200 records/second in Python with 8 worker instances on AWS Glue
399,300 records/second in Rust with 8 cores and rayon on a MacBook Pro

What does it calculate?

It counts Bundler, RubyGems, and Ruby versions, in hourly buckets, and prints those out as nested JSON to stdout.
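Concretely, the output is a nested map along the lines of hour → metric → version → count. Here is a small sketch of the bucketing and serialization, with placeholder key names since the real format is defined in kirby's source:

```rust
use std::collections::HashMap;

// hour -> metric (e.g. "bundler") -> version -> count; names are illustrative.
type Counts = HashMap<String, HashMap<String, HashMap<String, u64>>>;

fn main() {
    let mut counts: Counts = HashMap::new();

    // Record one observed event: a Bundler 1.16.2 request in the 14:00 UTC bucket.
    *counts
        .entry("2018-10-15T14".to_string())
        .or_default()
        .entry("bundler".to_string())
        .or_default()
        .entry("1.16.2".to_string())
        .or_insert(0) += 1;

    // Print the nested counts as JSON on stdout.
    println!("{}", serde_json::to_string_pretty(&counts).unwrap());
}
```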
