AWS CloudWatch integration with Loggly


This package provides a way to take facet searches run against your Loggly account and push the results to Amazon's CloudWatch service. The results can be used for alerting on certain types of events occurring in the logs on your servers.


Get started by installing the Python package 'hoover':

  sudo apt-get install python-setuptools
  sudo easy_install hoover

Download the package from GitHub by cloning the repository locally:

  cd ~/
  git clone


Next, you'll need to set your AWS access key and secret key. One way to do this is by editing your .profile and exporting them to the environment:

  vim ~/.profile
  export AWS_ACCESS_KEY_ID=your_aws_key_goes_here
  export AWS_SECRET_ACCESS_KEY=your_aws_secret_key_goes_here

Alternatively, you can hard-code the keys on the connection line in the script:

  cw = cloudwatch.connection('AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY_ID') 
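If you go the environment-variable route, it helps to fail fast when the keys are missing. Here's a minimal sketch, assuming the standard variable names AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY that boto also reads (this helper is not part of hoover):

```python
import os

def aws_credentials():
    # Fetch the AWS keys exported in ~/.profile; raise a clear error if
    # either variable is unset rather than failing later inside boto.
    try:
        return (os.environ['AWS_ACCESS_KEY_ID'],
                os.environ['AWS_SECRET_ACCESS_KEY'])
    except KeyError as missing:
        raise RuntimeError('missing environment variable: %s' % missing)
```

You can then pass the returned pair to the CloudWatch connection instead of hard-coding the keys in the script.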

You'll also need to set your Loggly credentials:

  hoover.authorize('ACCOUNT_NAME', 'USERNAME', 'PASSWORD')

and set the CloudWatch namespace for your site:

  namespace = 'Loggly'

Defining Searches

You'll need to configure the set of searches you want to run against Loggly. Change 'default' to the Loggly input you want to search.

The example below gets counts of events matching the wildcard search for a five-minute window starting six minutes ago, using the default input. This input needs to be receiving data from your web server for the search to return results.

  default_input = hoover.utils.get_input_by_name('default')
  num_results = default_input.facets(q='*', starttime='NOW-6MINUTES', endtime='NOW-1MINUTE', buckets=1)['data'].items()[0][1]
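Note that `.items()[0][1]` only works on Python 2, where dict.items() returns a list. A small helper makes the bucket extraction version-independent; this is a sketch assuming the response shape shown above, i.e. a 'data' dict mapping one timestamp to a count:

```python
def extract_count(facet_response):
    # Pull the event count out of a single-bucket Loggly facet response.
    # Assumed shape: {'data': {'<timestamp>': <count>}}.
    data = facet_response['data']
    if len(data) != 1:
        raise ValueError('expected exactly one bucket, got %d' % len(data))
    return next(iter(data.values()))
```

With this helper, the second line above becomes `num_results = extract_count(default_input.facets(...))`.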


Run the code to test it.


Now set it up as a cron job:

  crontab -e
  */5 * * * * python ~/loggly-watch/

You should get custom data flowing into your CloudWatch account after about 5-10 minutes. Adjust your timeframes and cronjob as necessary, and add more searches for other use cases.
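When you add more searches, it's cleaner to keep them in one table and loop over it. Here's a sketch; the metric names and queries are examples rather than part of the package, and the two callables stand in for the hoover facet call and the CloudWatch put call shown earlier:

```python
# Hypothetical searches: (CloudWatch metric name, Loggly query).
SEARCHES = [
    ('TotalRequests', '*'),
    ('ServerErrors', '500'),
]

def run_searches(get_count, push_metric, searches=SEARCHES):
    # get_count(query) -> int   e.g. wraps the hoover facets call above
    # push_metric(name, value)  e.g. wraps the CloudWatch put call
    results = {}
    for name, query in searches:
        count = get_count(query)
        push_metric(name, count)
        results[name] = count
    return results
```

Cron then only needs to invoke one script, and each new use case is a one-line addition to the table.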


Here's a sample output from a run over 12 hours or so, tracking total web log entries on the website.

To graph the data on your own dashboard, use Michael Babineau's CloudViz package.