
ec2-backup-to-ami (Python dependencies sample) #78

Merged
merged 2 commits

2 participants

@mrdavidlaing

This sample illustrates how to use Python dependencies via pip, as described by Paddy Foran at

http://stackoverflow.com/questions/13285901/how-to-bundle-python-dependancies-in-ironworker/13296163#13296163

...n/ec2-backup-to-ami_python-dependancies/NOTES.private
@@ -0,0 +1,9 @@
+#sudoventures backup-agent
+iron_worker queue ec2-backup-to-ami --payload "export AWS_ACCESS_KEY_ID=AKIAIFYKD3RALZG7ASCA && export AWS_SECRET_ACCESS_KEY=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"

These AWS keys are no longer valid :)

@treeder
Owner

Thanks!

@treeder merged commit 872f0c1
Commits on Nov 9, 2012
  1. @mrdavidlaing

    ec2-backup-to-ami sample

    mrdavidlaing authored
  2. @mrdavidlaing

    Remove .private files!

    mrdavidlaing authored
1  python/ec2-backup-to-ami_python-dependancies/.gitignore
@@ -0,0 +1 @@
+.private
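
(The lone `.private` entry keeps local note files such as the NOTES.private shown above, which briefly contained real AWS keys, out of the repository; the second commit in this PR removes those files.)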
24 python/ec2-backup-to-ami_python-dependancies/README.md
@@ -0,0 +1,24 @@
+# EC2 backup to AMI - a sample showing how to use Python dependencies via pip
+
+This sample shows how to use a Python script that relies on 3rd-party libraries installed via pip.
+
+It's based on the advice given at: http://stackoverflow.com/questions/13285901/how-to-bundle-python-dependancies-in-ironworker
+
+## To run
+
+_NB! These scripts will create AMIs of all running instances in your EC2 account (across all regions). This will reboot all your instances!_
+
+1. Be sure you've set up your Iron.io credentials; see the main [README.md](https://github.com/iron-io/iron_worker_examples).
+1. Run ```iron_worker upload ec2-backup-to-ami``` to upload the worker code package to IronWorker.
+1. Queue up a task:
+ ```iron_worker queue ec2-backup-to-ami --payload "export AWS_ACCESS_KEY_ID=xxxxx && export AWS_SECRET_ACCESS_KEY=xxxx"```
+1. Look at [HUD](https://hud.iron.io) to view your tasks running, check logs, etc.
+
+## How it works
+
+1. Unlike with a Ruby worker, there is no built-in functionality to manage 3rd-party Python dependencies
+1. However, by giving your worker the "binary" runtime, you can run a build step that pip-installs into a local folder before your worker code runs
+1. Then, launch your Python executable via a shell script, which can set up the required paths and environment variables
+1. Note how the script uses the -payload param to "source" some additional environment variables into the Python script's run environment (see the sketch after this file)
+1. The Python scripts in question are actually installed as part of the [botocross](http://github.com/sopel/botocross) library
+1. The Python scripts in question are actually installed as part of the [botocross](http://github.com/sopel/botocross) library
+
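
To illustrate the last two points of "How it works" above, here is a minimal sketch (not part of this PR) of what a Python script launched this way sees. The script name and region are made up, and it assumes boto, which botocross builds on, ends up importable from the pip-installed prefix:

```python
# check-env.py -- hypothetical helper, not one of the botocross scripts
import os

import boto.ec2  # boto is assumed to be pulled in by the botocross install

# ec2-backup-to-ami.sh has already sourced the payload file, so the
# credentials are plain environment variables in this process.
for var in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY"):
    if var not in os.environ:
        raise SystemExit(var + " is not set; was the payload sourced?")

# boto reads the two variables from the environment automatically,
# so no credentials need to be passed explicitly.
conn = boto.ec2.connect_to_region("us-east-1")  # region is illustrative
reservations = conn.get_all_instances()
print("Credentials OK; %d reservations visible" % len(reservations))
```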
25 python/ec2-backup-to-ami_python-dependancies/ec2-backup-to-ami.sh
@@ -0,0 +1,25 @@
+#!/usr/bin/env bash
+export PYTHONPATH="$HOME/pips/lib/python2.7/site-packages:$PYTHONPATH"
+export PATH="$HOME/pips/bin:$PATH"
+
+#Extract the payload file location
+while [ $# -gt 1 ]; do
+ if [ "$1" = "-payload" ]; then
+ PAYLOAD_FILE="$2"
+ break
+ fi
+
+ shift
+done
+
+#Source the payload file to export the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables
+source "$PAYLOAD_FILE"
+
+echo "Validating credentials"
+validate-credentials.py
+echo "Describe instances"
+describe-instances.py -f instance-state-name=running -f tag-key=Name
+echo "Create images"
+create-images.py -f instance-state-name=running -f tag-key=Name
+echo "Delete old AMIs"
+delete-images.py --backup_retention 30 -f tag-key=Name
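
For context (not shown in this PR), IronWorker writes the queued payload to a file and passes its path after a `-payload` flag, which is what the `while`/`shift` loop above scans for. The effective command line is roughly the following; the exact file path, and any flags other than `-payload`, are assumptions:

```bash
./ec2-backup-to-ami.sh -payload /tmp/payload
```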
7 python/ec2-backup-to-ami_python-dependancies/ec2-backup-to-ami.worker
@@ -0,0 +1,7 @@
+runtime "binary"
+
+build 'pip install --install-option="--prefix=`pwd`/pips" git+https://github.com/sopel/botocross.git'
+
+file "ec2-backup-to-ami.sh"
+exec "ec2-backup-to-ami.sh"
+
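
A note on the layout: the build step installs into a `pips` folder under the build directory, while the shell script exports `$HOME/pips/...`; these line up presumably because the package directory used at build time becomes the task's home directory at run time. A rough way to reproduce the same layout locally (hypothetical; assumes Python 2.7 and an era-appropriate pip that still accepts `--install-option`):

```bash
# Recreate the worker's build step locally (illustrative, not part of the PR)
pip install --install-option="--prefix=$PWD/pips" git+https://github.com/sopel/botocross.git
export PYTHONPATH="$PWD/pips/lib/python2.7/site-packages:$PYTHONPATH"
export PATH="$PWD/pips/bin:$PATH"
validate-credentials.py   # the botocross console scripts are now on PATH
```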