
Added initial version of the script, README file

It works. Basic functionality only, but it does what I need. The README gives a quick intro to the tool.
1 parent 8749814 commit f147290528e8ce9ec1f8536b5e2d2952b1ad458d @schallert committed Apr 16, 2012
Showing with 92 additions and 0 deletions.
  1. +18 −0 README.md
  2. +74 −0 quickS3Backup.py
18 README.md
@@ -0,0 +1,18 @@
+# PyQuickS3 #
+### A small helper utility for quickly sending a file to Amazon S3 ###
+
+Often I'll be working with some files and I'm about to make a change to one of them. I may be about to screw something up, so it's nice to have an old copy lying around somewhere. I don't want to just `cp old_file old_file.old`, since that clutters up whatever folder I'm working in. And if I don't end up screwing up the original, the extra copy just gets in my way: a folder full of unnecessary backups.
+
+Enter **PyQuickS3**. PyQuickS3 lets you fire off a file to S3 without worrying about it. Say you're about to change `important.config`: a quick command-line call to `pyqs3 important.config` will send the file to an S3 bucket that you predefine, so you know it's there if you need it. If you want to store your files under a certain folder in S3, say `configs`, that's easy too. Use the `-p` (prefix) option to name a folder to store your file under, so `pyqs3 -p configs important.config` will put `important.config` in the `configs` folder in your bucket.
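+
+For instance, assuming your configured bucket is named `my-backups` (the actual name comes from your config, see below), the two calls above would land the file at:
+
+```
+pyqs3 important.config             ->  s3://my-backups/important.config
+pyqs3 -p configs important.config  ->  s3://my-backups/configs/important.config
+```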
+
+It's a simple, handy utility for now, with more features to come (see the [wiki](wiki) for the official TODO).
+
+Clone it, try it, fork it, push it. You know the drill. Enjoy.
+
+### Usage ###
+1. Copy `config.sample.json` to `config.json` and fill in the needed values (see the sketch below).
+2. Make sure `boto` is installed.
+3. Profit?
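+
+For reference, here's a rough sketch of what `config.json` could look like, based on the keys the script reads (the values shown are placeholders):
+
+```json
+{
+    "aws": {
+        "AWS_ACCESS_KEY_ID": "YOUR_ACCESS_KEY_ID",
+        "AWS_SECRET_ACCESS_KEY": "YOUR_SECRET_ACCESS_KEY",
+        "BACKUP_BUCKET": "your-backup-bucket-name"
+    }
+}
+```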
+
+### Note ###
+In the intro above, a call was made to `pyqs3`. Right now there is no prepackaged executable, since many people will configure this package differently (mainly because it requires an external library, `boto`). For now you have to call `python quickS3Backup.py ...`. Although this isn't as *quick* as the name suggests, it's not too hard to root-install `boto` and turn the `.py` file into an executable. Do as you please.
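+
+For example, the prefixed upload from the intro would currently be invoked as:
+
+```
+python quickS3Backup.py -p configs important.config
+```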
74 quickS3Backup.py
@@ -0,0 +1,74 @@
+import sys, os
+try:
+    import simplejson as json
+except ImportError:
+    import json
+from boto.s3.connection import S3Connection
+from boto.s3.key import Key
+from boto.s3.bucket import Bucket
+
+# Configure S3 authentication
+config = json.load(open('config.json'))
+key_id = config['aws']['AWS_ACCESS_KEY_ID']
+secret_key = config['aws']['AWS_SECRET_ACCESS_KEY']
+db_bucket_name = config['aws']['BACKUP_BUCKET']
+
+# Function to check that we are given input files
+def checkFilesExist(sysarg_array):
+    if len(sysarg_array) > 0:
+        return sysarg_array
+    else:
+        sys.exit('File names not provided')
+
+# Construct the prefix (i.e. S3 folder) for the key, and take the input files
+# from the command line accordingly
+if len(sys.argv) > 1 and sys.argv[1] == '-p':
+    folder_prefix = sys.argv[2]
+    input_files = checkFilesExist(sys.argv[3:])
+else:
+    folder_prefix = None
+    input_files = checkFilesExist(sys.argv[1:])
+
+# Make sure we are given files that actually exist
+for f in input_files:
+    try:
+        open(f).close()
+    except IOError:
+        sys.exit('Could not access file: ' + f)
+
+# Create the S3 connection
+s3conn = S3Connection(key_id, secret_key)
+
+# Check if there's already a bucket for the desired bucket name
+if db_bucket_name not in [b.name for b in s3conn.get_all_buckets()]:
+    db_bucket = s3conn.create_bucket(db_bucket_name)
+else:
+    db_bucket = s3conn.get_bucket(db_bucket_name)
+
+# Callback to print upload progress (boto calls it with bytes sent and total bytes)
+# It's super janky, I know. I'll work on it
+def cbf(trans, total):
+    percent = int((float(trans) / total) * 100)
+    if percent == 0:
+        # First call: draw the empty 50-character bar
+        sys.stdout.write('[')
+        for i in xrange(50):
+            sys.stdout.write(' ')
+        sys.stdout.write(']\n ')
+        sys.stdout.flush()
+    elif trans == total:
+        # Done: print the final tick and a newline
+        print '='
+    else:
+        sys.stdout.write('=')
+        sys.stdout.flush()
+
+# Send the files to the server
+for file_to_send in input_files:
+    file_path = os.path.join(os.getcwd(), file_to_send)
+    key = Key(db_bucket)
+    # Only use the actual filename; we don't want to send the whole path
+    file_name = os.path.basename(file_to_send)
+    # Prepend the folder prefix we defined, if necessary
+    key.key = folder_prefix + '/' + file_name if folder_prefix else file_name
+    # Actually send the file; set our callback and the number of times to call it
+    key.set_contents_from_filename(file_path, cb=cbf, num_cb=50)
+    key.close()
