Making the name of the S3 backup directory configurable #17

Merged
merged 6 commits into alexkravets:master

2 participants

@strayduy

When the Mongo backups are uploaded to S3, they're stored under the "backups" directory in the configured S3 bucket (i.e. the path to each backup is BUCKET/backups/YYYY-MM-DD_hh-mm-ss.gz).

I updated the module to allow you to configure the name of the directory in S3, so you can set it to a value other than "backups."

The directory name is configurable through any of the following environment variables:

  • S3_BACKUP_DIR
  • S3_BACKUP_DIRNAME
  • S3_BACKUP_DIR_NAME

The MAX_BACKUPS setting will respect the configured directory name, so it will only remove backups that match the given directory prefix.

My use case is that I have a single S3 bucket with hourly, daily, and monthly backups, and I wanted the directory names to reflect which type of backup was in each directory. I also have a different MAX_BACKUPS value for each type of backup.
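The fallback chain described above can be sketched in plain Ruby (a minimal sketch; `resolve_backup_dir` is a hypothetical helper name, not the gem's actual method):

```ruby
# Resolve the S3 backup directory name from any of the supported
# environment variables, falling back to the historical default "backups".
def resolve_backup_dir(env = ENV)
  env['S3_BACKUP_DIR'] ||
    env['S3_BACKUP_DIRNAME'] ||
    env['S3_BACKUP_DIR_NAME'] ||
    'backups'
end

resolve_backup_dir({})                          # => "backups"
resolve_backup_dir('S3_BACKUP_DIR' => 'daily')  # => "daily"
```

Passing the environment in as an argument keeps the helper testable; in the gem itself the lookup reads `ENV` directly.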

strayduy added some commits
@strayduy strayduy Allowing you to configure the name of the S3 backup directory through an environment variable
9107a4b
@strayduy strayduy Fixing a bug with the Fog interface
The script was looking at all of the files in the specified bucket when determining which ones to delete. I only want it to look under the designated backup directory, so I added the directory prefix when it's considering which files to delete.
8d3c15e
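The pruning logic that commit fixes can be sketched independently of the storage gem (a minimal sketch with a hypothetical `keys_to_delete` helper; the real code goes through the aws/s3, s3, or fog interfaces):

```ruby
# Given all object keys in a bucket, return the keys to delete: only keys
# under the backup directory prefix, oldest first, keeping the newest
# `max_backups` (the timestamped file names sort chronologically).
def keys_to_delete(all_keys, dirname, max_backups)
  backups = all_keys.select { |key| key.start_with?("#{dirname}/") }.sort
  excess = backups.size - max_backups
  excess > 0 ? backups.first(excess) : []
end

keys = ['daily/2013-07-01.gz', 'daily/2013-07-02.gz', 'hourly/2013-07-02.gz']
keys_to_delete(keys, 'daily', 1)  # => ["daily/2013-07-01.gz"]
```

Note that the `hourly/` key is never considered, which is exactly the behavior the bug fix restores: without the prefix filter, backups from other directories would count against (and be deleted by) MAX_BACKUPS.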
@strayduy strayduy Correcting a comment 0a084a2
@alexkravets
Owner

Thanks a lot for this! Could you please add a passage to the README.md as well to keep the documentation consistent, and add yourself to the end of the contributors section? Also, please bump the gem version.

Thank you!

@strayduy

Updated the README and incremented the gem version.

@alexkravets alexkravets merged commit ffc4d37 into alexkravets:master
@alexkravets
Owner

Thank you!

Commits on Jul 17, 2013
  1. @strayduy: Allowing you to configure the name of the S3 backup directory through an environment variable
Commits on Jul 18, 2013
  1. @strayduy: Fixing a bug with the Fog interface
  2. @strayduy: Correcting a comment
Commits on Aug 11, 2013
  1. @strayduy
  2. @strayduy
  3. @strayduy: Bumping the version number
README.md (7 lines changed)
@@ -45,6 +45,11 @@ If you want to automatically remove old backup files pass ```MAX_BACKUPS``` para
* ```heroku run rake mongo:backup MAX_BACKUPS=7```
+If you're uploading to S3, backup files will be stored as ```backups/YYYY-MM-DD_hh-mm-ss.gz``` by default. To change the directory name, pass in the ```S3_BACKUP_DIR``` parameter:
+
+* ```heroku run rake mongo:backup S3_BACKUP_DIR=daily```
+* Backup files would then be stored as ```daily/backup-file-name.gz``` instead of ```backups/backup-file-name.gz```.
+
Restore from backup:
* ```heroku run rake mongo:restore FILE=backup-file-name.gz```
@@ -66,7 +71,7 @@ For Rails 2 add this to your Rakefile to import rake tasks:
5. [wolfpakz](https://github.com/wolfpakz "Dan Porter") - Rails2 support
6. [solacreative](http://sola-la.com/creative "Taro Murao") - Max backups feature for aws/s3 and s3 gems
7. [aarti](https://github.com/aarti "aarti") - minor fixes
-
+8. [strayduy](https://github.com/strayduy "strayduy") - [Configurable S3 directory name](https://github.com/alexkravets/heroku-mongo-backup/pull/17)
heroku-mongo-backup.gemspec (4 lines changed)
@@ -1,10 +1,10 @@
Gem::Specification.new do |s|
s.name = 'heroku-mongo-backup'
- s.version = '0.4.31'
+ s.version = '0.4.32'
s.summary = 'Rake task backups mongo database on Heroku and push gzipped file to Amazon S3 or FTP.'
s.description = 'Rake task for backing up mongo database on heroku and push it to S3 or FTP. Library can be used as rake task or be easily integrated into daily cron job.'
- s.authors = ['Alex Kravets', 'matyi', 'Stef Lewandowski', 'David Hall', 'Dan Porter', 'aarti']
+ s.authors = ['Alex Kravets', 'matyi', 'Stef Lewandowski', 'David Hall', 'Dan Porter', 'aarti', 'strayduy']
s.email = 'santyor@gmail.com'
s.homepage = 'https://github.com/alexkravets/heroku-mongo-backup'
lib/heroku-mongo-backup.rb (18 lines changed)
@@ -118,6 +118,18 @@ def s3_connect
bucket = ENV['S3_BUCKET']
end
+ dir_name = ENV['S3_BACKUP_DIR']
+ if dir_name.nil?
+ dir_name = ENV['S3_BACKUP_DIRNAME']
+ end
+ if dir_name.nil?
+ dir_name = ENV['S3_BACKUP_DIR_NAME']
+ end
+ if dir_name.nil?
+ dir_name = 'backups'
+ end
+ @dir_name = dir_name
+
access_key_id = ENV['S3_KEY_ID']
if access_key_id.nil?
access_key_id = ENV['S3_KEY']
@@ -138,12 +150,12 @@ def s3_connect
end
def s3_upload
- HerokuMongoBackup::s3_upload(@bucket, @file_name)
+ HerokuMongoBackup::s3_upload(@bucket, @dir_name, @file_name)
end
def s3_download
open(@file_name, 'w') do |file|
- file_content = HerokuMongoBackup::s3_download(@bucket, @file_name)
+ file_content = HerokuMongoBackup::s3_download(@bucket, @dir_name, @file_name)
file.binmode
file.write file_content
end
@@ -215,7 +227,7 @@ def backup files_number_to_leave=0
end
if files_number_to_leave > 0
- HerokuMongoBackup::remove_old_backup_files(@bucket, files_number_to_leave)
+ HerokuMongoBackup::remove_old_backup_files(@bucket, @dir_name, files_number_to_leave)
end
end
lib/s3_helpers.rb (42 lines changed)
@@ -20,14 +20,14 @@ def HerokuMongoBackup::s3_connect(bucket, key, secret)
return bucket
end
- def HerokuMongoBackup::s3_upload(bucket, filename)
- object = bucket.objects.build("backups/#{filename}")
+ def HerokuMongoBackup::s3_upload(bucket, dirname, filename)
+ object = bucket.objects.build("#{dirname}/#{filename}")
object.content = open(filename)
object.save
end
- def HerokuMongoBackup::s3_download(bucket, filename)
- object = bucket.objects.find("backups/#{filename}")
+ def HerokuMongoBackup::s3_download(bucket, dirname, filename)
+ object = bucket.objects.find("#{dirname}/#{filename}")
content = object.content(reload=true)
puts "Backup file:"
@@ -39,8 +39,8 @@ def HerokuMongoBackup::s3_download(bucket, filename)
return content
end
- def HerokuMongoBackup::remove_old_backup_files(bucket, files_number_to_leave)
- excess = ( object_keys = bucket.objects.find_all(:prefix => "backups/").map { |o| o.key }.sort ).count - files_number_to_leave
+ def HerokuMongoBackup::remove_old_backup_files(bucket, dirname, files_number_to_leave)
+ excess = ( object_keys = bucket.objects.find_all(:prefix => "#{dirname}/").map { |o| o.key }.sort ).count - files_number_to_leave
(0..excess-1).each { |i| bucket.objects.find(object_keys[i]).destroy } if excess > 0
end
@@ -68,17 +68,17 @@ def HerokuMongoBackup::s3_connect(bucket, key, secret)
return bucket
end
- def HerokuMongoBackup::s3_upload(bucket, filename)
- AWS::S3::S3Object.store("backups/#{filename}", open(filename), bucket)
+ def HerokuMongoBackup::s3_upload(bucket, dirname, filename)
+ AWS::S3::S3Object.store("#{dirname}/#{filename}", open(filename), bucket)
end
- def HerokuMongoBackup::s3_download(bucket, filename)
- content = AWS::S3::S3Object.value("backups/#{filename}", bucket)
+ def HerokuMongoBackup::s3_download(bucket, dirname, filename)
+ content = AWS::S3::S3Object.value("#{dirname}/#{filename}", bucket)
return content
end
- def HerokuMongoBackup::remove_old_backup_files(bucket, files_number_to_leave)
- excess = ( object_keys = AWS::S3::Bucket.find(bucket).objects(:prefix => 'backups/').map { |o| o.key }.sort ).count - files_number_to_leave
+ def HerokuMongoBackup::remove_old_backup_files(bucket, dirname, files_number_to_leave)
+ excess = ( object_keys = AWS::S3::Bucket.find(bucket).objects(:prefix => "#{dirname}/").map { |o| o.key }.sort ).count - files_number_to_leave
(0..excess-1).each { |i| AWS::S3::S3Object.find(object_keys[i], bucket).delete } if excess > 0
end
@@ -98,9 +98,9 @@ def HerokuMongoBackup::remove_old_backup_files(bucket, files_number_to_leave)
if defined?(Fog)
#
- # Using 'aws/s3' gem as Amazon S3 interface
+ # Using 'fog' gem as Amazon S3 interface
#
- #puts "Using \'aws/s3\' gem as Amazon S3 interface."
+ #puts "Using \'fog\' gem as Amazon S3 interface."
def HerokuMongoBackup::s3_connect(bucket, key, secret)
connection = Fog::Storage.new({
:provider => 'AWS',
@@ -111,24 +111,24 @@ def HerokuMongoBackup::s3_connect(bucket, key, secret)
return directory
end
- def HerokuMongoBackup::s3_upload(directory, filename)
+ def HerokuMongoBackup::s3_upload(directory, dirname, filename)
file = directory.files.create(
- :key => "backups/#{filename}",
+ :key => "#{dirname}/#{filename}",
:body => open(filename)
)
end
- def HerokuMongoBackup::s3_download(directory, filename)
- file = directory.files.get("backups/#{filename}")
+ def HerokuMongoBackup::s3_download(directory, dirname, filename)
+ file = directory.files.get("#{dirname}/#{filename}")
return file.body
end
- def HerokuMongoBackup::remove_old_backup_files(directory, files_number_to_leave)
- total_backups = directory.files.all.size
+ def HerokuMongoBackup::remove_old_backup_files(directory, dirname, files_number_to_leave)
+ total_backups = directory.files.all({:prefix => "#{dirname}/"}).size
if total_backups > files_number_to_leave
- files_to_destroy = (0..total_backups-files_number_to_leave-1).collect{|i| directory.files.all[i] }
+ files_to_destroy = (0..total_backups-files_number_to_leave-1).collect{|i| directory.files.all({:prefix => "#{dirname}/"})[i] }
files_to_destroy.each do |f|
f.destroy