Merge branch 'develop'

commit 269b8b453bcca1a4e4645552168b170a18d2e994 2 parents a86eb33 + 5730a53
meskyanichi authored
2  Gemfile.lock
@@ -1,7 +1,7 @@
PATH
remote: .
specs:
- backup (3.0.6)
+ backup (3.0.7)
dropbox (~> 1.2.3)
fog (~> 0.5.3)
mail (~> 2.2.15)
4 README.md
@@ -1,7 +1,7 @@
Backup 3
========
-Backup is a RubyGem (for UNIX-like operating systems: Linux, Mac OSX) that allows you to configure and perform backups in a simple manner using an elegant Ruby DSL. It supports various databases (MySQL, PostgreSQL, MongoDB and Redis), it supports various storage locations (Amazon S3, Rackspace Cloud Files, Dropbox, any remote server through FTP, SFTP, SCP and RSync), it can archive files and directories, it can cycle backups, it can do incremental backups, it can compress backups, it can encrypt backups (OpenSSL or GPG), it can notify you about successful and/or failed backups (Email or Twitter). It is very extensible and easy to add new functionality to. It's easy to use.
+Backup is a RubyGem (for UNIX-like operating systems: Linux, Mac OSX) that allows you to configure and perform backups in a simple manner using an elegant Ruby DSL. It supports various databases (MySQL, PostgreSQL, MongoDB and Redis), it supports various storage locations (Amazon S3, Rackspace Cloud Files, Dropbox, any remote server through FTP, SFTP, SCP and RSync), it provides Syncers (RSync, S3) for efficient backups, it can archive files and directories, it can cycle backups, it can do incremental backups, it can compress backups, it can encrypt backups (OpenSSL or GPG), it can notify you about successful and/or failed backups (Email or Twitter). It is very extensible and easy to add new functionality to. It's easy to use.
Author
------
@@ -248,7 +248,7 @@ Contributors
</tr>
<tr>
<td><a href="https://github.com/asanghi" target="_blank">Aditya Sanghi ( asanghi )</a></td>
- <td>Twitter Notifier</td>
+ <td>Twitter Notifier, Dropbox Timeout Configuration</td>
</tr>
<tr>
<td><a href="https://github.com/phlipper" target="_blank">Phil Cohen ( phlipper )</a></td>
15 backup.gemspec
@@ -12,12 +12,15 @@ Gem::Specification.new do |gem|
gem.authors = 'Michael van Rooijen'
gem.email = 'meskyanichi@gmail.com'
gem.homepage = 'http://rubygems.org/gems/backup'
- gem.summary = 'Backup is a RubyGem (for UNIX-like operating systems: Linux, Mac OSX) that allows you to configure and perform backups in a simple manner using an elegant Ruby DSL.'
- gem.description = 'Backup is a RubyGem (for UNIX-like operating systems: Linux, Mac OSX) that allows you to configure and perform backups in a simple manner using an elegant Ruby DSL.
- It supports various databases (MySQL, PostgreSQL, MongoDB and Redis), it supports various storage locations
- (Amazon S3, Rackspace Cloud Files, Dropbox, any remote server through FTP, SFTP, SCP and RSync), it can archive files and directories,
- it can cycle backups, it can do incremental backups, it can compress backups, it can encrypt backups (OpenSSL or GPG),
- it can notify you about successful and/or failed backups (Email, Twitter). It is very extensible and easy to add new functionality to. It\'s easy to use.'
+ gem.summary = 'Backup is a RubyGem (for UNIX-like operating systems: Linux, Mac OSX)
+ that allows you to configure and perform backups in a simple manner using
+ an elegant Ruby DSL. It supports various databases (MySQL, PostgreSQL, MongoDB and Redis),
+ it supports various storage locations (Amazon S3, Rackspace Cloud Files, Dropbox, any remote
+ server through FTP, SFTP, SCP and RSync), it provides Syncers (RSync, S3) for efficient backups,
+ it can archive files and directories, it can cycle backups, it can do incremental backups, it
+ can compress backups, it can encrypt backups (OpenSSL or GPG), it can notify you about
+ successful and/or failed backups (Email or Twitter). It is very extensible and easy to add new
+ functionality to. It\'s easy to use.'
##
# Files and folder that need to be compiled in to the Ruby Gem
74 bin/backup
@@ -31,20 +31,18 @@ class BackupCLI < Thor
# Performs the backup process. The only required option is the --trigger [-t].
# If the other options (--config_file, --data_path, --tmp_path) aren't specified
# it'll fallback to the (good) defaults
- method_option :trigger, :type => :string, :aliases => '-t', :required => true
+ method_option :trigger, :type => :string, :aliases => ['-t', '--triggers'], :required => true
method_option :config_file, :type => :string, :aliases => '-c'
method_option :data_path, :type => :string, :aliases => '-d'
method_option :log_path, :type => :string, :aliases => '-l'
method_option :tmp_path, :type => :string
- desc 'perform', 'Performs the backup for the specified trigger'
+ desc 'perform', "Performs the backup for the specified trigger.\n" +
+ "You may perform multiple backups by providing multiple triggers, separated by commas.\n\n" +
+ "Example:\n\s\s$ backup perform --triggers backup1,backup2,backup3,backup4\n\n" +
+ "This will invoke 4 backups, and they will run in the order specified (not asynchronous)."
def perform
##
- # Defines the TRIGGER and TIME constants
- Backup.send(:const_set, :TRIGGER, options[:trigger])
- Backup.send(:const_set, :TIME, Time.now.strftime("%Y.%m.%d.%H.%M.%S"))
-
- ##
# Overwrites the CONFIG_FILE location, if --config-file was specified
if options[:config_file]
Backup.send(:remove_const, :CONFIG_FILE)
@@ -73,24 +71,58 @@ class BackupCLI < Thor
end
##
- # Ensure the TMP_PATH, LOG_PATH, DATA_PATH and DATA_PATH/TRIGGER
- # are created if they do not yet exist
- Array.new([
- Backup::TMP_PATH,
- Backup::LOG_PATH,
- File.join(Backup::DATA_PATH, Backup::TRIGGER)
- ]).each do |path|
+ # Ensure the TMP_PATH and LOG_PATH are created if they do not yet exist
+ Array.new([Backup::TMP_PATH, Backup::LOG_PATH]).each do |path|
FileUtils.mkdir_p(path)
end
##
- # Parses the backup configuration file and returns the model instance by trigger
- model = Backup::Finder.new(Backup::TRIGGER, Backup::CONFIG_FILE).find
-
- ##
- # Runs the returned model
- Backup::Logger.message "Performing backup for #{model.label}!"
- model.perform!
+ # Process each trigger
+ options[:trigger].split(",").map(&:strip).each do |trigger|
+
+ ##
+ # Defines the TRIGGER constant
+ Backup.send(:const_set, :TRIGGER, trigger)
+
+ ##
+ # Define the TIME constants
+ Backup.send(:const_set, :TIME, Time.now.strftime("%Y.%m.%d.%H.%M.%S"))
+
+ ##
+ # Ensure DATA_PATH and DATA_PATH/TRIGGER are created if they do not yet exist
+ FileUtils.mkdir_p(File.join(Backup::DATA_PATH, Backup::TRIGGER))
+
+ ##
+ # Parses the backup configuration file and returns the model instance by trigger
+ model = Backup::Finder.new(trigger, Backup::CONFIG_FILE).find
+
+ ##
+ # Runs the returned model
+ Backup::Logger.message "Performing backup for #{model.label}!"
+ model.perform!
+
+ ##
+ # Removes the TRIGGER constant
+ Backup.send(:remove_const, :TRIGGER) if defined? Backup::TRIGGER
+
+ ##
+ # Removes the TIME constant
+ Backup.send(:remove_const, :TIME) if defined? Backup::TIME
+
+ ##
+ # Reset the Backup::Model.current to nil for the next potential run
+ Backup::Model.current = nil
+
+ ##
+ # Reset the Backup::Model.all to an empty array since this will be
+ # re-filled during the next Backup::Finder.new(arg1, arg2).find
+ Backup::Model.all = Array.new
+
+ ##
+ # Reset the Backup::Model.extension to 'tar' so it's at its
+ # initial state when the next Backup::Model initializes
+ Backup::Model.extension = 'tar'
+ end
end
##
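The loop above splits the comma-separated `--triggers` value and defines, then removes, the `TRIGGER` constant once per backup run. A minimal standalone sketch of both mechanics (`Scratch` is a hypothetical stand-in for the `Backup` module):

```ruby
# Hypothetical sketch of the per-trigger constant lifecycle used above.
module Scratch; end

# Whitespace around commas is stripped, matching split(",").map(&:strip)
triggers = "backup1, backup2 ,backup3".split(",").map(&:strip)

triggers.each do |trigger|
  # const_set defines Scratch::TRIGGER for this run...
  Scratch.const_set(:TRIGGER, trigger)
  puts Scratch::TRIGGER

  # ...and remove_const (a private method, hence send) clears it so the
  # next trigger can define it afresh without a warning
  Scratch.send(:remove_const, :TRIGGER) if Scratch.const_defined?(:TRIGGER)
end
```

Resetting the constant between iterations is what makes the sequential multi-trigger run possible, since `const_set` on an already-defined constant would emit a redefinition warning.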
5 lib/backup.rb
@@ -20,7 +20,7 @@ module Backup
STORAGES = ['S3', 'CloudFiles', 'Dropbox', 'FTP', 'SFTP', 'SCP', 'RSync']
COMPRESSORS = ['Gzip']
ENCRYPTORS = ['OpenSSL', 'GPG']
- SYNCERS = ['RSync']
+ SYNCERS = ['RSync', 'S3']
NOTIFIERS = ['Mail', 'Twitter']
##
@@ -87,6 +87,7 @@ module Storage
module Syncer
autoload :RSync, File.join(CONFIGURATION_PATH, 'syncer', 'rsync')
+ autoload :S3, File.join(CONFIGURATION_PATH, 'syncer', 's3')
end
module Database
@@ -115,7 +116,9 @@ module Storage
##
# Autoload Backup syncer files
module Syncer
+ autoload :Base, File.join(SYNCER_PATH, 'base')
autoload :RSync, File.join(SYNCER_PATH, 'rsync')
+ autoload :S3, File.join(SYNCER_PATH, 's3')
end
##
4 lib/backup/configuration/storage/dropbox.rb
@@ -18,6 +18,10 @@ class << self
# Path to where the backups will be stored
attr_accessor :path
+ ##
+ # Dropbox connection timeout
+ attr_accessor :timeout
+
end
end
end
29 lib/backup/configuration/syncer/s3.rb
@@ -0,0 +1,29 @@
+# encoding: utf-8
+
+module Backup
+ module Configuration
+ module Syncer
+ class S3 < Base
+ class << self
+
+ ##
+ # Amazon Simple Storage Service (S3) Credentials
+ attr_accessor :access_key_id, :secret_access_key
+
+ ##
+ # Amazon S3 bucket name and path to sync to
+ attr_accessor :bucket, :path
+
+ ##
+ # Directories to sync
+ attr_accessor :directories
+
+ ##
+ # Flag to enable mirroring
+ attr_accessor :mirror
+
+ end
+ end
+ end
+ end
+end
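The `class << self` block in the new configuration file defines the accessors on the class's singleton, so defaults are stored on the class itself rather than on instances. A minimal sketch of the pattern (`Config` is a hypothetical stand-in for `Backup::Configuration::Syncer::S3`):

```ruby
# Hypothetical stand-in showing class-level defaults: attr_accessor inside
# `class << self` creates reader/writer methods on the class object.
class Config
  class << self
    attr_accessor :bucket, :mirror
  end
end

Config.bucket = 'my-bucket'
Config.mirror = true
puts Config.bucket # => "my-bucket"
```

This is what lets `Backup::Configuration::Syncer::S3.defaults` blocks set values that every later `Backup::Syncer::S3` instance can read via `load_defaults!`.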
7 lib/backup/storage/dropbox.rb
@@ -21,6 +21,10 @@ class Dropbox < Base
attr_accessor :path
##
+ # Dropbox connection timeout
+ attr_accessor :timeout
+
+ ##
# Creates a new instance of the Dropbox storage object
# First it sets the defaults (if any exist) and then evaluates
# the configuration block which may overwrite these defaults
@@ -31,6 +35,7 @@ def initialize(&block)
instance_eval(&block) if block_given?
+ @timeout ||= 300
@time = TIME
end
@@ -68,7 +73,7 @@ def connection
# Transfers the archived file to the specified Dropbox folder
def transfer!
Logger.message("#{ self.class } started transferring \"#{ remote_file }\".")
- connection.upload(File.join(local_path, local_file), remote_path, :timeout => 300)
+ connection.upload(File.join(local_path, local_file), remote_path, :timeout => timeout)
end
##
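The `@timeout ||= 300` line added to `initialize` is the default-after-block idiom: the user's configuration block runs first, then `||=` fills in 300 only if `timeout` was left unset. A sketch with a hypothetical `Settings` class:

```ruby
# Hypothetical sketch of the default-with-override pattern used for the
# new Dropbox timeout: instance_eval evaluates the user's block, then
# ||= supplies 300 only when no value was assigned.
class Settings
  attr_accessor :timeout

  def initialize(&block)
    instance_eval(&block) if block_given?
    @timeout ||= 300
  end
end

puts Settings.new.timeout                          # => 300
puts Settings.new { |s| s.timeout = 500 }.timeout  # => 500
```

The same value then flows into `connection.upload(..., :timeout => timeout)`, replacing the previously hard-coded 300.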
10 lib/backup/syncer/base.rb
@@ -0,0 +1,10 @@
+# encoding: utf-8
+
+module Backup
+ module Syncer
+ class Base
+ include Backup::CLI
+ include Backup::Configuration::Helpers
+ end
+ end
+end
14 lib/backup/syncer/rsync.rb
@@ -2,9 +2,7 @@
module Backup
module Syncer
- class RSync
- include Backup::CLI
- include Backup::Configuration::Helpers
+ class RSync < Base
##
# Server credentials
@@ -73,25 +71,25 @@ def options
end
##
- # Returns Rsync syntax for specifying the --delete option (to enable mirroring)
+ # Returns Rsync syntax for enabling mirroring
def mirror
'--delete' if @mirror
end
##
- # Returns Rsync syntax for specifying the the 'compress' option
+ # Returns Rsync syntax for compressing the file transfers
def compress
- '-z' if @compress
+ '--compress' if @compress
end
##
- # Returns Rsync syntax for specifying the --archive option
+ # Returns Rsync syntax for invoking "archive" mode
def archive
'--archive'
end
##
- # Returns Rsync syntax for specifying a port to connect to
+ # Returns Rsync syntax for defining a port to connect to
def port
"--port='#{@port}'"
end
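Each flag method above returns either an option string or nil, so `#options` can drop disabled flags with `compact` before joining. A standalone sketch (the `*_flag` helpers are hypothetical names, not from the gem):

```ruby
# Hypothetical helpers mirroring the nil-or-flag accessor style in the
# diff; compact removes the nils before the flags are joined for the CLI.
def mirror_flag(enabled)
  '--delete' if enabled
end

def compress_flag(enabled)
  '--compress' if enabled
end

opts = ['--archive', mirror_flag(true), compress_flag(false), "--port='22'"].compact.join(' ')
puts opts # => "--archive --delete --port='22'"
```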
111 lib/backup/syncer/s3.rb
@@ -0,0 +1,111 @@
+# encoding: utf-8
+
+module Backup
+ module Syncer
+ class S3 < Base
+
+ ##
+ # Amazon Simple Storage Service (S3) Credentials
+ attr_accessor :access_key_id, :secret_access_key
+
+ ##
+ # Amazon S3 bucket name and path to sync to
+ attr_accessor :bucket, :path
+
+ ##
+ # Directories to sync
+ attr_accessor :directories
+
+ ##
+ # Flag to enable mirroring
+ attr_accessor :mirror
+
+ ##
+ # Instantiates a new S3 Syncer object and sets the default configuration
+ # specified in the Backup::Configuration::Syncer::S3. Then it sets the object
+ # defaults if particular properties weren't set. Finally it'll evaluate the users
+ # configuration file and overwrite anything that's been defined
+ def initialize(&block)
+ load_defaults!
+
+ @path ||= 'backups'
+ @directories ||= Array.new
+ @mirror ||= false
+
+ instance_eval(&block) if block_given?
+
+ @path = path.sub(/^\//, '')
+ end
+
+ ##
+ # Performs the S3Sync operation
+ # First it'll set the Amazon S3 credentials for S3Sync before invoking it,
+ # and once it's finished syncing the files and directories to Amazon S3, it'll
+ # unset these credentials (back to nil values)
+ def perform!
+ set_s3sync_credentials!
+
+ directories.each do |directory|
+ Logger.message("#{ self.class } started syncing '#{ directory }'.")
+ Logger.silent( run("#{ utility(:s3sync) } #{ options } '#{ directory }' '#{ bucket }:#{ path }'") )
+ end
+
+ unset_s3sync_credentials!
+ end
+
+ ##
+ # Returns all the specified S3Sync options, concatenated, ready for the CLI
+ def options
+ [verbose, recursive, mirror].compact.join("\s")
+ end
+
+ ##
+ # Returns S3Sync syntax for enabling mirroring
+ def mirror
+ '--delete' if @mirror
+ end
+
+ ##
+ # Returns S3Sync syntax for syncing recursively
+ def recursive
+ '--recursive'
+ end
+
+ ##
+ # Returns S3Sync syntax for making output verbose
+ def verbose
+ '--verbose'
+ end
+
+ ##
+ # Syntactical sugar for the DSL for adding directories
+ def directories(&block)
+ return @directories unless block_given?
+ instance_eval(&block)
+ end
+
+ ##
+ # Adds a path to the @directories array
+ def add(path)
+ @directories << path
+ end
+
+ ##
+ # In order for S3Sync to know what credentials to use, we have to set the
+ # AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, these
+ # environment variables will be used by S3Sync
+ def set_s3sync_credentials!
+ ENV['AWS_ACCESS_KEY_ID'] = access_key_id
+ ENV['AWS_SECRET_ACCESS_KEY'] = secret_access_key
+ end
+
+ ##
+ # Sets the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY back to nil
+ def unset_s3sync_credentials!
+ ENV['AWS_ACCESS_KEY_ID'] = nil
+ ENV['AWS_SECRET_ACCESS_KEY'] = nil
+ end
+
+ end
+ end
+end
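The credential helpers work by mutating `ENV`. In Ruby, assigning nil to an `ENV` key removes the variable entirely, so `unset_s3sync_credentials!` leaves a clean environment after the `s3sync` run. A sketch with a hypothetical variable name:

```ruby
# Hypothetical variable name; demonstrates the ENV set/unset cycle the
# S3 syncer performs around the s3sync invocation.
ENV['DEMO_AWS_KEY'] = 'my_access_key_id'
puts ENV.key?('DEMO_AWS_KEY')  # => true

ENV['DEMO_AWS_KEY'] = nil      # assigning nil deletes the variable
puts ENV.key?('DEMO_AWS_KEY')  # => false
```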
2  lib/backup/version.rb
@@ -13,7 +13,7 @@ class Version
# Defines the minor version
# PATCH:
# Defines the patch version
- MAJOR, MINOR, PATCH = 3, 0, 6
+ MAJOR, MINOR, PATCH = 3, 0, 7
##
# Returns the major version ( big release based off of multiple minor releases )
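The parallel assignment above is the usual pattern for a gem version constant; the components are later joined into the familiar dotted string:

```ruby
# Sketch of the version-constant pattern from lib/backup/version.rb.
MAJOR, MINOR, PATCH = 3, 0, 7
puts [MAJOR, MINOR, PATCH].join('.') # => "3.0.7"
```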
1  lib/templates/storage/dropbox
@@ -3,6 +3,7 @@
db.password = 'my_password'
db.api_key = 'my_api_key'
db.api_secret = 'my_api_secret'
+ db.timeout = 300
db.path = '/path/to/my/backups'
db.keep = 25
end
12 lib/templates/syncer/s3
@@ -0,0 +1,12 @@
+ sync_with S3 do |s3|
+ s3.access_key_id = "my_access_key_id"
+ s3.secret_access_key = "my_secret_access_key"
+ s3.bucket = "my-bucket"
+ s3.path = "/backups"
+ s3.mirror = true
+
+ s3.directories do |directory|
+ directory.add "/path/to/directory/to/sync"
+ directory.add "/path/to/other/directory/to/sync"
+ end
+ end
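The template's `directories` block relies on the instance_eval-based DSL defined in the syncer: the block is evaluated in the syncer's own context, and each `add` pushes onto `@directories`. A standalone sketch with a hypothetical `DirList` class:

```ruby
# Hypothetical DirList reproducing the directories DSL from the S3 syncer:
# called without a block it returns the array; called with one it
# instance_evals the block, so self (and the yielded block argument) is
# this instance and `add` resolves here.
class DirList
  def initialize
    @directories = []
  end

  def directories(&block)
    return @directories unless block_given?
    instance_eval(&block)
  end

  def add(path)
    @directories << path
  end
end

list = DirList.new
list.directories do |directory|
  directory.add "/path/to/directory/to/sync"
  directory.add "/path/to/other/directory/to/sync"
end
puts list.directories.inspect
```

Because `instance_eval` yields the receiver to the block, both the `directory.add "..."` form used in the template and a bare `add "..."` work inside the block.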
3  spec/configuration/storage/dropbox_spec.rb
@@ -11,6 +11,7 @@
db.api_secret = 'my_secret'
db.path = 'my_backups'
db.keep = 20
+ db.timeout = 500
end
end
@@ -22,6 +23,7 @@
db.api_secret.should == 'my_secret'
db.path.should == 'my_backups'
db.keep.should == 20
+ db.timeout.should == 500
end
describe '#clear_defaults!' do
@@ -35,6 +37,7 @@
db.api_secret.should == nil
db.path.should == nil
db.keep.should == nil
+ db.timeout.should == nil
end
end
end
18 spec/storage/dropbox_spec.rb
@@ -11,6 +11,7 @@
db.api_key = 'my_api_key'
db.api_secret = 'my_secret'
db.keep = 20
+ db.timeout = 500
end
end
@@ -30,6 +31,21 @@
db.api_secret.should == 'my_secret'
db.path.should == 'backups'
db.keep.should == 20
+ db.timeout.should == 500
+ end
+
+ it 'should overwrite the default timeout' do
+ db = Backup::Storage::Dropbox.new do |db|
+ db.timeout = 500
+ end
+
+ db.timeout.should == 500
+ end
+
+ it 'should provide a default timeout' do
+ db = Backup::Storage::Dropbox.new
+
+ db.timeout.should == 300
end
it 'should overwrite the default path' do
@@ -69,7 +85,7 @@
connection.expects(:upload).with(
File.join(Backup::TMP_PATH, "#{ Backup::TIME }.#{ Backup::TRIGGER }.tar"),
File.join('backups', Backup::TRIGGER),
- :timeout => 300
+ :timeout => db.timeout
)
db.send(:transfer!)
10 spec/syncer/rsync_spec.rb
@@ -33,7 +33,7 @@
rsync.port.should == "--port='22'"
rsync.path.should == 'backups/'
rsync.mirror.should == "--delete"
- rsync.compress.should == "-z"
+ rsync.compress.should == "--compress"
end
it 'should use the defaults if a particular attribute has not been defined' do
@@ -93,7 +93,7 @@
context 'when true' do
it do
rsync.compress = true
- rsync.compress.should == '-z'
+ rsync.compress.should == '--compress'
end
end
@@ -140,15 +140,15 @@
describe '#options' do
it do
- rsync.options.should == "--archive --delete -z --port='22'"
+ rsync.options.should == "--archive --delete --compress --port='22'"
end
end
describe '#perform' do
- it 'should invoke transfer!' do
+ it 'should invoke the rsync command to transfer the files and directories' do
Backup::Logger.expects(:message).with("Backup::Syncer::RSync started syncing '/some/random/directory' '/another/random/directory'.")
rsync.expects(:utility).with(:rsync).returns(:rsync)
- rsync.expects(:run).with("rsync -vhP --archive --delete -z --port='22' '/some/random/directory' '/another/random/directory' 'my_username@123.45.678.90:backups/'")
+ rsync.expects(:run).with("rsync -vhP --archive --delete --compress --port='22' '/some/random/directory' '/another/random/directory' 'my_username@123.45.678.90:backups/'")
rsync.perform!
end
end
131 spec/syncer/s3_spec.rb
@@ -0,0 +1,131 @@
+# encoding: utf-8
+
+require File.dirname(__FILE__) + '/../spec_helper'
+
+describe Backup::Syncer::S3 do
+
+ let(:s3) do
+ Backup::Syncer::S3.new do |s3|
+ s3.access_key_id = 'my_access_key_id'
+ s3.secret_access_key = 'my_secret_access_key'
+ s3.bucket = 'my-bucket'
+ s3.path = "/backups"
+ s3.mirror = true
+
+ s3.directories do |directory|
+ directory.add "/some/random/directory"
+ directory.add "/another/random/directory"
+ end
+ end
+ end
+
+ before do
+ Backup::Configuration::Syncer::S3.clear_defaults!
+ end
+
+ it 'should have defined the configuration properly' do
+ s3.access_key_id.should == 'my_access_key_id'
+ s3.secret_access_key.should == 'my_secret_access_key'
+ s3.bucket.should == 'my-bucket'
+ s3.path.should == 'backups'
+ s3.mirror.should == '--delete'
+ s3.directories.should == ["/some/random/directory", "/another/random/directory"]
+ end
+
+ it 'should use the defaults if a particular attribute has not been defined' do
+ Backup::Configuration::Syncer::S3.defaults do |s3|
+ s3.access_key_id = 'my_access_key_id'
+ s3.bucket = 'my-bucket'
+ s3.path = "/backups"
+ s3.mirror = true
+ end
+
+ s3 = Backup::Syncer::S3.new do |s3|
+ s3.secret_access_key = 'some_secret_access_key'
+ s3.mirror = false
+ end
+
+ s3.access_key_id.should == 'my_access_key_id'
+ s3.secret_access_key.should == 'some_secret_access_key'
+ s3.bucket.should == 'my-bucket'
+ s3.path.should == 'backups'
+ s3.mirror.should == nil
+ end
+
+ it 'should have its own defaults' do
+ s3 = Backup::Syncer::S3.new
+ s3.path.should == 'backups'
+ s3.directories.should == Array.new
+ s3.mirror.should == nil
+ end
+
+ describe '#mirror' do
+ context 'when true' do
+ it do
+ s3.mirror = true
+ s3.mirror.should == '--delete'
+ end
+ end
+
+ context 'when nil/false' do
+ it do
+ s3.mirror = nil
+ s3.mirror.should == nil
+ end
+
+ it do
+ s3.mirror = false
+ s3.mirror.should == nil
+ end
+ end
+ end
+
+ describe '#recursive' do
+ it do
+ s3.recursive.should == '--recursive'
+ end
+ end
+
+ describe '#verbose' do
+ it do
+ s3.verbose.should == '--verbose'
+ end
+ end
+
+ describe '#directories' do
+ context 'when its empty' do
+ it do
+ s3.directories = []
+ s3.directories.should == []
+ end
+ end
+
+ context 'when it has items' do
+ it do
+ s3.directories = ['directory1', 'directory1/directory2', 'directory1/directory2/directory3']
+ s3.directories.should == ['directory1', 'directory1/directory2', 'directory1/directory2/directory3']
+ end
+ end
+ end
+
+ describe '#options' do
+ it do
+ s3.options.should == "--verbose --recursive --delete"
+ end
+ end
+
+ describe '#perform' do
+ it 'should sync two directories' do
+ s3.expects(:utility).with(:s3sync).returns(:s3sync).twice
+
+ Backup::Logger.expects(:message).with("Backup::Syncer::S3 started syncing '/some/random/directory'.")
+ s3.expects(:run).with("s3sync --verbose --recursive --delete '/some/random/directory' 'my-bucket:backups'")
+
+ Backup::Logger.expects(:message).with("Backup::Syncer::S3 started syncing '/another/random/directory'.")
+ s3.expects(:run).with("s3sync --verbose --recursive --delete '/another/random/directory' 'my-bucket:backups'")
+
+ s3.perform!
+ end
+ end
+
+end