
releasing new features

commit 4846a9e89545a223431e02852302a0985320eb4e 1 parent d019b4f
@fguillen authored
179 README.md
@@ -1,86 +1,121 @@
-## Run Backup
- ruby sweety_backy_execute.rb configuration_file.yml
-
-## Configuration file
-### includes
-Array of folders to make a backup.
+# Sweety Backy
+
+Simple mechanism to **configure and execute backups** of folders and MySQL DBs and store them in a **local folder** or an **S3 bucket**.
+
+## State
+
+This is a **beta** release. It is running on my own servers without problems, but you use it **at your own risk**.
+
+## Other possibilities
+
+Please take a look at other **Ruby backup gems**:
-Example:
- includes: ['/var/www', '/var/log']
+* http://ruby-toolbox.com/categories/backups.html
-### excludes
-Array of patterns to exclude from backup.
+## How to install
-Example:
- excludes: ['/var/www/tmp', '.cvs']
+ gem install 'sweety_backy'
-### path
-Root Path for the backups
+## How to use it
+
+ sweety_backy <config_file>
+
+### Config file
+
+It is a _YAML_ file with the following attributes:
-Example:
- path: '/backups'
+    paths: <array of folder paths>
+    databases: <array of database names>
+    yearly: <quantity of yearly backups>
+    monthly: <quantity of monthly backups>
+    weekly: <quantity of weekly backups>
+    daily: <quantity of daily backups>
+    database_user: <database user with read privileges on all databases>
+    database_pass: <database user password>
+    storage_system: { 's3' | 'local' }
+    local_opts: (only if storage_system is 'local')
+      path: <absolute path to the root folder of the backups>
+    s3_opts: (only if storage_system is 's3')
+      bucket: <bucket name>
+      path: <bucket path where the backups will be stored>
+      passwd_file: <path to the S3 credentials file>
-### yearly, monhtly, weekly, daily
-Number of yearly, monhtly, weekly, daily backups to keep, starting for the most recent.
+### S3 credentials file
-Example:
- yearly: 2
- monthly: 12
- weekly: 4
- daily: 7
+It is a _YAML_ file with two keys holding the S3 credentials:
+
+ access_key_id: "XXX"
+ secret_access_key: "YYY"
+
+### Example
+#### S3 config example
-## Destination Folders
+ # ~/.s3.passwd
+ access_key_id: "XXX"
+ secret_access_key: "YYY"
-As example for date 2010-06-29 with configuration:
- include: '/tmp/hhh'
- exclude: '/tmp/hhh/xxx'
- path: '/tmp/backups'
- years: 2
- months: 6
- weeks: 4
- days: 7
+#### SweetyBacky config example
+
+ # ~/.sweety_backy.conf
+ paths: [ "/Users/fguillen/Develop/Brico", "/Users/fguillen/Develop/Arduino" ]
+ databases: [ "test", "mysql" ]
+ yearly: 1
+ monthly: 2
+ weekly: 3
+ daily: 4
+ database_user: 'root'
+ database_pass: ''
+ storage_system: 's3'
+ s3_opts:
+ bucket: 'sweety_backy'
+ path: 'fguillen'
+ passwd_file: '~/.s3.passwd'
+
+#### Execute
+
+ sweety_backy ~/.sweety_backy.conf
-The files folder will look like this:
-
- /tmp/backups/files/20081231.yearly.tar.gz
- /tmp/backups/files/20091231.yearly.tar.gz
- /tmp/backups/files/20100131.monthly.tar.gz
- /tmp/backups/files/20100228.monthly.tar.gz
- /tmp/backups/files/20100331.monthly.tar.gz
- /tmp/backups/files/20100430.monthly.tar.gz
- /tmp/backups/files/20100531.monthly.tar.gz
- /tmp/backups/files/20100606.weekly.tar.gz
- /tmp/backups/files/20100613.weekly.tar.gz
- /tmp/backups/files/20100620.weekly.tar.gz
- /tmp/backups/files/20100627.weekly.tar.gz
- /tmp/backups/files/20100623.daily.tar.gz
- /tmp/backups/files/20100624.daily.tar.gz
- /tmp/backups/files/20100625.daily.tar.gz
- /tmp/backups/files/20100626.daily.tar.gz
- /tmp/backups/files/20100627.weekly.tar.gz
- /tmp/backups/files/20100628.daily.tar.gz
- /tmp/backups/files/20100629.daily.tar.gz
+#### Result
+
+This will generate a bunch of backups in the _sweety_backy_ bucket, like these:
+
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110626.weekly.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110703.weekly.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110704.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110705.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110706.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110707.daily.tar.gz
+
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110626.weekly.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110703.weekly.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110704.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110705.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110706.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110707.daily.tar.gz
+
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110626.weekly.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110703.weekly.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110704.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110705.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110706.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110707.daily.sql.tar.gz
+
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110626.weekly.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110703.weekly.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110704.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110705.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110706.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110707.daily.sql.tar.gz
-The databases folder will look like this:
+... and so on.
+
+### Cron execution example
- /tmp/backups/databases/20081231.yearly.sql.tar.gz
- /tmp/backups/databases/20091231.yearly.sql.tar.gz
- /tmp/backups/databases/20100131.monthly.sql.tar.gz
- /tmp/backups/databases/20100228.monthly.sql.tar.gz
- /tmp/backups/databases/20100331.monthly.sql.tar.gz
- /tmp/backups/databases/20100430.monthly.sql.tar.gz
- /tmp/backups/databases/20100531.monthly.sql.tar.gz
- /tmp/backups/databases/20100606.weekly.sql.tar.gz
- /tmp/backups/databases/20100613.weekly.sql.tar.gz
- /tmp/backups/databases/20100620.weekly.sql.tar.gz
- /tmp/backups/databases/20100627.weekly.sql.tar.gz
- /tmp/backups/databases/20100623.daily.sql.tar.gz
- /tmp/backups/databases/20100624.daily.sql.tar.gz
- /tmp/backups/databases/20100625.daily.sql.tar.gz
- /tmp/backups/databases/20100626.daily.sql.tar.gz
- /tmp/backups/databases/20100627.weekly.sql.tar.gz
- /tmp/backups/databases/20100628.daily.sql.tar.gz
- /tmp/backups/databases/20100629.daily.sql.tar.gz
-
+ # every day at 02:00 am
+ 00 02 * * * sweety_backy /home/fguillen/.sweety_backy.conf >> /var/log/sweety_backy.log 2>&1
+
+## License
+
+MIT License. (c) 2011 Fernando Guillen (http://fernandoguillen.info).
16 Rakefile
@@ -12,4 +12,20 @@ Rake::TestTask.new do |t|
t.libs << '.'
t.test_files = FileList['test/*_test.rb']
t.verbose = true
+end
+
+namespace :test do
+
+ desc "run s3 test in real"
+ task :s3 do
+ test_task =
+ Rake::TestTask.new("s3_tests") do |t|
+ t.libs << '.'
+ t.test_files = FileList['test/s3/*_test.rb']
+ t.verbose = true
+ end
+
+ task("s3_tests").execute
+ end
+
end
31 VERSION-2.0.md
@@ -1,31 +0,0 @@
-# Sweety Backy
-
-Simple mechanism to **configure and execute backups** of folders and MySQL DBs and store them in **S3**.
-
-## Version 2.0
-
-The version 2.0 will be a completely rewrite of the Version 1.0 implementation, the most important new features will be:
-
-* More modular implementation
-* Distribute it as a Gem
-* More intuitive configuration, maybe a Ruby DSL
-* Storage in S3
-* Recovery system
-
-## Challenges
-
-* Define and implement a DSL
-* Write a modular architecture
-* Make it usable for others than me
-
-## Already existing similar tools
-
-There are already existing and popular tools similar to one I'm trying to build:
-
-* http://ruby-toolbox.com/categories/backups.html
-
-Hope this is not a inconvenient to this project to be accepted since I usually find these tools too much complex or maybe their style doesn't look intuitive to me.
-
-Well, I have to say that I haven't researched with these tools y
-
-
28 bin/sweety_backy
@@ -0,0 +1,28 @@
+#!/usr/bin/env ruby
+
+# Use:
+# sweety_backy /path/to/sweety_backy.conf
+
+begin
+ require 'sweety_backy'
+rescue LoadError
+ require 'rubygems'
+ require 'sweety_backy'
+end
+
+require 'benchmark'
+
+if( ARGV[0].nil? )
+ SweetyBacky::Utils.log "use: $ sweety_backy <config_file_path>"
+ exit 1
+end
+
+lapsus_time =
+ Benchmark.realtime do
+ SweetyBacky::Utils.log "--------------------"
+ SweetyBacky::Utils.log "Starting SweetyBacky"
+ sb = SweetyBacky::Runner.new( ARGV[0] )
+ sb.run
+ end
+
+SweetyBacky::Utils.log "SweetyBacky on #{lapsus_time} seconds"
18 bin/sweety_backy.rb
@@ -1,18 +0,0 @@
-require 'benchmark'
-# require 'sweety_backy'
-require File.dirname(__FILE__) + "/../lib/sweety_backy"
-
-if( ARGV[0].nil? )
- SweetyBacky::Utils.log "use: $ ruby sweety_backy.rb <config_file_path>"
- exit 1
-end
-
-lapsus_time =
- Benchmark.realtime do
- SweetyBacky::Utils.log "--------------------"
- SweetyBacky::Utils.log "Starting SweetyBacky"
- sb = SweetyBacky::Runner.new( ARGV[0] )
- sb.run
- end
-
-SweetyBacky::Utils.log "SweetyBacky on #{lapsus_time} seconds"
5 lib/sweety_backy.rb
@@ -1,4 +1,9 @@
+require 'rubygems'
+require "s3"
+
require "#{File.dirname(__FILE__)}/sweety_backy/version"
require "#{File.dirname(__FILE__)}/sweety_backy/runner"
require "#{File.dirname(__FILE__)}/sweety_backy/utils"
require "#{File.dirname(__FILE__)}/sweety_backy/commander"
+require "#{File.dirname(__FILE__)}/sweety_backy/s3"
+require "#{File.dirname(__FILE__)}/sweety_backy/opts_reader"
36 lib/sweety_backy/commander.rb
@@ -27,32 +27,56 @@ def self.clear( opts )
end
def self.clear_files( opts )
- SweetyBacky::Utils.log "cleaning files"
+ SweetyBacky::Utils.log "cleaning files on #{opts[:working_path]}/files/"
opts[:paths].each do |path|
SweetyBacky::Utils.log "cleaning file #{path}"
[:yearly, :monthly, :weekly, :daily].each do |period|
- Dir.glob( "#{opts[:backup_path]}/files/#{SweetyBacky::Utils.namerize( path )}.*.#{period.to_s}.*" ).sort[0..(-1*(opts[period]+1))].each do |file_path|
- File.delete( file_path )
+ paths_in(
+ "#{opts[:working_path]}/files/#{SweetyBacky::Utils.namerize( path )}.*.#{period.to_s}.*",
+ opts
+ ).sort[0..(-1*(opts[period]+1))].each do |file_path|
+ SweetyBacky::Utils.log "removing: #{file_path}"
+ remove_path( file_path, opts )
end
end
end
end
def self.clear_databases( opts )
- SweetyBacky::Utils.log "cleaning databases"
+ SweetyBacky::Utils.log "cleaning databases on #{opts[:working_path]}/databases/"
opts[:databases].each do |database_name|
SweetyBacky::Utils.log "cleaning database #{database_name}"
[:yearly, :monthly, :weekly, :daily].each do |period|
- Dir.glob( "#{opts[:backup_path]}/databases/#{database_name}.*.#{period.to_s}.*" ).sort[0..(-1*(opts[period]+1))].each do |file_path|
- File.delete( file_path )
+ paths_in(
+ "#{opts[:working_path]}/databases/#{database_name}.*.#{period.to_s}.*",
+ opts
+ ).sort[0..(-1*(opts[period]+1))].each do |file_path|
+ SweetyBacky::Utils.log "removing: #{file_path}"
+ remove_path( file_path, opts )
end
end
end
end
+ def self.paths_in( path, opts )
+ if( opts[:storage_system].to_sym == :s3 )
+ return SweetyBacky::S3.paths_in( path, opts[:s3_opts] )
+ else
+ return Dir.glob( path )
+ end
+ end
+
+ def self.remove_path( path, opts )
+ if( opts[:storage_system].to_sym == :s3 )
+ SweetyBacky::S3.delete( path, opts[:s3_opts] )
+ else
+ File.delete( path )
+ end
+ end
+
end
end
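The retention slice `sort[0..(-1*(opts[period]+1))]` used in `clear_files` and `clear_databases` keeps only the newest N backups per period and collects the rest for deletion. A standalone sketch of that slice, with hypothetical file names:

```ruby
# Date-stamped backup names sort oldest-first, so sorting and then
# dropping the last `keep` entries leaves the ones to delete.
backups = [
  "20100623.daily.tar.gz",
  "20100624.daily.tar.gz",
  "20100625.daily.tar.gz",
  "20100626.daily.tar.gz",
  "20100627.daily.tar.gz"
]
keep = 4

# [0..(-1 * (keep + 1))] selects everything except the last `keep` items
to_delete = backups.sort[0..(-1 * (keep + 1))]
# to_delete == ["20100623.daily.tar.gz"]
```

Note that when there are `keep` or fewer backups the slice is empty, so nothing is deleted.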
47 lib/sweety_backy/opts_reader.rb
@@ -0,0 +1,47 @@
+module SweetyBacky
+ module OptsReader
+ def self.read_opts( conf_path )
+ SweetyBacky::Utils::log "conf_path: #{conf_path}"
+
+ opts = YAML.load( File.read( conf_path ) )
+ new_opts = {}
+
+ # symbolize keys
+ opts.keys.each do |key|
+
+ if( opts[key].is_a? Hash )
+ new_opts[key.to_sym] = {}
+ opts[key].keys.each do |key2|
+ new_opts[key.to_sym][key2.to_sym] = opts[key][key2]
+ end
+ else
+ new_opts[key.to_sym] = opts[key]
+ end
+
+ end
+
+ log_configuration( new_opts )
+
+ # TODO: test all options are ok
+
+ return new_opts
+ end
+
+ def self.log_configuration( opts )
+ SweetyBacky::Utils::log "configuration:"
+ SweetyBacky::Utils::log "------------"
+ opts.each_pair do |key, value|
+ if( value.is_a? Array )
+ SweetyBacky::Utils::log "#{key}: #{value.join(' | ')}"
+ elsif( value.is_a? Hash )
+ value.each_pair do |key2, value2|
+ SweetyBacky::Utils::log "#{key} => #{key2}: #{value2}"
+ end
+ else
+ SweetyBacky::Utils::log "#{key}: #{value}"
+ end
+ end
+ end
+
+ end
+end
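`read_opts` symbolizes the YAML keys one level deep so the rest of the code can use symbol access (`opts[:s3_opts][:bucket]`). A minimal standalone version of that transformation, with a hypothetical config:

```ruby
require 'yaml'

yaml = <<-CONF
storage_system: 's3'
s3_opts:
  bucket: 'sweety_backy'
CONF

opts = YAML.load(yaml)
new_opts = {}

# symbolize top-level keys, and one nested level for Hash values
opts.each do |key, value|
  if value.is_a?(Hash)
    new_opts[key.to_sym] = Hash[value.map { |k, v| [k.to_sym, v] }]
  else
    new_opts[key.to_sym] = value
  end
end

# new_opts == { :storage_system => "s3", :s3_opts => { :bucket => "sweety_backy" } }
```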
36 lib/sweety_backy/runner.rb
@@ -10,7 +10,7 @@ class Runner
def initialize( path = nil )
if( !path.nil? )
- config( SweetyBacky::Utils.read_opts( path ) )
+ config( SweetyBacky::OptsReader.read_opts( path ) )
end
end
@@ -22,9 +22,14 @@ def config( opts )
:monthly => 1,
:weekly => 2,
:daily => 4,
- :tar_path => '/usr/bin/tar',
- :mysqldump_path => '/usr/bin/mysqldump'
+ :storage_system => :local
}.merge( opts )
+
+ if( @opts[:storage_system].to_sym == :s3 )
+ @opts[:working_path] = File.join( Dir::tmpdir, "sweety_backy_#{Time.now.to_i}" )
+ else
+ @opts[:working_path] = @opts[:local_opts][:path]
+ end
end
def do_backup
@@ -34,15 +39,35 @@ def do_backup
def do_files_backup
@opts[:paths].each do |path|
- backup_path = "#{@opts[:backup_path]}/files/#{SweetyBacky::Utils.namerize( path )}.#{Date.today.strftime('%Y%m%d')}.#{SweetyBacky::Utils.period}.tar.gz"
+ backup_path = "#{@opts[:working_path]}/files/#{SweetyBacky::Utils.namerize( path )}.#{Date.today.strftime('%Y%m%d')}.#{SweetyBacky::Utils.period}.tar.gz"
SweetyBacky::Commander.do_files_backup( path, backup_path, @opts )
+
+ if( @opts[:storage_system].to_sym == :s3 )
+ SweetyBacky::S3.upload(
+ backup_path,
+ "#{@opts[:s3_opts][:path]}/files/#{File.basename( backup_path )}",
+ @opts[:s3_opts]
+ )
+
+ FileUtils.rm_rf backup_path
+ end
end
end
def do_databases_backup
@opts[:databases].each do |database_name|
- backup_path = "#{@opts[:backup_path]}/databases/#{database_name}.#{Date.today.strftime('%Y%m%d')}.#{SweetyBacky::Utils.period}.sql.tar.gz"
+ backup_path = "#{@opts[:working_path]}/databases/#{database_name}.#{Date.today.strftime('%Y%m%d')}.#{SweetyBacky::Utils.period}.sql.tar.gz"
SweetyBacky::Commander.do_database_backup( database_name, backup_path, @opts)
+
+ if( @opts[:storage_system].to_sym == :s3 )
+ SweetyBacky::S3.upload(
+ backup_path,
+ "#{@opts[:s3_opts][:path]}/databases/#{File.basename( backup_path )}",
+ @opts[:s3_opts]
+ )
+
+ FileUtils.rm_rf backup_path
+ end
end
end
@@ -52,6 +77,7 @@ def run
SweetyBacky::Commander.clear( @opts )
rescue => e
SweetyBacky::Utils.log "ERROR: #{e}"
+ SweetyBacky::Utils.log "BACKTRACE: #{e.backtrace.join("\n")}"
SweetyBacky::Utils.log "I should send and email at this moment"
end
end
56 lib/sweety_backy/s3.rb
@@ -0,0 +1,56 @@
+module SweetyBacky
+ class S3
+ def self.upload( path, backup_path, opts )
+ SweetyBacky::Utils.log( "S3 uploading: #{path} to #{opts[:bucket]}/#{backup_path}" )
+
+ s3 = ::S3::Service.new( read_s3_password( opts[:passwd_file] ) )
+ bucket = s3.bucket( opts[:bucket] )
+
+ if !bucket.exists?
+ bucket = s3.buckets.build( opts[:bucket] )
+ bucket.save
+ end
+
+ object = bucket.objects.build( backup_path )
+ object.content = File.open( path )
+ object.save
+ end
+
+ def self.object( path, opts )
+ s3 = ::S3::Service.new( read_s3_password( opts[:passwd_file] ) )
+ bucket = s3.buckets.find( opts[:bucket] )
+ object = bucket.objects.find( path )
+
+ return object
+ end
+
+ def self.paths_in( path, opts )
+ s3 = ::S3::Service.new( read_s3_password( opts[:passwd_file] ) )
+ bucket = s3.buckets.find( opts[:bucket] )
+
+ regex = Regexp.escape( path ).gsub('\*', '.*').gsub('\?', '.')
+
+ objects = bucket.objects.select { |e| e.key =~ /#{regex}/ }
+ paths = objects.map(&:key)
+
+ return paths
+ end
+
+ def self.read_s3_password( path )
+ opts = YAML.load( File.read( File.expand_path path ) )
+ new_opts = {}
+
+ # symbolize keys
+ opts.keys.each do |key|
+ new_opts[key.to_sym] = opts[key]
+ end
+
+ return new_opts
+ end
+
+ def self.delete( path, opts )
+ SweetyBacky::S3.object( path, opts ).destroy
+ end
+
+ end
+end
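`paths_in` emulates `Dir.glob` against S3 keys by escaping the pattern and then re-enabling the glob wildcards as regex ones. The conversion on its own, with hypothetical keys:

```ruby
# Escape regex metacharacters, then turn the escaped glob wildcards
# back into their regex equivalents: * -> .* and ? -> .
pattern = "test/path/files/name1.*.daily.*"
regex = Regexp.escape(pattern).gsub('\*', '.*').gsub('\?', '.')

keys = [
  "test/path/files/name1.20100720.daily.tar.gz",
  "test/path/files/name2.20100721.daily.tar.gz"
]

matches = keys.select { |key| key =~ /#{regex}/ }
# matches == ["test/path/files/name1.20100720.daily.tar.gz"]
```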
0  lib/sweety_backy/sweety_backy_execute.rb
No changes.
20 lib/sweety_backy/utils.rb
@@ -38,25 +38,5 @@ def self.namerize( path )
path.gsub('/', '.').gsub(/^\./, '')
end
- def self.read_opts( conf_path )
- SweetyBacky::Utils::log "conf_path: #{conf_path}"
-
- opts = YAML.load( File.read( conf_path ) )
-
- # symbolize keys
- opts.keys.each do |key|
- opts[key.to_sym] = opts.delete(key)
- end
-
- SweetyBacky::Utils::log "configuration:"
- SweetyBacky::Utils::log "------------"
- opts.each_pair do |key, value|
- SweetyBacky::Utils::log "#{key}: #{(value.instance_of? Array) ? value.join(' | ') : value}"
- end
-
- # TODO: test all options are ok
-
- return opts
- end
end
end
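`Utils.namerize` (kept unchanged in this commit) flattens an absolute path into the dotted file-name prefix seen in the backup listings above:

```ruby
# Replace slashes with dots and drop the resulting leading dot.
def namerize(path)
  path.gsub('/', '.').gsub(/^\./, '')
end

namerize("/Users/fguillen/Develop/Arduino")
# => "Users.fguillen.Develop.Arduino"
```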
BIN  server1/backups/databases/test.20100104.daily.sql.tar.gz
Binary file not shown
BIN  ...ps/files/Users.fguillen.Develop.Ruby.SweetyBacky.test.fixtures.path.20100104.daily.tar.gz
Binary file not shown
6 sweety_backy.gemspec
@@ -8,14 +8,16 @@ Gem::Specification.new do |s|
s.authors = ["Fernando Guillen"]
s.email = ["fguillen.mail@gmail.com"]
s.homepage = ""
- s.summary = %q{TODO: Write a gem summary}
- s.description = %q{TODO: Write a gem description}
+ s.summary = "Ruby backup mechanism"
+ s.description = "Simple mechanism to configure and execute backups of folders and MySQL DBs and store them in local folder or S3 bucket"
s.rubyforge_project = "SweetyBacky"
s.add_development_dependency "bundler", ">= 1.0.0.rc.6"
s.add_development_dependency "mocha"
s.add_development_dependency "delorean"
+
+ s.add_dependency "s3"
s.files = `git ls-files`.split("\n")
s.test_files = `git ls-files -- {test,spec,features}/*`.split("\n")
10 test/commander_test.rb
@@ -1,13 +1,13 @@
require "#{File.dirname(__FILE__)}/test_helper"
-class SweetyBackyTest < Test::Unit::TestCase
+class CommanderTest < Test::Unit::TestCase
def setup
SweetyBacky::Utils.stubs(:log)
# tmp dir
@tmp_dir = File.join( Dir::tmpdir, "sweety_backy_#{Time.now.to_i}" )
- Dir.mkdir( @tmp_dir )
+ Dir.mkdir( @tmp_dir ) unless File.exists?(@tmp_dir)
end
def teardown
@@ -53,7 +53,11 @@ def test_clear
:monthly => 2,
:weekly => 3,
:daily => 4,
- :backup_path => @tmp_dir
+ :storage_system => :local,
+ :local_opts => {
+ :path => @tmp_dir
+ },
+ :working_path => @tmp_dir
}
Dir.mkdir( "#{@tmp_dir}/files" ) unless File.exists?( "#{@tmp_dir}/files" )
9 test/fixtures/config.yml
@@ -1,9 +0,0 @@
-paths: [ "path1", "path2" ]
-databases: [ "db1", "db2" ]
-yearly: 1
-monthly: 2
-weekly: 3
-daily: 4
-backup_path: '/backup_path'
-database_user: 'database_user'
-database_pass: 'database_pass'
13 test/fixtures/config_s3.yml
@@ -0,0 +1,13 @@
+paths: [ "path1", "path2" ]
+databases: [ "db1", "db2" ]
+yearly: 1
+monthly: 2
+weekly: 3
+daily: 4
+database_user: 'database_user'
+database_pass: 'database_pass'
+storage_system: 's3'
+s3_opts:
+ bucket: 'bucket_name'
+ path: 's3/path/path'
+ passwd_file: '/path/.s3.passwd'
1  test/fixtures/file.txt
@@ -0,0 +1 @@
+file
2  test/fixtures/s3.passwd
@@ -0,0 +1,2 @@
+access_key_id: "access_key"
+secret_access_key: "secret_access"
25 test/opts_reader_test.rb
@@ -0,0 +1,25 @@
+require "#{File.dirname(__FILE__)}/test_helper"
+
+class OptsReaderTest < Test::Unit::TestCase
+ def setup
+ SweetyBacky::Utils.stubs(:log)
+ end
+
+ def test_read_opts
+ opts = SweetyBacky::OptsReader.read_opts( "#{FIXTURES_PATH}/config_s3.yml" )
+
+ assert_equal( [ "path1", "path2" ], opts[:paths] )
+ assert_equal( [ "db1", "db2" ], opts[:databases] )
+ assert_equal( 1, opts[:yearly] )
+ assert_equal( 2, opts[:monthly] )
+ assert_equal( 3, opts[:weekly] )
+ assert_equal( 4, opts[:daily] )
+ assert_equal( 'database_user', opts[:database_user] )
+ assert_equal( 'database_pass', opts[:database_pass] )
+ assert_equal( 'bucket_name', opts[:s3_opts][:bucket] )
+ assert_equal( 's3/path/path', opts[:s3_opts][:path] )
+ assert_equal( '/path/.s3.passwd', opts[:s3_opts][:passwd_file] )
+ end
+
+
+end
35 test/runner_test.rb
@@ -4,10 +4,10 @@ class RunnerTest < Test::Unit::TestCase
def setup
SweetyBacky::Utils.stubs(:log)
-
+
# tmp dir
@tmp_dir = File.join( Dir::tmpdir, "sweety_backy_#{Time.now.to_i}" )
- Dir.mkdir( @tmp_dir )
+ Dir.mkdir( @tmp_dir ) unless File.exists?(@tmp_dir)
# runner
@opts = {
@@ -17,9 +17,12 @@ def setup
:monthly => 1,
:weekly => 2,
:daily => 4,
- :backup_path => @tmp_dir,
:database_user => 'test',
- :database_pass => ''
+ :database_pass => '',
+ :storage_system => :local,
+ :local_opts => {
+ :path => @tmp_dir
+ }
}
@runner = SweetyBacky::Runner.new
@@ -74,17 +77,25 @@ def test_run
end
def test_initialize_with_config_file
- runner = SweetyBacky::Runner.new( "#{FIXTURES_PATH}/config.yml" )
+ SweetyBacky::OptsReader.expects( :read_opts ).with( '/path/config.yml' ).returns(
+ {
+ :paths => [ 'pepe', 'juan' ],
+ :local_opts => {
+ :path => '/local/path'
+ }
+ }
+ )
+
+ runner = SweetyBacky::Runner.new( "/path/config.yml" )
- assert_equal( [ "path1", "path2" ], runner.opts[:paths] )
- assert_equal( [ "db1", "db2" ], runner.opts[:databases] )
+ assert_equal( [ "pepe", "juan" ], runner.opts[:paths] )
+ assert_equal( [], runner.opts[:databases] )
assert_equal( 1, runner.opts[:yearly] )
- assert_equal( 2, runner.opts[:monthly] )
- assert_equal( 3, runner.opts[:weekly] )
+ assert_equal( 1, runner.opts[:monthly] )
+ assert_equal( 2, runner.opts[:weekly] )
assert_equal( 4, runner.opts[:daily] )
- assert_equal( '/backup_path', runner.opts[:backup_path] )
- assert_equal( 'database_user', runner.opts[:database_user] )
- assert_equal( 'database_pass', runner.opts[:database_pass] )
+ assert_equal( :local, runner.opts[:storage_system] )
+ assert_equal( '/local/path', runner.opts[:local_opts][:path] )
end
end
100 test/s3/commander_s3_test.rb
@@ -0,0 +1,100 @@
+require "#{File.dirname(__FILE__)}/../test_helper"
+
+class CommanderS3Test < Test::Unit::TestCase
+
+ def setup
+ SweetyBacky::Utils.stubs(:log)
+
+ @opts = {
+ :paths => [ 'name1', 'name2' ],
+ :databases => [ 'name1', 'name2' ],
+ :yearly => 1,
+ :monthly => 2,
+ :weekly => 3,
+ :daily => 4,
+ :storage_system => :s3,
+ :s3_opts => {
+ :bucket => 'sweety_backy_test',
+ :path => 'test/path',
+ :passwd_file => '~/.s3.passwd'
+ },
+ :working_path => @tmp_dir
+ }
+
+ s3 = ::S3::Service.new( SweetyBacky::S3.read_s3_password( @opts[:s3_opts][:passwd_file] ) )
+
+ @bucket = s3.buckets.build( @opts[:s3_opts][:bucket] )
+ @bucket.save
+ end
+
+ def teardown
+ @bucket.destroy( true )
+ end
+
+ def test_clear
+ [
+ 'name1.20081231.yearly',
+ 'name1.20081232.yearly',
+ 'name2.20091231.yearly',
+ 'name1.20100131.monthly',
+ 'name1.20100228.monthly',
+ 'name1.20100331.monthly',
+ 'name2.20100430.monthly',
+ 'name2.20100531.monthly',
+ 'name2.20100630.monthly',
+ 'name1.20100704.weekly',
+ 'name1.20100711.weekly',
+ 'name1.20100718.weekly',
+ 'name1.20100725.weekly',
+ 'name1.20100720.daily',
+ 'name2.20100721.daily',
+ 'name2.20100722.daily',
+ 'name2.20100723.daily',
+ 'name2.20100724.daily',
+ 'name2.20100726.daily'
+ ].each do |file_part|
+ SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "#{@opts[:s3_opts][:path]}/files/#{file_part}.tar.gz", @opts[:s3_opts] )
+ SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "#{@opts[:s3_opts][:path]}/databases/#{file_part}.sql.tar.gz", @opts[:s3_opts] )
+ end
+
+ SweetyBacky::Commander.clear( @opts )
+
+ files_keeped = SweetyBacky::S3.paths_in( "#{@opts[:s3_opts][:path]}/files/*", @opts[:s3_opts] ).join( "\n" )
+ databases_keeped = SweetyBacky::S3.paths_in( "#{@opts[:s3_opts][:path]}/databases/*", @opts[:s3_opts] ).join( "\n" )
+
+ # files to keep
+ [
+ 'name1.20081232.yearly',
+ 'name2.20091231.yearly',
+ 'name1.20100228.monthly',
+ 'name1.20100331.monthly',
+ 'name2.20100531.monthly',
+ 'name2.20100630.monthly',
+ 'name1.20100718.weekly',
+ 'name1.20100725.weekly',
+ 'name1.20100720.daily',
+ 'name2.20100722.daily',
+ 'name2.20100723.daily',
+ 'name2.20100724.daily',
+ 'name2.20100726.daily'
+ ].each do |file_part|
+ assert_match( "#{file_part}.tar.gz", files_keeped )
+ assert_match( "#{file_part}.sql.tar.gz", databases_keeped )
+ end
+
+ # files to deleted
+ [
+ 'name1.20081231.yearly',
+ 'name1.20100131.monthly',
+ 'name2.20100430.monthly',
+ 'name1.20100704.weekly',
+ 'name2.20100721.daily'
+ ].each do |file_part|
+ assert_no_match( /#{file_part}.tar.gz/, files_keeped )
+ assert_no_match( /#{file_part}.sql.tar.gz/, databases_keeped )
+ end
+ end
+
+
+end
+
59 test/s3/runner_s3_test.rb
@@ -0,0 +1,59 @@
+require "#{File.dirname(__FILE__)}/../test_helper"
+
+class RunnerS3Test < Test::Unit::TestCase
+
+ def setup
+ SweetyBacky::Utils.stubs(:log)
+
+ # runner
+ @opts = {
+ :paths => [ "#{FIXTURES_PATH}/path" ],
+ :databases => [ "test" ],
+ :yearly => 1,
+ :monthly => 1,
+ :weekly => 2,
+ :daily => 4,
+ :database_user => 'test',
+ :database_pass => '',
+ :storage_system => :s3,
+ :s3_opts => {
+ :bucket => 'sweety_backy_test',
+ :passwd_file => '~/.s3.passwd',
+ :path => 'test/path'
+ }
+ }
+
+ @runner = SweetyBacky::Runner.new
+ @runner.config( @opts )
+
+ s3 = ::S3::Service.new( SweetyBacky::S3.read_s3_password( @opts[:s3_opts][:passwd_file] ) )
+ @bucket = s3.buckets.build( @opts[:s3_opts][:bucket] )
+ @bucket.save
+ end
+
+ def teardown
+ @bucket.destroy( true )
+ end
+
+ def test_do_backup_daily
+ SweetyBacky::Utils.stubs( :period ).returns( 'daily' )
+
+ @runner.do_backup
+
+ assert(
+ @bucket.
+ object(
+ "test/path/files/#{SweetyBacky::Utils.namerize( @opts[:paths][0] )}.#{Date.today.strftime('%Y%m%d')}.daily.tar.gz"
+ ).exists?
+ )
+
+ assert(
+ @bucket.
+ object(
+ "test/path/databases/test.#{Date.today.strftime('%Y%m%d')}.daily.sql.tar.gz"
+ ).exists?
+ )
+ end
+
+end
+
61 test/s3/s3_test.rb
@@ -0,0 +1,61 @@
+require "#{File.dirname(__FILE__)}/../test_helper"
+
+
+class S3Test < Test::Unit::TestCase
+ def setup
+ SweetyBacky::Utils.stubs(:log)
+
+ @opts = {
+ :bucket => 'sweety_backy_test',
+ :path => 'test_path',
+ :passwd_file => '~/.s3.passwd'
+ }
+
+ s3 = ::S3::Service.new( SweetyBacky::S3.read_s3_password( @opts[:passwd_file] ) )
+ @bucket = s3.buckets.build( @opts[:bucket] )
+ @bucket.save
+ end
+
+ def teardown
+ @bucket.destroy( true )
+ end
+
+ def test_upload
+ SweetyBacky::S3.upload(
+ "#{FIXTURES_PATH}/file.txt",
+ "test/path/file.txt",
+ @opts
+ )
+
+ assert_equal(
+ File.read( "#{FIXTURES_PATH}/file.txt" ),
+ SweetyBacky::S3.object( "test/path/file.txt", @opts ).content
+ )
+ end
+
+ def test_paths_in
+ SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file1.txt", @opts )
+ SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file2.txt", @opts )
+ SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file3.txt", @opts )
+ SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/other_file.txt", @opts )
+
+ paths = SweetyBacky::S3.paths_in( "test/path/file*.txt", @opts )
+
+ assert_equal(3, paths.size)
+ assert( ( paths.include? "test/path/file1.txt" ) )
+ assert( ( paths.include? "test/path/file2.txt" ) )
+ assert( ( paths.include? "test/path/file3.txt" ) )
+ assert( ( !paths.include? "test/path/other_file.txt" ) )
+ end
+
+ def test_delete
+ SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file1.txt", @opts )
+ SweetyBacky::S3.upload( "#{FIXTURES_PATH}/file.txt", "test/path/file2.txt", @opts )
+
+ SweetyBacky::S3.delete( "test/path/file2.txt", @opts )
+
+ assert( @bucket.object( "test/path/file1.txt" ).exists? )
+ assert( !@bucket.object( "test/path/file2.txt" ).exists? )
+ end
+
+end
15 test/utils_test.rb
@@ -21,19 +21,4 @@ def test_namerize
assert_equal( 'path', SweetyBacky::Utils.namerize( 'path' ) )
end
- def test_read_opts
- opts = SweetyBacky::Utils.read_opts( "#{FIXTURES_PATH}/config.yml" )
-
- assert_equal( [ "path1", "path2" ], opts[:paths] )
- assert_equal( [ "db1", "db2" ], opts[:databases] )
- assert_equal( 1, opts[:yearly] )
- assert_equal( 2, opts[:monthly] )
- assert_equal( 3, opts[:weekly] )
- assert_equal( 4, opts[:daily] )
- assert_equal( '/backup_path', opts[:backup_path] )
- assert_equal( 'database_user', opts[:database_user] )
- assert_equal( 'database_pass', opts[:database_pass] )
- end
-
-
end