
releasing new features

fguillen committed Jul 2, 2011
1 parent d019b4f commit 4846a9e89545a223431e02852302a0985320eb4e
179 README.md
@@ -1,86 +1,121 @@
-## Run Backup
- ruby sweety_backy_execute.rb configuration_file.yml
-
-## Configuration file
-### includes
-Array of folders to make a backup.
+# Sweety Backy
+
+Simple mechanism to **configure and execute backups** of folders and MySQL DBs, and store them in a **local folder** or an **S3 bucket**.
+
+## State
+
+This is a **really beta** version. It is currently working on my servers without problems, but use it at your own risk.
+
+## Other possibilities
+
+Please take a look at other **Ruby backup gems**:
-Example:
- includes: ['/var/www', '/var/log']
+* http://ruby-toolbox.com/categories/backups.html
-### excludes
-Array of patterns to exclude from backup.
+## How to install
-Example:
- excludes: ['/var/www/tmp', '.cvs']
+ gem install sweety_backy
-### path
-Root Path for the backups
+## How to use it
+
+ sweety_backy <config_file>
+
+### Config file
+
+It is a _yaml_ file with all these attributes:
-Example:
- path: '/backups'
+ paths: <array of folder paths>
+ databases: <array of database names>
+ yearly: <quantity of yearly backups>
+ monthly: <quantity of monthly backups>
+ weekly: <quantity of weekly backups>
+ daily: <quantity of daily backups>
+ database_user: <database user with read privileges on all databases>
+ database_pass: <database user password>
+ storage_system: { 's3' | 'local' }
+ local_opts: (only if the storage_system is 'local')
+ path: <absolute path to the root folder of the backups>
+ s3_opts: (only if the storage_system is 's3')
+ bucket: <bucket name>
+ path: <bucket path where the backups will be stored>
+ passwd_file: <path to the S3 credentials>
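
The examples below only show the `s3` storage_system; for completeness, a minimal sketch of a `local` configuration could look like this (all paths here are hypothetical, not taken from the commit):

```yaml
# Hypothetical minimal config for storage_system: 'local'
paths: [ "/var/www" ]
databases: [ "test" ]
yearly: 1
monthly: 2
weekly: 3
daily: 4
database_user: 'root'
database_pass: ''
storage_system: 'local'
local_opts:
  path: '/backups'   # root folder where backups will be written
```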
-### yearly, monhtly, weekly, daily
-Number of yearly, monhtly, weekly, daily backups to keep, starting for the most recent.
+### S3 credentials file
-Example:
- yearly: 2
- monthly: 12
- weekly: 4
- daily: 7
+It is a _yaml_ file with two keys holding the S3 credentials:
+
+ access_key_id: "XXX"
+ secret_access_key: "YYY"
+
+### Example
+#### S3 config example
-## Destination Folders
+ # ~/.s3.passwd
+ access_key_id: "XXX"
+ secret_access_key: "YYY"
-As example for date 2010-06-29 with configuration:
- include: '/tmp/hhh'
- exclude: '/tmp/hhh/xxx'
- path: '/tmp/backups'
- years: 2
- months: 6
- weeks: 4
- days: 7
+#### SweetyBacky config example
+
+ # ~/.sweety_backy.conf
+ paths: [ "/Users/fguillen/Develop/Brico", "/Users/fguillen/Develop/Arduino" ]
+ databases: [ "test", "mysql" ]
+ yearly: 1
+ monthly: 2
+ weekly: 3
+ daily: 4
+ database_user: 'root'
+ database_pass: ''
+ storage_system: 's3'
+ s3_opts:
+ bucket: 'sweety_backy'
+ path: 'fguillen'
+ passwd_file: '~/.s3.passwd'
+
+#### Execute
+
+ sweety_backy ~/.sweety_backy.conf
-The files folder will look like this:
-
- /tmp/backups/files/20081231.yearly.tar.gz
- /tmp/backups/files/20091231.yearly.tar.gz
- /tmp/backups/files/20100131.monthly.tar.gz
- /tmp/backups/files/20100228.monthly.tar.gz
- /tmp/backups/files/20100331.monthly.tar.gz
- /tmp/backups/files/20100430.monthly.tar.gz
- /tmp/backups/files/20100531.monthly.tar.gz
- /tmp/backups/files/20100606.weekly.tar.gz
- /tmp/backups/files/20100613.weekly.tar.gz
- /tmp/backups/files/20100620.weekly.tar.gz
- /tmp/backups/files/20100627.weekly.tar.gz
- /tmp/backups/files/20100623.daily.tar.gz
- /tmp/backups/files/20100624.daily.tar.gz
- /tmp/backups/files/20100625.daily.tar.gz
- /tmp/backups/files/20100626.daily.tar.gz
- /tmp/backups/files/20100627.weekly.tar.gz
- /tmp/backups/files/20100628.daily.tar.gz
- /tmp/backups/files/20100629.daily.tar.gz
+#### Result
+
+This will generate a bunch of backups in the _sweety_backy_ bucket, like these:
+
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110626.weekly.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110703.weekly.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110704.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110705.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110706.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Arduino.20110707.daily.tar.gz
+
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110626.weekly.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110703.weekly.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110704.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110705.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110706.daily.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/files/Users.fguillen.Develop.Brico.20110707.daily.tar.gz
+
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110626.weekly.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110703.weekly.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110704.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110705.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110706.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/test.20110707.daily.sql.tar.gz
+
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110626.weekly.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110703.weekly.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110704.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110705.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110706.daily.sql.tar.gz
+ https://s3.amazonaws.com/sweety_backy/fguillen/databases/mysql.20110707.daily.sql.tar.gz
-The databases folder will look like this:
+... and so on.
+
+### Cron execution example
- /tmp/backups/databases/20081231.yearly.sql.tar.gz
- /tmp/backups/databases/20091231.yearly.sql.tar.gz
- /tmp/backups/databases/20100131.monthly.sql.tar.gz
- /tmp/backups/databases/20100228.monthly.sql.tar.gz
- /tmp/backups/databases/20100331.monthly.sql.tar.gz
- /tmp/backups/databases/20100430.monthly.sql.tar.gz
- /tmp/backups/databases/20100531.monthly.sql.tar.gz
- /tmp/backups/databases/20100606.weekly.sql.tar.gz
- /tmp/backups/databases/20100613.weekly.sql.tar.gz
- /tmp/backups/databases/20100620.weekly.sql.tar.gz
- /tmp/backups/databases/20100627.weekly.sql.tar.gz
- /tmp/backups/databases/20100623.daily.sql.tar.gz
- /tmp/backups/databases/20100624.daily.sql.tar.gz
- /tmp/backups/databases/20100625.daily.sql.tar.gz
- /tmp/backups/databases/20100626.daily.sql.tar.gz
- /tmp/backups/databases/20100627.weekly.sql.tar.gz
- /tmp/backups/databases/20100628.daily.sql.tar.gz
- /tmp/backups/databases/20100629.daily.sql.tar.gz
-
+ # every day at 02:00 am
+ 00 02 * * * sweety_backy /home/fguillen/.sweety_backy.conf >> /var/log/sweety_backy.log 2>&1
+
+## License
+
+MIT License. (c) 2011 Fernando Guillen (http://fernandoguillen.info).
@@ -12,4 +12,20 @@ Rake::TestTask.new do |t|
t.libs << '.'
t.test_files = FileList['test/*_test.rb']
t.verbose = true
+end
+
+namespace :test do
+
+ desc "run S3 tests against the real S3 service"
+ task :s3 do
+ test_task =
+ Rake::TestTask.new("s3_tests") do |t|
+ t.libs << '.'
+ t.test_files = FileList['test/s3/*_test.rb']
+ t.verbose = true
+ end
+
+ task("s3_tests").execute
+ end
+
end
@@ -1,31 +0,0 @@
-# Sweety Backy
-
-Simple mechanism to **configure and execute backups** of folders and MySQL DBs and store them in **S3**.
-
-## Version 2.0
-
-The version 2.0 will be a completely rewrite of the Version 1.0 implementation, the most important new features will be:
-
-* More modular implementation
-* Distribute it as a Gem
-* More intuitive configuration, maybe a Ruby DSL
-* Storage in S3
-* Recovery system
-
-## Challenges
-
-* Define and implement a DSL
-* Write a modular architecture
-* Make it usable for others than me
-
-## Already existing similar tools
-
-There are already existing and popular tools similar to one I'm trying to build:
-
-* http://ruby-toolbox.com/categories/backups.html
-
-Hope this is not a inconvenient to this project to be accepted since I usually find these tools too much complex or maybe their style doesn't look intuitive to me.
-
-Well, I have to say that I haven't researched with these tools y
-
-
@@ -0,0 +1,28 @@
+#!/usr/bin/env ruby
+
+# Use:
+# sweety_backy /path/to/sweety_backy.conf
+
+begin
+ require 'sweety_backy'
+rescue LoadError
+ require 'rubygems'
+ require 'sweety_backy'
+end
+
+require 'benchmark'
+
+if( ARGV[0].nil? )
+ SweetyBacky::Utils.log "use: $ sweety_backy <config_file_path>"
+ exit 1
+end
+
+lapsus_time =
+ Benchmark.realtime do
+ SweetyBacky::Utils.log "--------------------"
+ SweetyBacky::Utils.log "Starting SweetyBacky"
+ sb = SweetyBacky::Runner.new( ARGV[0] )
+ sb.run
+ end
+
+SweetyBacky::Utils.log "SweetyBacky finished in #{lapsus_time} seconds"
@@ -1,18 +0,0 @@
-require 'benchmark'
-# require 'sweety_backy'
-require File.dirname(__FILE__) + "/../lib/sweety_backy"
-
-if( ARGV[0].nil? )
- SweetyBacky::Utils.log "use: $ ruby sweety_backy.rb <config_file_path>"
- exit 1
-end
-
-lapsus_time =
- Benchmark.realtime do
- SweetyBacky::Utils.log "--------------------"
- SweetyBacky::Utils.log "Starting SweetyBacky"
- sb = SweetyBacky::Runner.new( ARGV[0] )
- sb.run
- end
-
-SweetyBacky::Utils.log "SweetyBacky on #{lapsus_time} seconds"
@@ -1,4 +1,9 @@
+require 'rubygems'
+require "s3"
+
require "#{File.dirname(__FILE__)}/sweety_backy/version"
require "#{File.dirname(__FILE__)}/sweety_backy/runner"
require "#{File.dirname(__FILE__)}/sweety_backy/utils"
require "#{File.dirname(__FILE__)}/sweety_backy/commander"
+require "#{File.dirname(__FILE__)}/sweety_backy/s3"
+require "#{File.dirname(__FILE__)}/sweety_backy/opts_reader"
@@ -27,32 +27,56 @@ def self.clear( opts )
end
def self.clear_files( opts )
- SweetyBacky::Utils.log "cleaning files"
+ SweetyBacky::Utils.log "cleaning files on #{opts[:working_path]}/files/"
opts[:paths].each do |path|
SweetyBacky::Utils.log "cleaning file #{path}"
[:yearly, :monthly, :weekly, :daily].each do |period|
- Dir.glob( "#{opts[:backup_path]}/files/#{SweetyBacky::Utils.namerize( path )}.*.#{period.to_s}.*" ).sort[0..(-1*(opts[period]+1))].each do |file_path|
- File.delete( file_path )
+ paths_in(
+ "#{opts[:working_path]}/files/#{SweetyBacky::Utils.namerize( path )}.*.#{period.to_s}.*",
+ opts
+ ).sort[0..(-1*(opts[period]+1))].each do |file_path|
+ SweetyBacky::Utils.log "removing: #{file_path}"
+ remove_path( file_path, opts )
end
end
end
end
def self.clear_databases( opts )
- SweetyBacky::Utils.log "cleaning databases"
+ SweetyBacky::Utils.log "cleaning databases on #{opts[:working_path]}/databases/"
opts[:databases].each do |database_name|
SweetyBacky::Utils.log "cleaning database #{database_name}"
[:yearly, :monthly, :weekly, :daily].each do |period|
- Dir.glob( "#{opts[:backup_path]}/databases/#{database_name}.*.#{period.to_s}.*" ).sort[0..(-1*(opts[period]+1))].each do |file_path|
- File.delete( file_path )
+ paths_in(
+ "#{opts[:working_path]}/databases/#{database_name}.*.#{period.to_s}.*",
+ opts
+ ).sort[0..(-1*(opts[period]+1))].each do |file_path|
+ SweetyBacky::Utils.log "removing: #{file_path}"
+ remove_path( file_path, opts )
end
end
end
end
+ def self.paths_in( path, opts )
+ if( opts[:storage_system].to_sym == :s3 )
+ return SweetyBacky::S3.paths_in( path, opts[:s3_opts] )
+ else
+ return Dir.glob( path )
+ end
+ end
+
+ def self.remove_path( path, opts )
+ if( opts[:storage_system].to_sym == :s3 )
+ SweetyBacky::S3.delete( path, opts[:s3_opts] )
+ else
+ File.delete( path )
+ end
+ end
+
end
end
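
The cleaning logic above keeps only the newest N backups per period by sorting file names and slicing off everything except the last `opts[period]` entries. That slice can be sketched in isolation (the file names here are hypothetical):

```ruby
# Sketch of the retention slice used in clear_files / clear_databases.
# Because the date stamp is zero-padded YYYYMMDD, a plain lexicographic
# sort orders backup names chronologically.
files = %w[
  site.20110704.daily.tar.gz
  site.20110706.daily.tar.gz
  site.20110705.daily.tar.gz
  site.20110707.daily.tar.gz
]
keep = 2

# Everything except the newest `keep` entries is selected for deletion.
to_delete = files.sort[0..(-1 * (keep + 1))]
# to_delete now holds the two oldest backups; the two newest survive.
```

If `keep` is greater than or equal to the number of matching files, the negative-index slice yields an empty array, so nothing is deleted.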