
Updated ETA in Readme.

1 parent 8f114f1 commit 4d732990666efb134482f378be5be89b49eae75c Michael van Rooijen committed Mar 6, 2011
Showing with 5 additions and 6 deletions.
@@ -3,6 +3,11 @@ Michael van Rooijen ( [@meskyanichi](!/meskyanichi) )
**This is the Backup 3 place holder page**, [click here if you were looking for Backup 2](
+Backup 3 - ETA (Update)
+I have delayed the Backup 3 release just a little because I'm lacking the time to finish it up. I said it'd be out just before March, but as it stands now I think I'll need until around mid-to-late March 2011. I also want proper documentation ready on release so people can pick it up without hassle.
Backup 3 - Coming Soon!
@@ -27,12 +32,6 @@ I am currently hard at work developing Backup 3.0. This is a __100% rewrite__ of
These are a few of the goals I've set for the next release. Feel free to view the source. I probably won't release the initial version with __every__ feature that Backup 2 has, but it'll have plenty of features to start off with. Also, it'll be __a lot__ easier to maintain and improve now that it has good test coverage and is more modular and extensible. So expect a lot more to come in Backup 3 in the future! I am really excited about this one.
-Backup 3 - ETA
-There is no ETA yet, but I hope to push the initial version out before __March - 2011__. Then we take it from there. ;)
Backup 3 - No Ruby on Rails support?

5 comments on commit 4d73299

Great work! Looks solid. :)


mrrooijen replied Mar 6, 2011

Thanks! Yes, looking forward to this release. It'll be a huge leap from Backup 2 to 3. I'm also looking for ways to implement RSync for incremental backups. This won't necessarily work for Amazon S3, CloudFiles, or Dropbox, but it will for regular servers. I actually found a cloud host ( GoGrid - ) and they seem to support RSync connections.

So using that, if you have backups that are 2GB+, rather than pushing them to S3 or anywhere else, it'd be much more cost-effective to push them to GoGrid over the RSync protocol, since it'll only transfer the bytes of the backup that are actually different, rather than removing the whole file (cycle) and pushing a whole new 2GB+ file. That gets expensive quickly, especially when you transfer backups often.

The first thing I'm looking into is RSyncing to regular servers. I've already sandboxed this and it seems to work. The downside is that you can't compress (and probably can't encrypt) your data if you want to do this, but it'll still be a lot cheaper: if you've transferred a 2GB file and the next backup grows to 2.1GB, it'll only transfer about 100MB instead of the full 2100MB. This'll also allow short-interval backups at a low price.

For remote servers I practically have a working branch; I just need to correctly implement it in the develop branch. Next I'll probably get a GoGrid account for their storage service to see how I can make Backup work with it, and probably recommend that people use it along with RSync to keep costs low and updates frequent. Because really, transferring 2GB+ every time becomes quite pricey.


mrrooijen replied Mar 6, 2011

Actually, you can use the SSH protocol with the RSync utility to make secure connections, so the encryption part isn't a problem. And RSync supports compression during the transfer, which helps minimize bandwidth fees even further. ( And for GoGrid, I believe there aren't even any bandwidth fees to begin with. ) But I'll have to look into it soon.

Cool. Long replies mean high ambitions. :) This might be useful: I doubt people would choose GoGrid over Amazon S3 in general, so that's probably not a safe assumption - but the link above looks like some kind of S3-rsync solution.


mrrooijen replied Mar 7, 2011

Thanks. Yeah, I've looked at some S3 solutions, but not yet in depth. I'll have a look and see if I can implement something like that blog post describes to enable RSync with S3 in Backup 3. Thanks!
