
Easy Linux backup with duplicity

After looking at several backup solutions for a Linux-based project of mine, ranging from more advanced systems like Bacula to a range of Python/bash scripts, I finally found the ultra-simple duplicity. Don’t get me wrong, Bacula and its brethren really seem more powerful overall, but when you just need a simple solution, you can’t compete with duplicity’s simplicity.

duplicity is a drop-dead simple backup system that does incremental/full backups and supports a wide variety of backup targets out of the box, including file, ftp, scp, s3, and a dozen others. The usage is simply:

duplicity /folder/to/backup file:///path/to/target/folder

You can have the target folder be a local folder, an S3-mounted folder (with the excellent s3fs), or any of the other supported URIs. The default is an incremental backup, but you can override it by requesting a full backup. It even has support for “keep the last N full backups and delete the rest”. Update – after running for some time with backups into s3fs, I recommend against it – it gave us some performance headaches. Use duplicity’s native S3 option instead.
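
For example, here is roughly what that looks like against duplicity’s native S3 backend. The bucket name and paths are placeholders, credentials are expected in the standard AWS environment variables, and the s3+http:// scheme is what the duplicity versions I’ve used expect:

# credentials for the native S3 backend (placeholder values)
export AWS_ACCESS_KEY_ID=<your-key-id>
export AWS_SECRET_ACCESS_KEY=<your-secret-key>

# force a full backup instead of the default incremental one
duplicity full /folder/to/backup s3+http://my-bucket/server-backups

# keep only the last 3 full backup chains and delete everything older
duplicity remove-all-but-n-full 3 --force s3+http://my-bucket/server-backups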

The only feature I required that duplicity didn’t have is automatically choosing when to do full vs incremental backups. I would like to tell it “every 100 runs, do a full backup”, and this is not supported out of the box … although it’s easily fixed with a wrapper script (a sketch follows below). Put that in crontab and you’re set. Oh, and by default it requires setting up a GnuPG key, but if you’re lazy you can skip encryption with the --no-encryption option.
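
Here is a rough sketch of such a wrapper – the paths, bucket name, and counter file are made up for illustration, and the AWS credentials are assumed to already be in the environment:

#!/bin/bash
# Force a full backup every Nth run, do an incremental one otherwise.
set -e

SOURCE=/folder/to/backup
TARGET=s3+http://my-bucket/server-backups   # duplicity's native S3 backend
COUNTER_FILE=/var/lib/backup/run-counter
EVERY=100

# Read and bump the run counter (starts at 0 if the file doesn't exist yet)
COUNT=$(cat "$COUNTER_FILE" 2>/dev/null || echo 0)
COUNT=$((COUNT + 1))
echo "$COUNT" > "$COUNTER_FILE"

if [ $((COUNT % EVERY)) -eq 0 ]; then
    duplicity full --no-encryption "$SOURCE" "$TARGET"
else
    duplicity --no-encryption "$SOURCE" "$TARGET"
fi

A crontab entry along the lines of 0 3 * * * /usr/local/bin/backup.sh then runs it nightly.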