How do you guys back up your server?
pg_dump rsync’ed to off-site server.
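A minimal sketch of that pg_dump-plus-rsync pipeline, assuming a database called appdb, a local dump directory, and a remote backup-host (all placeholder names, not from the comment):

```shell
#!/bin/sh
# Nightly sketch: dump the database, then mirror the dump directory off-site.
# "appdb", the paths, and "backup-host" are illustrative assumptions.
set -eu

STAMP=$(date +%Y%m%d)
DUMP_DIR=/var/backups/pg
mkdir -p "$DUMP_DIR"

# Custom-format dumps are compressed and allow selective pg_restore later.
pg_dump --format=custom --file="$DUMP_DIR/appdb-$STAMP.dump" appdb

# rsync only transfers changed files; without --delete, old dumps
# accumulate on the remote side and give you some history for free.
rsync -az "$DUMP_DIR/" backup@backup-host:/srv/backups/pg/
```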
3-2-1:
1. Three copies: the data on your server.
2. Buy a giant external drive and back up to that.
3. Off site. Backblaze is very nice.
How to get your data around? FreeFileSync is nice.
Veeam community edition may help you too.
I’m not sure how you understand the 3-2-1 rule given how you explained it, even though you’re stating the right stuff (I’m confused by your numbered list…). Just for reference for people reading this, it means your backups need to be: 3 copies of your data, on 2 different types of media, with 1 copy kept off-site.
ITT: lots of the usual paranoid overkill. If you do rsync with the --backup switch to a remote box or a VPS, that will cover all bases in the real world. The probability of losing anything is close to 0.
The more serious risk is discovering that something broke 3 weeks ago and the backups were not happening. So you need to make sure you are getting some kind of notification when the script completes successfully.
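One way to implement that "backups silently stopped" check, sketched with a status file: the backup script touches a file on success, and a separate cron watchdog alerts when the file goes stale. Paths and the alerting command are assumptions, not from the comment:

```shell
#!/bin/sh
# Watchdog sketch: have the backup job touch a status file on success,
# and alert when that file goes stale. Paths here are placeholders.

STATUS_FILE="${STATUS_FILE:-/var/run/backup.ok}"
MAX_AGE_HOURS="${MAX_AGE_HOURS:-26}"   # daily job plus some slack

# The backup script itself should end with:  touch "$STATUS_FILE"

backup_is_fresh() {
    # find prints the file only if it was modified recently enough,
    # so a non-empty result means the last backup is fresh.
    [ -n "$(find "$STATUS_FILE" -mmin "-$((MAX_AGE_HOURS * 60))" 2>/dev/null)" ]
}

if ! backup_is_fresh; then
    echo "backup stale or missing: $STATUS_FILE" >&2
    # e.g. pipe a message into mail(1) or curl a webhook here
fi
```

Run the watchdog from cron too; a missing *or* failing backup then shows up as a stale file either way.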
While I don't agree that using something like restic is overkill, you are very right that backup process monitoring is very overlooked. And so is practicing recovery with the backup system of your choice.
I let my Jenkins instance run the backup jobs, since I have it running anyway for development tasks. When a job fails it notifies me immediately via email, and I can also manually check in the web UI how the backup went.
For config files, I use tarsnap.
Each server has its own private key, and an /etc/tarsnap.list file which lists the files/directories to back up on it. Then a cronjob runs every week to run tarsnap on them. It’s very simple to back up and restore, as your backups are simply tar archives. The only caveat is that you cannot “browse” them without restoring them somewhere, but for config files it’s pretty quick and cheap.
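A rough sketch of that weekly cronjob, using tarsnap’s tar-style -c/-f/-T flags; the archive naming scheme and key path are my assumptions:

```shell
#!/bin/sh
# Weekly tarsnap sketch: one archive per run, with the paths to include
# read from /etc/tarsnap.list. Key path and archive name are placeholders.
set -eu

HOST=$(hostname -s)
STAMP=$(date +%Y%m%d)

# -c create, -f archive name, -T read the list of paths from a file.
tarsnap -c \
    --keyfile /root/tarsnap.key \
    -f "$HOST-config-$STAMP" \
    -T /etc/tarsnap.list

# Restore later with something like:
#   tarsnap -x -f "$HOST-config-$STAMP" -C /tmp/restore
```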
For actual data, I use a combination of rclone and dedup (because I was involved in the project at some point, but it’s similar to Borg). I sync it to Backblaze because that’s the cheapest storage I could find. I use dedup to encrypt the backup before sending it to Backblaze, though. Restoration is very similar to tarsnap:
dup-unpack -k keyfile snapshot-yyyymmdd | tar -C / -x [files...]
For my webserver: mysqldump to a secured folder, then restic backs up the whole /svr folder, then rsync the restic backup to another server. I also have a system that emails me if these things don't happen daily: the log files are uploaded to a URL, each log file is checked for simple errors, and if no file is uploaded in time, I get an email.
Of course, in my case, the URL the files are uploaded to and the email server... are the same server I'm backing up. But at least if that becomes a problem, I probably only need the backups I've already made to my second server.
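The dump-then-restic-then-rsync chain described above could look roughly like this; the repository path, password file, and second-server name are placeholders I made up:

```shell
#!/bin/sh
# Sketch of the chain: mysqldump into the backed-up tree, restic snapshot
# of that tree, then rsync the restic repository to a second box.
set -eu

DUMP=/svr/db/all-databases.sql.gz
REPO=/var/backups/restic-repo
export RESTIC_PASSWORD_FILE=/root/.restic-pass  # repo must be init'ed once

# --single-transaction gives a consistent dump of InnoDB tables
# without locking them for the duration.
mysqldump --all-databases --single-transaction | gzip > "$DUMP"

# restic deduplicates, so snapshotting all of /svr nightly stays cheap.
restic -r "$REPO" backup /svr

# The repository is just files, so a plain rsync gives an off-host copy.
rsync -az "$REPO/" backup@second-server:/srv/restic-repo/
```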
Proxmox Backup Server. It’s life-changing. I back up every night and I can’t tell you the number of times I’ve completely messed something up only to revert it in a matter of minutes to the nightly backup. You need a separate machine running it (something that kept me from doing it for the longest time), but it is 100% worth it.
I back that up to Backblaze B2 (using Duplicati currently, but I’m going to switch to Kopia), but thankfully I haven’t had to use that, yet.
You can access ZFS snapshots from the hidden .zfs folder at the root dir of your volume. From there you can restore individual files.
There is also a command line tool (httm) that lists all snapshotted versions of a file and allows you to restore them.
If the snapshot you want to restore from is on a remote machine, you can either send it over or scp/rsync the files from the .zfs directory.
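For concreteness, single-file restores from the hidden .zfs directory look roughly like this; the pool, dataset, and snapshot names are made up for illustration:

```shell
# List the snapshots available for a dataset:
zfs list -t snapshot -o name tank/data

# Snapshots appear as read-only directories under the hidden .zfs folder:
ls /tank/data/.zfs/snapshot/

# Copy one file back out of a snapshot (local restore):
cp /tank/data/.zfs/snapshot/daily-2024-01-01/some.file /tank/data/

# Or, if the snapshot lives on a remote machine, pull the file over scp:
scp root@nas:/tank/data/.zfs/snapshot/daily-2024-01-01/some.file .
```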
I use Duplicati and back up my server to both another PC and the cloud. Unlike a lot of data hoarders, I take a pretty minimalist approach, only backing up core (mostly docker) configs and the OS installation.
I have media lists but to me all that content is ephemeral and easily re-acquired so I don’t include it.
I am lucky enough to have a second physical location to store a second computer, with effectively free internet access (as long as the data volume is low, under about 1 TB/month).
I use the ZFS file system for my storage pool, so backups are as easy as a few commands in a script triggered every few hours, that takes a ZFS snapshot and tosses it to my second computer via SSH.
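That snapshot-and-ship script might be sketched like this; the dataset, snapshot naming, and host names are assumptions, and the very first run needs a full (non-incremental) send before the -i form works:

```shell
#!/bin/sh
# Sketch: take a ZFS snapshot, then incrementally send it to the
# second machine over SSH. Names are placeholders.
set -eu

DATASET=tank/data
STAMP=$(date +%Y%m%d-%H%M)

# Newest existing snapshot becomes the base for the incremental send.
PREV=$(zfs list -t snapshot -o name -s creation -H "$DATASET" | tail -1)

zfs snapshot "$DATASET@$STAMP"

# -i sends only the blocks changed since the previous snapshot,
# which keeps the monthly transfer volume low.
zfs send -i "$PREV" "$DATASET@$STAMP" | \
    ssh backup@offsite zfs receive -F tank/backup/data
```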
It’s kind of broken at the moment, but I have set up duplicity to create encrypted backups to Backblaze B2 buckets.
Of course the proper way would be to back up to at least 2 more locations. Perhaps a local NAS for starters. Also could be configured in duplicity.