this post was submitted on 13 Jun 2023
120 points (96.9% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


I see many posts asking about what other lemmings are hosting, but I'm curious about your backups.

I'm using duplicity myself, but I'm considering switching to borgbackup when 2.0 is stable. I've had some problems with duplicity: mainly, the initial sync took incredibly long, and once a few directories got corrupted (they could no longer be decrypted by GPG).

I run a daily incremental backup and send the encrypted diffs to a cloud storage box. I also use Syncthing to share some files between my phone and other devices, so those get picked up by duplicity on those devices.
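For reference, a daily duplicity job like that can live in cron. This is only a sketch: the schedule, GPG key ID, source path and storage-box hostname below are all made up.

```
# hypothetical crontab entry: incremental run nightly, a fresh full backup
# once a month, GPG-encrypted, pushed to a storage box over SFTP
15 3 * * * duplicity --full-if-older-than 1M --encrypt-key ABCD1234 /srv/data sftp://u123456@u123456.example-storagebox.de/backups
```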

[–] [email protected] 3 points 2 years ago

My critical files and folders are synced from my NAS to my desktop using Syncthing. From there I use Backblaze to do a full desktop backup nightly.

My NAS is in RAID 5, but that's technically not a backup.

[–] [email protected] 3 points 2 years ago

Personally I do:

  • Daily snapshots of my data + daily restic backup on-site on a different machine
  • Daily VM/container snapshots locally and on a different machine, keeping at least 2 monthly, 2 weekly and 2 daily backups
  • Weekly incremental data backup to an immutable B2 bucket, with a new bucket every month and 6-month immutability (so data can't be changed/erased for 6 months)
  • Weekly incremental data backup on another off-site machine
  • Monthly (but I should start doing it weekly) backup of important data (mainly documents and photos) on removable media that I keep offline in a fire-proof safe
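A restic retention policy like the 2/2/2 scheme above can be sketched roughly as follows; the bucket name and path are invented, and B2 credentials are expected in the `B2_ACCOUNT_ID`/`B2_ACCOUNT_KEY` environment variables:

```
# back up to a B2-backed restic repository, then prune old snapshots
export RESTIC_REPOSITORY=b2:my-backup-bucket:server1
restic backup /srv/data
restic forget --prune --keep-daily 2 --keep-weekly 2 --keep-monthly 2
```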

Maybe it's overkill, maybe it's not enough; I'll know when something fails and I'm screwed, ahah.

As a note, everybody should test/check their backups frequently. I once had an issue after changing an IP address and only figured out six months later that half my backups were not working...

[–] [email protected] 3 points 2 years ago

I use....

  • Timeshift -> local backup onto my RAID array
  • borgbackup -> borgbase online backup
  • GlusterFS -> experimenting with replicating certain apps across 2 Raspberry Pis

[–] [email protected] 2 points 2 years ago

I back up an encrypted and heavily compressed archive to my local NAS and to Google Drive every night. The NAS keeps the version from the first of every month plus 7 days of prior history; Google Drive keeps just the latest.

[–] [email protected] 2 points 2 years ago

Nextcloud with folder sync for both mobile and PC, backs up everything I need.

[–] [email protected] 2 points 2 years ago

I just use duplicity and upload to Google Drive.

[–] [email protected] 2 points 2 years ago (2 children)

In the process of moving stuff over to Backblaze. Home PCs, a few clients' PCs, and client websites are all pointing at it now; happy with the service and price. Two Unraid instances push the most important data to an Azure storage account, but I imagine I'll move that to BB soon as well.
Docker backups are similar to the post above: tarball the whole thing weekly as a get-out-of-jail card. Not ideal, but it works for now until I can give it some more attention.

*I have no link to BB other than being a customer who wanted to reduce reliance on scripts and move stuff out of Azure for cost reasons.

[–] [email protected] 2 points 2 years ago (1 children)

I don't back up my personal files since they are all more or less contained in Proton Drive. I do run a handful of small databases, which I back up to ... telegram.

[–] [email protected] 2 points 2 years ago (1 children)

Ah, yes, the ole' "backup a database to telegram" trick. Who hasn't used that one?!?

[–] [email protected] 2 points 2 years ago (1 children)

I did. I split a GPG-encrypted tarball into 2 GB files and uploaded 600 GB to Saved Messages.

[–] [email protected] 2 points 2 years ago

It's just a matter of time until Telegram cracks down on this and limits the amount of cloud storage used. But until then, I'll happily use Telegram as a fourth backup.

[–] [email protected] 2 points 2 years ago (1 children)

An rsync script that does daily deltas using hardlinks. Found it on the Arch wiki. Works like a charm.

[–] [email protected] 2 points 2 years ago (2 children)

For my server I use duplicity, with a daily incremental backup and sending the encrypted diffs away. I researched a few more options some time ago but nothing really fit my use case, but I'm also not super happy with duplicity. Thanks for suggesting borgbackup.

For my personal data I have a Nextcloud on an RPi4 at my parents' place, which also syncs with a laptop I've left there. For offline and off-site storage, I use the good old strategy of bringing over an external hard drive, rsyncing to it, and bringing it back.

[–] [email protected] 2 points 2 years ago

I feel exactly the same. I've been using Duplicacy for a couple of years; it works, but I don't totally love it.

When I researched Borg, Restic, others, there were issues holding me back for each. Many are CLI-driven, which I don't mind for most tools. But when shit hits the fan and I need to restore, I really want to have a UI to make it simple (and easily browse file directories).

[–] [email protected] 2 points 2 years ago

Got a Veeam community instance running on each of my VMware nodes, backing up 9-10 VMs each.

Using Cloudberry for my desktop, laptop and a couple Windows VMs.

Borg for non-VMware Linux servers/VMs, including my WSL instances, game/AI baremetal rig, and some Proxmox VMs I've got hosted with a friend.

Each backup agent dumps its backups into a share on my NAS, which then has a cron task to do weekly uploads to GDrive. I also manually do a monthly copy to an HDD and store it off-site with a friend.

[–] [email protected] 2 points 2 years ago

btrfs and btrbk work very well, tutorial: https://mutschler.dev/linux/fedora-btrfs-35/
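For anyone curious, a minimal `btrbk.conf` along those lines might look like the following; the pool mount point, subvolume name and target path are assumptions, not taken from the linked tutorial:

```
snapshot_preserve_min   2d
snapshot_preserve       14d
target_preserve         14d 8w

volume /mnt/btr_pool
  snapshot_dir btrbk_snapshots
  subvolume home
    target /mnt/backup/btrbk
```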

[–] [email protected] 2 points 2 years ago

For PCs: daily incremental backups to local storage, daily syncs to my main Unraid server, and weekly off-site copies to a Raspberry Pi with a large external HDD running at a family member's place. The Unraid server itself has its config and all the local Docker stores also backed up to the off-site Pi. The most important stuff (pictures, recovery phrases, etc.) is further backed up in Google Drive.

[–] [email protected] 2 points 2 years ago

My important data is backed up via Synology DSM Hyper backup to:

  • Local external HDD attached via USB.
  • Remote to Backblaze (costs about $1/month for ~100 GB of data)

I also have Proxmox Backup Server back up all the VMs/CTs every few hours to the same external HDD used above; these backups aren't crucial, but they'd be helpful for rebuilding if something went down.

[–] [email protected] 2 points 2 years ago

In short: crontab, rsync, a local and a remote raspberry pi and cryptfs on usb-sticks.

[–] [email protected] 1 points 2 years ago

On my home network, devices are backed up using Time Machine over the network. I also use Backblaze to make a second backup of data to their cloud service, using my own private key. Lastly, I throw some backups on a USB drive that I keep in a fire safe.

[–] [email protected] 1 points 2 years ago

I realized at one point that the amount of data that is truly irreplaceable to me amounts to only ~500 GB. So I back this important data up to my NAS, then from there to Backblaze. I also create M-Discs: two sets, one for home and one I keep at a friend's place. Then, because "why not" and I already had them sitting around, I also back up two SD cards and keep them on-site and off-site.

I also back up my other data (TV/movies/music/etc.), but the sheer volume of data gives me only one option: a couple of USB hard drives I back up to from my NAS.

[–] [email protected] 1 points 2 years ago

Veeam community for me. Cross backup locally between my 2 servers at home, and then a copy job to an offsite NAS.

Have had to do restorations before, and never had any issues.

[–] [email protected] 1 points 2 years ago* (last edited 2 years ago)

For smaller backups (<10 GB each) I run a 3-phased approach:

  • rsync to a local folder /srv/backup/
  • rsync that to a remote nas
  • rclone that to a b2 bucket

These scripts run from cron, and I log the results to a file using the --log-file option for rsync/rclone so I can do spot checks.

This way I have access to the data locally if the network is down, remotely on a different networked machine for any other device that can browse it, and finally an offsite cloud backup.

Doing this setup manually through rsync/rclone has been important for getting the domain knowledge to think about the overall process: scheduling multiple backups at different times overnight so as not to overload the drive and network, ensuring versioning is stored for files that might require it, and ensuring I'm not using too many API calls for B2.

For large media backups (>200 GB) I only use the rclone script and set it to run for 3 hours every night after all the more important backups are finished. It's not important that it gets done ASAP; a steady drip of any changes up to B2 matters more.

My next step is to figure out a process to email the backup logs every so often, or to look into a full application to take over, with better error-catching capabilities.

For any service/process backed up this way, I try to document a spot-testing process to confirm it works every 6 months:

  • For my important documents, I add an entry to my KeePass DB, run the backup, navigate to the cloud service, download the new version of the DB, and confirm the recently added entry is present.
  • For an application, I run through a restore process and confirm certain config or data is present in the newly deployed app. This also forces me to have a fast restore script I can follow for any app if I need to do this every 6 months.

[–] [email protected] 1 points 2 years ago

As I have all my data in VMs on my home server, it's currently only daily backups to the NAS with Proxmox, but I should really add a remote NAS to have it backed up in case my local NAS breaks down.

[–] [email protected] 1 points 2 years ago

Fuck it, we ball.

