this post was submitted on 04 Jul 2023
161 points (98.8% liked)

Selfhosted

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

I have a home server that I'm using to host files. I'm worried about it breaking and losing access to the files. So what method do you use to back up everything?

(page 2) 50 comments
[–] [email protected] 2 points 1 year ago (1 children)

A simple script using duplicity to FTP data to my private website, which has infinite storage. I can't say if it's good or not; it's my first time doing it.
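
For reference, a minimal sketch of what a duplicity-over-FTP script like this might look like (host, paths and retention below are placeholders, not the commenter's actual setup):

```bash
#!/bin/bash
# Minimal duplicity-over-FTP sketch; all names below are placeholders.
export FTP_PASSWORD='...'   # read by duplicity's FTP backend
export PASSPHRASE='...'     # used to encrypt the backup volumes

SRC=/srv/data
DEST=ftp://backupuser@example-webhost.com/backups/server

duplicity --full-if-older-than 1M "$SRC" "$DEST"   # incremental, new full chain monthly
duplicity remove-older-than 6M --force "$DEST"     # prune old backup chains

unset FTP_PASSWORD PASSPHRASE
```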

[–] stormcynk 1 points 1 year ago (2 children)

How do you have infinite storage? Gsuite?

[–] [email protected] 2 points 1 year ago

Some hosting sites advertise "unlimited" storage, but the fine print generally excludes "abusive users" from this policy. For web hosting, they'd probably consider backups of non-website data to a service intended for basic web hosting to be abuse.

Unless they have a home lab (sounds like they don't) or a fancy, expensive contract with a large cloud provider (unlikely), this is asking for trouble. Nobody offers unconditional data storage for free; it's always a loss leader for another service, and abuse will eventually get you banned.

[–] [email protected] 1 points 1 year ago

I can confirm that the terms and conditions discourage using it as a private cloud backup; it's only meant to host stuff related to the website. So far I've had no complaints, as I've been paying and kept the traffic to a minimum. I guess I'll have to switch to something more cloud-oriented if I keep expanding, but it's worked so far!

[–] [email protected] 2 points 1 year ago

Proxmox backs up the VMs -> backups are uploaded to the cloud.
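
As a rough illustration of that flow (VM IDs, storage name and the rclone remote are assumptions for illustration):

```bash
#!/bin/bash
# Back up two VMs with vzdump, then push the dumps to cloud storage.
vzdump 100 101 --mode snapshot --compress zstd --storage local

# The "local" storage usually keeps dumps under /var/lib/vz/dump;
# "remote-cloud" is a hypothetical rclone remote.
rclone copy /var/lib/vz/dump remote-cloud:proxmox-backups
```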

[–] [email protected] 2 points 1 year ago

I run everything in containers, so I rsync my entire docker directory to my NAS, which in turn backs it up to the cloud.
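
A one-line sketch of that rsync step, with an invented directory layout and NAS hostname:

```bash
# Mirror the whole Docker bind-mount directory to the NAS over SSH.
# Stopping write-heavy containers first avoids copying databases mid-write.
rsync -aAXH --delete /opt/docker/ backup@nas.local:/volume1/backups/docker/
```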

[–] Pika 2 points 1 year ago

I use Bacula to an external drive. It was a pain in the ass to configure, but once it's running it's super reliable and easily extended to other drives or folders.

[–] [email protected] 2 points 1 year ago

Cronjobs and rclone have been enough for me for the past year or so. Interestingly, I've only needed to restore from a backup once after a broken update. It felt great fixing that problem so easily.
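
A minimal sketch of a cron-plus-rclone setup along those lines (remote name, schedule and paths are assumptions):

```bash
#!/bin/bash
# /usr/local/bin/backup-rclone.sh, installed via crontab, e.g.:
#   30 3 * * * /usr/local/bin/backup-rclone.sh
# "offsite" is a hypothetical rclone remote.
rclone sync /srv/data offsite:server-backup \
    --backup-dir "offsite:server-backup-deleted/$(date +%F)" \
    --log-file /var/log/backup-rclone.log
```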

[–] TheWoozy 2 points 1 year ago

I have 2 servers that back up to each other. I also use B2 for photos and important stuff.

[–] Richard 2 points 1 year ago* (last edited 1 year ago)

My home server's a Windows box, so I use Backblaze, which has unlimited storage for a reasonable fixed price. I have around 11TB backed up. I pay the extra few dollars for the extended 12-month retention of deleted files, which has saved me a few times when I needed to restore a file I couldn't find.

Locally I run StableBit DrivePool, and content is mirrored and pooled using that, which covers me for drive failures.

[–] mosjek 2 points 1 year ago

My server uses ZFS, which allows me to create regular snapshots with sanoid. This makes it extremely easy to quickly recover individual services or VMs without consuming a lot of disk space. In case the server is not recoverable, I also send the incremental snapshots to a Pi clone with a large hard drive. If you use the native disk encryption, the snapshots can be sent encrypted without the second server having access to the data. This solution with ZFS and sanoid/syncoid has often made my life easier and, in my experience, uses less bandwidth and CPU.
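
For context, the replication step of a sanoid/syncoid setup like this can be a single command (pool, dataset and host names below are invented); sanoid itself takes and prunes the snapshots per /etc/sanoid/sanoid.conf, and the raw-send option keeps natively encrypted datasets encrypted in transit and on the target:

```bash
# Incrementally replicate snapshots to the backup box as a raw (still-encrypted) stream.
# "tank/services" and "backup-pi" are placeholders.
syncoid --sendoptions=w tank/services root@backup-pi:backup/services
```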

[–] [email protected] 2 points 1 year ago

I've recently begun using Duplicati to back up the data from my Docker containers, and VMware snapshots for the guest VM itself. I'm still struggling to work out how to automate the snapshots, so I do them manually.

[–] GammaScorpii 2 points 1 year ago (1 children)

TrueNAS ZFS snapshots, and then a weekly cron rsync to a Servarica VPS with unlimited expanding storage.

[–] [email protected] 2 points 1 year ago (1 children)

If you use a VPS as a backup target, you can also format it with ZFS and use replication. Sending snapshots is faster than using a file-level backup tool, especially with a lot of small files.

[–] GammaScorpii 1 points 1 year ago (1 children)

Interesting, I have noticed it's very slow with initial backups. So snapshot replication sends one large file? What if you want to recover individual files?

[–] [email protected] 1 points 1 year ago

You can access ZFS snapshots from the hidden .zfs folder at the root dir of your volume. From there you can restore individual files.

There is also a command line tool (httm) that lists all snapshotted versions of a file and allows you to restore them.

If the snapshot you want to restore from is on a remote machine, you can either send it over or scp/rsync the files from the .zfs directory.
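
A small sketch of both restore paths described above, using an invented dataset mountpoint, snapshot name and file:

```bash
# Browse snapshots through the hidden .zfs directory and copy a single file back.
ls /tank/data/.zfs/snapshot/
cp /tank/data/.zfs/snapshot/autosnap_2023-07-01_daily/docs/report.odt /tank/data/docs/

# Or let httm list every snapshotted version of the file and pick one to restore.
httm /tank/data/docs/report.odt
```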

[–] [email protected] 2 points 1 year ago

Almost all the services I host run in Docker containers (or as userland systemd services). What I back up are the SQLite databases containing the config or plain data. Every day, my NAS rsyncs the DBs from my server onto its local storage, and I have Hyper Backup back up the backups into an encrypted S3 bucket. HB keeps the last n versions and manages their lifecycle. It's all pretty handy!
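
The first two hops of that chain might look roughly like this (directory layout and hostnames are assumptions; Hyper Backup then handles the NAS-to-S3 leg from its own UI):

```bash
#!/bin/bash
# On the server: make consistent copies of each service's SQLite DB.
for db in /srv/docker/*/data/app.db; do
    sqlite3 "$db" ".backup '${db}.bak'"   # online backup, safe while the service runs
done

# On the NAS (pull side):
#   rsync -a --delete server:/srv/docker/ /volume1/backups/server-docker/
```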

[–] [email protected] 2 points 1 year ago

I have an rsync script that pulls a backup every night from my truenas server to my Synology.

I've been thinking about setting up something with rsync.net so I have a cloud copy of my most important files.

[–] [email protected] 2 points 1 year ago

btrfs send/receive to my NAS.
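
In sketch form, with made-up subvolume paths and NAS hostname:

```bash
# Take a read-only snapshot and stream it to the NAS.
SNAP=/mnt/data/.snapshots/data-$(date +%F)
btrfs subvolume snapshot -r /mnt/data "$SNAP"
btrfs send "$SNAP" | ssh nas.local btrfs receive /mnt/backup/snapshots/
# Later runs can pass "btrfs send -p <previous-snapshot>" for incremental transfers.
```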

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

ZFS array using striping and parity. Daily snapshots get backed up to another machine on the network. 2 external hard drives with mirrors of the backup rotate between my home and office weekly-ish.

I can lose 2 hard drives from the array at the same time without suffering data loss. Any accidentally deleted files can be restored from a snapshot. If my house is hit by a meteor, I lose at most 3-4 days of snapshots.

[–] [email protected] 1 points 1 year ago

Compressed pg_dump rsync’ed to off-site server.
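
A minimal sketch of that pipeline (database name, paths and the off-site host are placeholders):

```bash
#!/bin/bash
# -Fc writes PostgreSQL's compressed custom format, restorable with pg_restore.
pg_dump -Fc --file="/var/backups/myapp-$(date +%F).dump" myapp
rsync -a --delete /var/backups/ backup@offsite.example.com:/backups/postgres/
```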

[–] netburnr 1 points 1 year ago

Veeam Backup and Recovery's not-for-retail license covers up to 10 workloads. I then S3 offsite to Backblaze.

[–] [email protected] 1 points 1 year ago

Bash scripting and rclone personally; here is a video that helps: https://youtu.be/wUXSLmGAtgQ

[–] shrugal 1 points 1 year ago* (last edited 1 year ago)

My server is a DiskStation, so I use HyperBackup to do an encrypted backup of the important data to their Synology C2 service every night.

[–] [email protected] 1 points 1 year ago

If you are using kubernetes, you can use longhorn to provision PVCs. It offers easy S3 backup along with snapshots. It has saved me a few times.

[–] [email protected] 1 points 1 year ago

Don't overthink it: servers/workstations rsync to a NAS, then sync that NAS to another NAS offsite.

[–] [email protected] 1 points 1 year ago

It's kind of broken at the moment, but I have set up duplicity to create encrypted backups to Backblaze B2 buckets.

Of course, the proper way would be to back up to at least 2 more locations, perhaps a local NAS for starters. That could also be configured in duplicity.
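
For reference, pointing duplicity at a B2 bucket is just a different target URL (key ID, application key and bucket name below are placeholders):

```bash
export PASSPHRASE='...'   # encrypts the backup volumes
duplicity --full-if-older-than 1M /srv/data \
    "b2://KEY_ID:APPLICATION_KEY@my-backup-bucket/server"
```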

[–] [email protected] 1 points 1 year ago

I backup using a simple rsync script to a Hetzner storage box.

[–] [email protected] 1 points 1 year ago

ZFS RAID-Z2 pool. Not a perfect backup, but it covers disk failure (already lost one disk with no data loss) and accidental file deletion. I'm vulnerable to my house burning down, but overall I sleep well enough.

[–] rambos 1 points 1 year ago
  • kopia backup to 2nd disk
  • kopia backup to B2 cloud
  • Duplicati backup to Google Drive (only the most important folder, <1GB)

Most of the files are actually in Nextcloud, so I get one more copy of the files (not a backup) on my PC by syncing with the Nextcloud app.
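
A rough sketch of the two kopia destinations listed above (paths, bucket and keys are placeholders; kopia connects to one repository at a time unless you point it at separate config files):

```bash
# Repository on the second disk.
kopia repository create filesystem --path /mnt/disk2/kopia
kopia snapshot create /srv/data

# Repository in a B2 bucket (connect to it before snapshotting).
kopia repository create b2 --bucket my-backup-bucket --key-id KEY_ID --key APP_KEY
kopia snapshot create /srv/data
```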

