Prayer
Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.
- Don't duplicate the full text of your blog or github here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues on the community? Report it using the report flag.
Questions? DM the mods!
You don't have to worry about the backups. It's the data recovery that will require divine intervention.
Jesus is my ~~copilot~~ raid parity.
Raid is backup right?
of course /s
It protects against drive failure. That is the threat I am most worried about, so it's fine for me.
That's the thing. I don't.
Two hard drives of the same size, one on site and one off site.
Where do you keep your off-site one? Like a friend or family member's house?
I keep one in a bank deposit box. It costs like $10/year, it's fireproof and climate controlled, and it's exactly the right size for a 3.5" disk. I rotate every couple of months, because it's like a 10-15 minute process to get into the vault.
So your backed up data can be as old as a couple of months and requires manual interaction? I guess that's better than nothing, but I'm looking for something more automated. I'm not sure what my options are for cloud storage, or whether they're safe from deletion. Or if having it in a closet in a friend's house is really the best option.
I have a live local backup to guard against hardware/system failure. I figure the only reason I'd have to go to the off-site backup is destruction of my home, and if that ever happens then recreating a couple of months worth of critical data will not be an undue burden.
If I had work or consulting product on my home systems, I'd probably keep a cloud backup by daily rsync, but I'm not going to spend the bandwidth to remote backup the whole system off site. It's bad enough bringing down a few tens of gigabytes - sending up several terabytes, even in the background, just isn't practical for me.
At home and at the shop where I work. At work the drives are actually stored in a Faraday cage.
I wrote my own thing. I didn't understand how the standard options worked so I gave up.
Tape is the best medium for archiving data.
I really want to use tape for backups, but holy expensive. Those tape drives are thousands of dollars.
I bought an incredibly overkill tape system a few years ago and then the power supply exploded in it and I never bothered to replace it. Still, definitely worth it
Local to synology. Synology to AWS with synology's backup app. It costs me pennies per day.
Manually plug in a few disks every once in a while and copy the important stuff. Disks are offline for the most part.
I keep important files on my NAS, and use Borgbackup with Borgmatic for backups. I've got a storage VPS with HostHatch that's $10/month for 10TB space (was a special Black Friday deal a few years ago).
Make sure you don't just have one backup copy. If you discover that a file was corrupted three weeks ago, you should be able to restore the file from a three week old backup. rsync and rclone will only give you a single backup. Borg dedupes files across backups so storing months of daily backups often isn't a problem, especially if the files rarely change.
Also make sure that ransomware or an attacker can't mess up your backup. This means it should NOT be mounted as a file system on the client, and ideally the backup system has some way of allowing new backups while disallowing deleting old ones from the client side. Borg's "append only" mode is perfect for this. Even if an attacker were to get onto your client system and try to delete the backups, Borg's append-only mode just marks them as deleted until you run a compact on the server side, so you can easily recover.
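For reference, here's a minimal sketch of what an append-only Borg setup over SSH might look like. Repo paths, the key, and hostnames are all placeholders, and pruning is deliberately done only from the trusted server side:

```shell
# On the backup server: restrict the client's SSH key to append-only access
# (one line in ~/.ssh/authorized_keys on the server):
command="borg serve --append-only --restrict-to-path /backups/client1",restrict ssh-ed25519 AAAA... client1

# On the client: create the repo and run a daily backup
borg init --encryption=repokey ssh://backup@server/backups/client1
borg create --stats ssh://backup@server/backups/client1::{hostname}-{now} /home /etc

# Prune and compact only from the trusted server side; until compact runs,
# deletions from the client are just markers and can be rolled back.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /backups/client1
borg compact /backups/client1
```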
The only type of data I care about is photos and video I’ve taken. Everything else is replaceable.
My phone -> Immich -> Backblaze B2, and some Google Drive.
Linux isos I can always redownload.
I do an automated nightly backup via restic to Backblaze B2. Every month, I manually run a script to copy the latest backup from B2 to two local HDDs that I keep offline. Every half a year I recover the latest backup on my PC to make sure everything works in case I need it. For peace of mind, my automated backup includes a health check through healthchecks.io, so if anything goes wrong, I get a notification.
It's pretty low-maintenance and gives a high degree of resilience:
- A ransomware attack won't affect my local HDDs, so at most I'll lose a month's worth of data.
- A house fire or server failure won't affect B2, so at most I'll lose a day's worth of data.
restic has been very solid, includes encryption out of the box, and I like the simplicity of it. Easily automated with cron etc. Backblaze B2 is one of the cheapest cloud storage providers I could find, an alternative might be Wasabi if you have >1TB of data.
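A sketch of what that kind of nightly restic-to-B2 job with a healthchecks.io ping can look like. The bucket name, credentials, paths, and check UUID are all placeholders:

```shell
#!/bin/sh
set -eu
# B2 credentials and repo location (placeholders - use your own)
export B2_ACCOUNT_ID="your-key-id"
export B2_ACCOUNT_KEY="your-app-key"
export RESTIC_REPOSITORY="b2:my-backup-bucket:server"
export RESTIC_PASSWORD_FILE="/root/.restic-password"

restic backup /srv/data /etc
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 12 --prune

# Ping healthchecks.io only if everything above succeeded;
# a missed ping triggers the notification.
curl -fsS -m 10 --retry 3 "https://hc-ping.com/your-check-uuid" > /dev/null
```

Run it from cron with something like `30 2 * * * /usr/local/bin/nightly-backup.sh`.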
How much are you backing up? Admittedly Backblaze looks cheap, but at $6/TB/month that leaves me at $84 per month, or just over $1,000 per year.
I'm seriously considering an RPi 3 with a couple of external disks in an outbuilding instead of cloud.
rclone to dropbox and opendrive for things I care about like photo backups and RAW backups, and an encrypted rclone volume to both for things that need to be backed up, but also kept secure, such as scans of my tax returns, mortgage paperwork, etc. I maintain this script for the actual rclone automation via cron
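An encrypted rclone remote layered over a cloud remote can be sketched like this. The remote names and paths are made up, and the crypt remote is normally created interactively with `rclone config`:

```shell
# Assuming a "dropbox:" remote already exists, layer an encrypted
# crypt remote on top of it:
rclone config create secure crypt \
    remote dropbox:encrypted \
    password "$(rclone obscure 'correct horse battery staple')"

# Plain sync for photos, encrypted sync for sensitive documents:
rclone sync /data/photos dropbox:photos
rclone sync /data/documents secure:
```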
I sync all my files across 4 different computers in my house (rsync and Nextcloud) and then backups on OneDrive and Google Drive.
Synology NAS where all computers get backed up to locally. Restic for Linux, Time Machine for Mac, active backup for Windows.
NAS backs most of its data (that I trust enough to put on the cloud) encrypted to Google drive every night, occasionally I back the NAS up to an external 8tb hard-drive.
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I've seen in this thread:
Fewer Letters | More Letters |
---|---|
ESXi | VMWare virtual machine hypervisor |
Git | Popular version control system, primarily for code |
NAS | Network-Attached Storage |
RAID | Redundant Array of Independent Disks for mass storage |
SSD | Solid State Drive mass storage |
VPS | Virtual Private Server (opposed to shared hosting) |
[Thread #188 for this sub, first seen 5th Oct 2023, 00:05]
I have a Synology NAS that holds all my important data. Then it does nightly backups to Synology C2.
I have a cheap 2 bay synology NAS that acts solely as a backup server for my main NAS in an offsite location as well as a USB drive locally.
Backups run every night with duplicacy
I exclude media files (movies, TV shows, ...) from my backup routine due to the sheer amount of data accumulated over time, and the fact that most of it can be re-acquired from public sources if disaster recovery is ever needed.
Proxmox backs up to pbs and pbs is synced to B2 with rclone.
Other stuff is restic to b2.
Device sync to nextcloud -> rsync data & db onto NAS -> nightly backup to rsync.net and quarterly offsite/offline HDD swaps.
I also copy Zoneminder recordings, configs, some server logs, and my main machine’s ~/ onto the NAS.
The offsite HDD is just a bog standard USB 4TB drive with one big LUKS2 volume on it.
It’s all relatively simple. It’s easy to complicate your backups to the point where you rely on Veeam checkpointing your ESXI disks and replicating incrementals to another device that puts them all back together… but it’s much better to have a system that’s simple and just works.
I backup my ESXi VMs and NAS file shares to local server storage using an encrypted Veeam job and have a copy job to a local NAS with iSCSI storage presented.
From there I have another host VM accessing that same iSCSI share, uploading the encrypted backup to Backblaze. Unlimited "local" storage for $70/yr? Yes please! (iSCSI appears local to Backblaze. They know and have already stated they don't care.)
I'm backing up about 4TB to them currently using this method.
I do exactly the same. I do not have a lot of data I feel a need to backup. I have a nightly job that zips and then encrypts my data, then rclones it to off site storage.
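That kind of job fits in a few lines of shell. A sketch, assuming gpg and an rclone remote named `offsite:` are already configured; paths and the recipient key are placeholders:

```shell
#!/bin/sh
set -eu
STAMP=$(date +%F)
ARCHIVE="/tmp/backup-$STAMP.tar.gz.gpg"

# Compress, encrypt to a GPG key, then ship off-site
tar -czf - /srv/data \
    | gpg --encrypt --recipient backup@example.com --output "$ARCHIVE"
rclone copy "$ARCHIVE" offsite:backups/
rm -f "$ARCHIVE"
```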
I've finally settled on Duplicacy. I've tried several CLI tools, messed with and love rclone, tried the other GUI backup tools, circled back to Duplicacy.
I run a weekly app data backup of my unRAID docker containers, which is stored on an external SSD attached via USB to the server. The day after that runs, Duplicacy does a backup of that folder to Backblaze B2. My Immich library is backed up nightly and also sent to B2 via Duplicacy. Currently, those are the only bits of critical data on the server. I will add more as I finalize a client backup for the Win10, Linux, and MacOS devices at home, but it will follow this trend.
Various HDD full data backups maintained with FreeFileSync, important files backup on ProtonDrive. Multi-device autosync with Syncthing (phones, tablet, pcs)
I miss back in the day. I used to be able to store all my stuff on CD-Rs; hell, before that it was floppies. File sizes have grown exponentially, and programs/apps are all huge now. Pictures and videos are my biggest issue, but I'd also like to back up games that I've downloaded so I don't have to download them again. I can back up old games no problem, but modern games? Many are 100+ GB now, and in time they all will be; 200 GB will be the standard, then a terabyte and more.
Anyway, until I can afford and find a 20 TB SSD, I'm just using DVDs for everything but games and large programs. Quick to write, solid, tangible, etc. If I could afford a bunch of flash drives I'd probably do that instead.
If you can afford it and it's important data, I'd ofc recommend backing up to a large SSD, THEN to a cloud (or more) as a failsafe, then also using flash drives/DVDs etc. for an additional failsafe for the super important stuff.
I mean, if it's important backup all you can.
I've got priceless memories in my Google Photos library, but ofc Google removed being able to view them in my native photos app and download them easily. So instead I either have to back up and save ALL of it in Google Drive, or download specific albums. Idk, so I wouldn't personally recommend Google as a true backup, as you never know; personally I'd just use DVDs and flash drives for that stuff.
I have two machines that back up to a local server using Borg. That whole server in turn backs up to Jottacloud using restic with encryption enabled.
By the way, I wouldn't use rclone for backups. Use restic or something similar that does incremental backups. Because if you do rclone and then later discover that some files were corrupted locally, then your files are gone. With incremental backups you would still be able to retrieve them.
Oh, or do you mean backing up the stuff that is on the cloud?
I do an s3 sync every five minutes of my important files to a versioned bucket in AWS, with S3-IA and glacier instant retrieval policies, depending on directory. This also doubles as my Dropbox replacement, and I use S3 explorer to view/sync from my phone.
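As a sketch, that setup is roughly a cron entry plus a versioned bucket; the bucket name and paths below are placeholders:

```shell
# crontab: sync every five minutes; bucket versioning keeps file history
*/5 * * * * aws s3 sync /home/me/important s3://my-backup-bucket/important --storage-class STANDARD_IA

# One-time bucket setup: enable versioning so overwritten or deleted
# files remain retrievable as older versions
aws s3api put-bucket-versioning --bucket my-backup-bucket \
    --versioning-configuration Status=Enabled
```

Per-directory storage classes (e.g. `GLACIER_IR` for rarely touched trees) would be separate sync invocations with different `--storage-class` values.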
Everything to Crashplan.
Critical data also goes to Tarsnap.
I use borg
restic to Wasabi.