This post was submitted on 25 Jul 2023

I'm trying to find a good method of making periodic, incremental backups. I assume that the most minimal approach would be to have a cron job run rsync periodically, but I'm curious what other solutions may exist.

I'm interested in both command-line, and GUI solutions.
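For reference, a minimal sketch of the cron + rsync approach mentioned above, with hypothetical paths and schedule; the --link-dest trick keeps hard-linked daily snapshots, so unchanged files cost no extra space:

    # crontab entry: run every night at 02:00 (% must be escaped as \% inside cron)
    0 2 * * * rsync -a --delete --link-dest=/mnt/backup/latest /home/user/ /mnt/backup/$(date +\%F)/ && ln -sfn /mnt/backup/$(date +\%F) /mnt/backup/latest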

(page 2) 50 comments
[–] InverseParallax 2 points 1 year ago

I do most of my work on NFS, with ZFS backing on RAIDZ2, and send snapshots for offline backup.

Don't have a serious offsite setup yet, but it's coming.
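For anyone unfamiliar with that workflow, a minimal sketch of sending a snapshot to offline storage, with hypothetical pool and path names:

    # take a snapshot, then stream it to a file on a removable/offline disk
    zfs snapshot tank/work@2023-07-25
    zfs send tank/work@2023-07-25 | gzip > /mnt/offline/tank-work-2023-07-25.zfs.gz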

[–] [email protected] 2 points 1 year ago

GitHub for projects, Syncthing to my NAS for some config files, and that's pretty much it; I don't care about the rest.

[–] donio 2 points 1 year ago* (last edited 1 year ago)

Restic since 2018, both to locally hosted storage and to remote storage over ssh. I have "stuff I care about" and "stuff that can be relatively easily replaced" fairly well separated, so my filtering rules are not too complicated. I used duplicity for many years before that, and afbackup to DLT IV tapes prior to that.
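A rough sketch of restic backing up to a remote repository over ssh (the sftp backend), with hypothetical host, repository path, and filter rules:

    restic -r sftp:user@backuphost:/srv/restic-repo init
    restic -r sftp:user@backuphost:/srv/restic-repo backup ~/stuff-i-care-about --exclude '*.tmp'
    # thin out old snapshots, keeping 7 daily and 4 weekly
    restic -r sftp:user@backuphost:/srv/restic-repo forget --keep-daily 7 --keep-weekly 4 --prune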

[–] HR_Pufnstuf 2 points 1 year ago (3 children)

ZFS send/receive and snapshots.

[–] [email protected] 2 points 1 year ago (1 children)

Does this method allow you to pick what you need to back up, or is it the entire filesystem?

[–] HR_Pufnstuf 2 points 1 year ago

It allows me to copy select datasets inside the pool.

So I can choose rpool/USERDATA/so-n-so_123xu4 for user so-n-so. I can also choose to copy some or all of rpool/ROOT/ubuntu_abcdef and its nested datasets.

I settle for backing up users and rpool/ROOT/ubuntu_abcdef, ignoring the stuff in the var datasets. This gets me my users' homes, root's home, and /opt. 'Tis all I need. I have snapshots and mirrored M.2 SSDs for handling most other problems (which I've not yet had).

The only bugger is /boot (on bpool). Kernel updates grow in there and fill it up, even if you remove them via apt... because snapshots. So I have to be careful to clean its snapshots.
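A sketch of what replicating just the selected datasets might look like on the command line, with hypothetical snapshot names and target host (the actual setup above may differ):

    # full send of one dataset tree the first time
    zfs snapshot -r rpool/USERDATA@backup-2023-07-25
    zfs send -R rpool/USERDATA@backup-2023-07-25 | ssh backuphost zfs recv -uF backup/USERDATA
    # later runs only send the difference between two snapshots
    zfs send -R -i @backup-2023-07-24 rpool/USERDATA@backup-2023-07-25 | ssh backuphost zfs recv -uF backup/USERDATA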

[–] cow 2 points 1 year ago

I use bupstash to back up to a server I built a few years ago.

[–] [email protected] 2 points 1 year ago

I use rsync to an external drive, but before that I toyed a bit with Pika Backup.

I don't automate my backup because I physically connect my drive to perform the task.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Setup

Machine A:

  • RAIDz1 takes care of single-disk failure
  • ZFS doing regular snapshots
  • Syncthing replicates the data off-site to Machine B

Machine B:

  • RAIDz1 takes care of single-disk failure
  • ZFS doing regular snapshots
  • Syncthing receiving data from Machine A

Implications

  • Any single-disk hardware failure on machine A or B results in no data loss
  • Physical destruction of A won't affect B and the other way around
  • Any accidentally deleted or changed file can be recovered from a previous snapshot
  • Any ZFS corruption at A doesn't affect B because send/recv isn't used. The two filesystems are completely independent
  • Any malicious data destruction on A can be recovered from a snapshot on B, even if the destruction itself gets replicated to B. The reverse is also true. A malicious actor would need root access on both A and B in order to destroy the data and the snapshots on both machines and prevent recovery
  • Any data destruction caused by Syncthing can be recovered from snapshot at A or B
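The comment doesn't say what drives the "regular snapshots"; a bare-bones way to do it (many setups use a tool like zfs-auto-snapshot or sanoid instead) could be a cron entry like this, with a hypothetical pool name:

    # hourly recursive snapshot of everything under the pool 'tank'
    0 * * * * /usr/sbin/zfs snapshot -r tank@auto-$(date +\%Y\%m\%d-\%H00)
    # old snapshots can then be pruned by hand or by a script, e.g.:
    # zfs destroy -r tank@auto-20230601-0000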
[–] [email protected] 1 points 1 year ago (2 children)

Most of my data is backed up to (or just stored on) a VPS in the first instance, and then I back up the VPS to a local NAS daily using rsnapshot (the NAS is just a few old hard drives attached to a Raspberry Pi until I can get something more robust). Very occasionally I'll back the NAS up to a separate drive. I also occasionally back up my laptop directly to a separate hard drive.

Not a particularly robust solution, but it gives me some peace of mind. I would like to build a better NAS that can support RAID, as I was never able to get it working with the Pi.
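For anyone curious, an rsnapshot setup along these lines might look roughly like the fragment below (hypothetical paths and hostname; fields in rsnapshot.conf are tab-separated, and cmd_ssh must be enabled for remote sources):

    # /etc/rsnapshot.conf (fragment)
    snapshot_root   /mnt/nas/rsnapshot/
    cmd_ssh         /usr/bin/ssh
    retain          daily   7
    retain          weekly  4
    backup          root@vps.example.com:/etc/      vps/
    backup          root@vps.example.com:/var/www/  vps/

    # crontab entry for the daily run
    30 4 * * * /usr/bin/rsnapshot daily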

[–] [email protected] 1 points 1 year ago

When I do something really dumb, I typically just use dd to create an image of the disk. I should probably find something better.
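A hedged sketch of such a dd disk image, with hypothetical device and output path (double-check if= and of= before running):

    dd if=/dev/sda of=/mnt/backup/sda-$(date +%F).img bs=4M status=progress conv=fsync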

[–] [email protected] 1 points 1 year ago

Restic to a Synology NAS, plus Synology software for cloud backup.

[–] [email protected] 1 points 1 year ago

Good ol' fashioned rsync once a day to a remote server running ZFS with daily ZFS snapshots (rsync.net). Very fast because it only needs to send changed/new files, and it has saved my hide several times when I needed to access deleted files or old versions of some files from the ZFS snapshots.

[–] danielfgom 1 points 1 year ago

Periodic backup to an external drive via Deja Dup. Plus, I keep all important docs in Google Drive. All photos are in Google Photos. So it's really only my music that isn't in the cloud, but I might try uploading it to Drive as well one day.

[–] ryannathans 1 points 1 year ago

Restic with the Deja Dup GUI

[–] [email protected] 1 points 1 year ago

Vorta + borgbase

The yearly subscription is cheap and fits my storage needs by quite some margin. It gives me peace of mind to have an off-site backup.

I also store my documents on Google Drive.
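Vorta is a GUI over BorgBackup, so the equivalent CLI workflow looks roughly like the sketch below (hypothetical repository URL and paths; BorgBase hands you the real ssh:// address):

    export BORG_REPO='ssh://user@repo.example.net/./backups'   # hypothetical; use the URL your provider gives you
    borg init --encryption=repokey-blake2
    borg create --stats --compression zstd ::'{hostname}-{now}' ~/Documents ~/Projects
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6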

[–] [email protected] 1 points 1 year ago (1 children)

I use Pika Backup, which uses BorgBackup under the hood. It's pretty good, with amazing documentation. The main issue I have with it is that it's really finicky and kind of a pain to set up, even if it "just works" after that.

[–] [email protected] 2 points 1 year ago (1 children)

Can you restore from it? That's the part I've always struggled with.

[–] [email protected] 1 points 1 year ago

The way Pika Backup handles it, it loads the backup as a folder you can browse. I've used it a few times when hopping distros to copy and paste stuff from my home folder. Not very elegant, but it works and is very intuitive, even if I wish I could just hit a button and reset everything to the snapshot.
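Since Pika Backup repositories are ordinary Borg repositories, the same browsing can also be done from the CLI when the GUI isn't available (hypothetical repository path and archive name):

    borg list /mnt/backup/pika-repo
    borg mount /mnt/backup/pika-repo::hostname-2023-07-25 /mnt/restore
    # copy files out, then unmount
    borg umount /mnt/restore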

[–] anarchyreloaded 1 points 1 year ago (2 children)

I use timeshift. It really is the best. For servers I go with restic.
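For reference, Timeshift also has a command-line interface alongside the GUI; a quick sketch (run as root):

    sudo timeshift --create --comments "before upgrade" --tags D
    sudo timeshift --list
    sudo timeshift --restore    # interactive restore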

[–] [email protected] 1 points 1 year ago (1 children)

@anarchyreloaded @Kalcifer
What is restic? I use Linux, but I don't know what that is.

[–] anarchyreloaded 1 points 1 year ago

A backup solution that creates encrypted snapshots of directories. https://restic.net/

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I use timeshift because it was pre-installed. But I can vouch for it; it works perfectly, and lets you choose and tweak every single thing in a legible user interface!

[–] [email protected] 1 points 1 year ago (1 children)

Anything important I keep in my Dropbox folder, so then I have a copy on my desktop, laptop, and in the cloud.

When I turn off my desktop, I use restic to back up my Dropbox folder to a local external hard drive, and then restic runs again to back up to Wasabi, which is a storage service like Amazon's S3.

Same exact process when I turn off my laptop... except sometimes I don't have my laptop's external HD plugged in, so that gets skipped.

So that's three local copies, two local backups, and two remote backup storage locations. Not bad.

Changes I might make:

  • add another remote location
  • rotate local physical backup device somewhere (that seems like a lot of work)
  • move to Nextcloud or Seafile instead of Dropbox

I used Seafile for a long time, but I couldn't keep it up, so I switched to Dropbox.

Advice, thoughts welcome.
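For the Wasabi leg mentioned above, restic's S3 backend works with any S3-compatible endpoint; a sketch with a hypothetical bucket name (not necessarily how this setup is actually wired up; credentials go in the usual AWS environment variables):

    export AWS_ACCESS_KEY_ID=...        # placeholder
    export AWS_SECRET_ACCESS_KEY=...    # placeholder
    restic -r s3:https://s3.wasabisys.com/my-backup-bucket init
    restic -r s3:https://s3.wasabisys.com/my-backup-bucket backup ~/Dropbox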

[–] [email protected] 1 points 1 year ago

I actually move my Documents, Pictures, and other important folders inside my Dropbox folder and symlink them back to their original locations.

This gives me the same Docs, Pics, etc. folders synced on every computer.
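A sketch of that move-and-symlink trick for one folder (hypothetical paths):

    mv ~/Documents ~/Dropbox/Documents
    ln -s ~/Dropbox/Documents ~/Documents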

[–] GustavoM 1 points 1 year ago

Either an external hard drive or a pendrive. Just put one of those on a keychain and voilà, a perfect backup solution that does not need internet access.

...it's not dumb if it (still) works. :^)

[–] [email protected] 1 points 1 year ago

I use duplicity to a drive mounted off a Pi for local, tarsnap for remote. Both are command-line tools; tarsnap charges for their servers based on exact usage. (And thanks for the reminder; I'm due for another review of exactly what parts of which drives I'm backing up.)
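A rough sketch of both tools with hypothetical paths and archive names (duplicity is incremental by default once a full backup exists; tarsnap assumes its key is already set up and bills on deduplicated bytes stored and transferred):

    # duplicity to the Pi-hosted drive, mounted locally
    duplicity /home/user file:///mnt/pi-backup/home-user
    # tarsnap archive named with the date
    tarsnap -c -f "home-$(date +%Y-%m-%d)" /home/user/documents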

[–] PhilBro 1 points 1 year ago

I run OpenMediaVault and back up using BorgBackup. Super easy to set up, use, and modify.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I've got an SMB server set up with a 12 TB server drive. Anything important gets put on there.

Edit: fixed spelling
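On the client side, putting things on an SMB share can be as simple as mounting it and copying over (hypothetical server, share, and credentials file):

    sudo mount -t cifs //nas.local/backup /mnt/backup -o credentials=/etc/smb-credentials,uid=$(id -u)
    cp -a ~/important /mnt/backup/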

[–] UWbadgers16 1 points 1 year ago

I use Timeshift for daily, weekly, and monthly rsync backups. Then I create image backups using Clonezilla every month or two. I try to follow the 3-2-1 principle (3 backups, 2 media, 1 offsite): local computer, external drive, Google Cloud.

[–] Trail 1 points 1 year ago

A separate NAS on an Atom CPU, with btrfs in RAID 10, exposed over NFS.
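A sketch of that kind of setup with hypothetical device names and export range (btrfs handles the RAID 10 layout itself, and the filesystem is then exported over NFS):

    # create the array and mount it (any member device can be named in the mount)
    mkfs.btrfs -d raid10 -m raid10 /dev/sda /dev/sdb /dev/sdc /dev/sdd
    mount /dev/sda /srv/nas
    # /etc/exports entry so clients on the LAN can mount it
    /srv/nas  192.168.1.0/24(rw,sync,no_subtree_check)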

[–] [email protected] 1 points 1 year ago

zfs snap and zfs send to an external drive or another server.
