this post was submitted on 10 Aug 2023
43 points (97.8% liked)

Asklemmy


I have been using a VPS for a while to host some personal projects and services. Lately I have started thinking about moving all my git projects onto it as well. But at the moment, I'm not really sure how to go about off-site backups of the data. How do you usually run backups on your servers?

all 29 comments
[–] [email protected] 8 points 1 year ago (2 children)

I do not

I recompiled the darkplaces dedicated server for Xonotic (aarch64), overcoming many dependency hells and pitfalls

I would be absolutely devastated to lose it and that would be a net negative to the Indian xonotic community

[–] [email protected] 2 points 1 year ago (1 children)
[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

You're welcome!

I made this since I hated the high ping from the Australian servers

[–] [email protected] 1 points 1 year ago (1 children)

Fair! I run a US based instagib server.

[–] [email protected] 1 points 1 year ago (1 children)

Nice! I might drop in sometime, what's the name?

[–] [email protected] 1 points 1 year ago (1 children)

It's one of the few SMB servers left. SMB Chicago, or something like that.

[–] [email protected] 1 points 1 year ago (1 children)
[–] [email protected] 1 points 1 year ago

Often it's dead; people play once in a while. I've seen a max of 7 concurrent players, but at the same time I've got a Discord with half of them :P

[–] baatliwala 1 points 1 year ago

Indian xonotic community

How big is the Xonotic community over here anyway?

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (1 children)

Daily backup using Restic to Wasabi S3.

Restic already speaks S3 natively, no need to mount it or anything; just point it at a bucket and hand it an API key.

You can use an API key that's only allowed to read and write, but not delete/modify, so you've got some protection against ransomware.
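A minimal sketch of that setup (the bucket name, endpoint, and password-file path here are hypothetical placeholders; assumes restic is installed and the key pair is scoped to read/write only, no delete):

```shell
# Credentials for a read/write-only Wasabi key pair (placeholders).
export AWS_ACCESS_KEY_ID="<wasabi-key-id>"
export AWS_SECRET_ACCESS_KEY="<wasabi-secret>"
export RESTIC_REPOSITORY="s3:https://s3.wasabisys.com/my-backup-bucket"
export RESTIC_PASSWORD_FILE="/root/.restic-password"

restic init                # first run only: create the repository
restic backup /srv /etc    # daily run, e.g. from cron or a systemd timer
```

With the environment variables set, the daily cron entry only needs the `restic backup` line.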

[–] Elferrerito 1 points 1 year ago (1 children)

Thanks for sharing, I didn't know about Restic; I will definitely have a look

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

You can also feed database dumps directly into restic, like this:

mysqldump --defaults-file=/root/backup_scripts/.my.cnf --databases mydatabase | restic backup --stdin --stdin-filename mydatabase.sql

[–] sznowicki 3 points 1 year ago (1 children)

I only choose hosting that provides automated backups of the VPS. And it has to be a credible provider like Hetzner, which keeps those backups in a different location.

Additionally, if I have something really important, I do periodic backups to my local Mac, which has all sorts of backup processes (iCloud, Time Machine, plus an extra encrypted backup that I keep on, well… Hetzner)

[–] Elferrerito 1 points 1 year ago (1 children)

To be fair, I'm on Hetzner as well, but the paranoid side of me would want a fallback in the unlikely case that something happens to the company

[–] sznowicki 1 points 1 year ago

It's a German GmbH. From them announcing bankruptcy to switching off the servers would take some months.

[–] [email protected] 2 points 1 year ago (1 children)

Not using a VPS, but I use restic to back up my servers over SFTP to my NAS.

Works really well. I do daily incremental backups and keep one backup a day for the last week, one backup a week for the last 4 weeks, and one backup per month for the last 6 months.
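For reference, that retention policy maps directly onto restic's `forget` flags (the SFTP repository path and host here are hypothetical placeholders):

```shell
# Keep 7 dailies, 4 weeklies, 6 monthlies; --prune removes the
# data blobs that no remaining snapshot references.
restic -r sftp:backup@nas.local:/backups/server forget \
    --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
    --prune
```

Running this right after the nightly `restic backup` keeps the repository size bounded.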

[–] Elferrerito 1 points 1 year ago

Having the backups "in-house" is also something to explore, since this could then become the backup for other services as well

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

You can use a few tools.

rsync

rclone - you probably want this one over rsync, though.

Tarsnap

Duplicati

Restic

There's obviously a lot more, but these are some of the more popular ones.

Now you need a way to back it up. Probably the best way is to tar it up first and then ship that file off-site. You can also get something like Dead Man's Snitch to make sure your backups don't silently break.

As you mentioned, if this is just source code, then the best thing would be to put it under source control. Then you automate it, deploy the code when you make updates, and have a history of changes.

It sounds like tarsnap is your best bet though. It will be the cheapest.

You can also back up to another storage provider like Google, Dropbox, or even AWS S3. S3 can get costly, but you can archive everything to the Glacier tier, which is pretty cheap.

[–] Elferrerito 1 points 1 year ago (1 children)

Thank you for the suggestions, I have been planning on moving my git repositories out of GitHub, into something like gitea. You gave me a good starting point to research the available options.

[–] [email protected] 1 points 1 year ago

If you don't want to use a hosted provider, you can at least just start using git. Just do git init. Then you can start committing changes. This way, you at least have a history of changes. Then just back that folder up like normal.
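That whole workflow is just a few commands (the project path is a hypothetical example; assumes git is installed):

```shell
# Turn an existing project folder into a local git repository.
cd ~/projects/myproject
git init                         # create the .git history folder
git add .                        # stage everything
git commit -m "Initial commit"   # record the first snapshot
# From then on, back up the whole folder (including .git) as usual.
```

The full history lives inside the `.git` directory, so any file-level backup of the folder also backs up every past revision.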

[–] [email protected] 2 points 1 year ago (1 children)

I wrote a bash script that runs daily. It 7z's (AES-256) the databases (well... I dump the DBs as text and then 7z those files), the web files (mostly WordPress), user files, and all of /etc, generates a list of all installed packages, and then copies the archives to a timestamped folder on my Google Drive (I keep the last two nights, plus the last 3 Sundays).

TBH, the zipped content is around 1.5 GB for each backup, so my 17 GB of free GDrive space is more than enough. If I actually had a significant amount of data, I'd look into a more robust long-term solution.

If there was a catastrophic failure, it'd take me around six hours to rebuild a new server and test it.
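A rough sketch of a script like that (paths and the rclone remote name `gdrive:` are hypothetical placeholders; assumes 7z and rclone are installed and `BACKUP_PASSPHRASE` is set; rclone is one common way to copy to Google Drive, the original poster's upload tool isn't stated):

```shell
#!/bin/sh
set -eu
STAMP=$(date +%Y-%m-%d)
DEST="/tmp/backup-$STAMP"
mkdir -p "$DEST"

# Dump databases as text and record the installed package list.
mysqldump --all-databases > "$DEST/databases.sql"
dpkg --get-selections > "$DEST/packages.txt"

# AES-256-encrypted archives; -mhe=on also encrypts the file names.
7z a -p"$BACKUP_PASSPHRASE" -mhe=on "$DEST/etc.7z" /etc
7z a -p"$BACKUP_PASSPHRASE" -mhe=on "$DEST/www.7z" /var/www

# Copy everything to a timestamped folder on Google Drive.
rclone copy "$DEST" "gdrive:backups/$STAMP"
```

Pruning old timestamped folders (keep the last two nights plus the last three Sundays) would be a separate step at the end of the script.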

[–] Elferrerito 1 points 1 year ago

That is a good idea. I was thinking of doing something similar with S3 before deciding to check what other people were doing. Thanks

[–] ArmoredCavalry 2 points 1 year ago* (last edited 1 year ago) (1 children)

If it's something mission critical, consider following the 3-2-1 backup rule.

I tend to use whatever built-in snapshot option the service provider offers, and then for off-site backups can use something like Veeam (free for first 10 VMs / machines) - https://www.veeam.com/virtual-machine-backup-solution-free.html

[–] Elferrerito 2 points 1 year ago

Thanks for sharing, I will have a read through this

[–] eternal_peril 1 points 1 year ago

Restic to Backblaze

Inexpensive

Reliable

Easy to recover

[–] [email protected] 1 points 1 year ago

I don’t

[–] [email protected] 1 points 1 year ago

+1 on Restic