this post was submitted on 08 Dec 2023

Programming

TL;DR: Without using an archive format like tar, zip, ..., how would you theoretically represent a symlink in a manner that can be stored in the cloud and retrieved back to the system as a symlink?

Backstory

I heavily use symlinks to organise my media and even wrote an application that helps me do so (it's in Python and is being rewritten in Rust). I also use tools like home-manager and nix, which make heavy use of symlinks.

My goal is to back up my media and /home to the cloud at regular intervals. There are services that cost just 60-100€ yearly for limitless cloud storage. So having part of my library purely in the cloud, even at terabytes of space, would cost less than a single 15TB HDD (500+€). For a local backup I'd need at least a second drive, which would put me at over 1000€, the equivalent of at least 10 years of cloud storage.

Options explored

rclone

It is pretty sweet, as it supports mounting a cloud drive as a folder and has transparent encryption! However, there are multiple open issues about uploading symlinks, and I don't know Go. I wouldn't mind trying to learn it if I had an idea of how to upload a symlink without following it (following a symlink uploads its target instead, which breaks the link).
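For reference on the "without following it" part: reading a link's raw target never dereferences it, and the target string itself is what needs to survive the round trip. A minimal Python sketch (the /mnt/media path is made up); if I recall the docs correctly, this is roughly what rclone's --links flag does, storing the target string in a *.rclonelink placeholder file:

```python
import os
import tempfile

def read_link_target(path: str) -> str:
    """Return the symlink's stored target string without following it."""
    # os.readlink never dereferences; it returns the raw target,
    # even if the target does not exist (a dangling link).
    return os.readlink(path)

# Demo with a made-up media path: create a dangling symlink and read it back.
with tempfile.TemporaryDirectory() as d:
    link = os.path.join(d, "movie.mkv")
    os.symlink("/mnt/media/movie.mkv", link)  # target need not exist
    target = read_link_target(link)
    # `target` is a short string that could be uploaded as a placeholder
    # file and turned back into a symlink on restore.
```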

git-annex etc.

git-annex, or a bare git repo with a remote worktree, is great, but I don't need to diff things or track how they moved around; I just need each backup to replace the previous one with a view of what's there. Plus, storing all that history would probably take an enormous amount of space, which is wasteful.

Ideas

store a blob of the stat() call for every file

I'm not sure about this. The stat struct does contain the file type (directory, hard link, symlink, ...), but my knowledge of Linux internals is limited, and maybe that's too complicated for this use case.

a db of links

Instead of storing the links themselves, I store a DB (SQLite? CSV?) of links, upload that DB, and use it to restore the links after pulling it back down. 🤔 Actually, this might be the simplest thing to do, but maybe y'all have better ideas.
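A minimal sketch of this idea in Python with sqlite3 (all paths here are made up): walk the tree, record every symlink's relative path and target, and recreate the links after pulling the files back down:

```python
import os
import sqlite3
import tempfile

def snapshot_links(root: str, db_path: str) -> None:
    """Walk `root` and record every symlink (relative path, target) in a SQLite DB."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS links (path TEXT PRIMARY KEY, target TEXT)")
    for dirpath, dirnames, filenames in os.walk(root):  # does not follow links
        for name in dirnames + filenames:
            full = os.path.join(dirpath, name)
            if os.path.islink(full):
                rel = os.path.relpath(full, root)
                con.execute("INSERT OR REPLACE INTO links VALUES (?, ?)",
                            (rel, os.readlink(full)))
    con.commit()
    con.close()

def restore_links(root: str, db_path: str) -> None:
    """Recreate every recorded symlink under `root` (e.g. after a pull)."""
    con = sqlite3.connect(db_path)
    for rel, link_target in con.execute("SELECT path, target FROM links"):
        full = os.path.join(root, rel)
        os.makedirs(os.path.dirname(full), exist_ok=True)
        if not os.path.lexists(full):  # don't clobber an existing entry
            os.symlink(link_target, full)
    con.close()

# Demo: snapshot a link, delete it, restore it from the DB.
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "media"))
    os.symlink("/mnt/pool/movie.mkv", os.path.join(root, "media", "movie.mkv"))
    db = os.path.join(root, "links.db")
    snapshot_links(root, db)
    os.remove(os.path.join(root, "media", "movie.mkv"))  # simulate a fresh pull
    restore_links(root, db)
    restored = os.readlink(os.path.join(root, "media", "movie.mkv"))
```

The regular files would still be synced by whatever tool handles the upload; the DB only carries the links, so it stays tiny.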

[–] cm0002 3 points 11 months ago (3 children)

I have no comment on the syncing problem, but could you let me know what the cheap cloud services are? I'm reaching the end of my rope with Google Workspace drive no longer being unlimited, and Dropbox has ended their "as much as you need" policy.

[–] [email protected] 3 points 11 months ago (1 children)

Jottacloud is what I want to use: unlimited storage for ~100€/year.

Close behind is 1fichier at 2€/TB/month or 12€/TB/year, but they are in France, and uptobox (a similar provider) was shut down by the US on French soil because it allowed sharing links to the files.

You can probably find others in the list of storage systems supported by rclone

[–] cm0002 2 points 11 months ago

Oh, yeah, I've been in the Google Workspace Unlimited alternatives thread on the rclone forums. Apparently Jotta will eventually hit a point in its "gradual slowdown" where it's practically worthless (IIRC it was around 10TB, so for Jotta, "unlimited" is functionally 10TB).

1fichier was also discussed, but I guess you have to re-upload any given file every 30 days or it will expire.
