this post was submitted on 11 Jun 2023
233 points (98.7% liked)

Asklemmy

43893 readers
1098 users here now

A loosely moderated place to ask open-ended questions

If your post meets the following criteria, it's welcome here!

  1. Open-ended question
  2. Not offensive: at this point, we do not have the bandwidth to moderate overtly political discussions. Assume best intent and be excellent to each other.
  3. Not regarding Lemmy usage or support: for those, see the list of support communities and tools for finding communities
  4. Not ad nauseam inducing: please make sure it is a question that would be new to most members
  5. An actual topic of discussion


founded 5 years ago

I'm really enjoying Lemmy. I think we've got some growing pains in UI/UX and we're missing some key features (like community migration and actual redundancy). But how are we going to collectively pay for this? I saw an (unverified) post claiming Reddit made $400M from ads last year. Lemmy isn't going to be free either. Can someone with actual server experience chime in with some back-of-the-napkin math on how expensive it would be if everyone migrated from Reddit?

you are viewing a single comment's thread
[โ€“] nwithan8 7 points 1 year ago (2 children)

100GB is practically nothing nowadays.

There are people (myself included, not to brag) running home servers with literally hundreds of terabytes of data. At that ~0.3 GB/day growth rate, I alone could host 3,500 years' worth of data. Get some of those r/DataHoarder and r/HomeLab guys on here and Lemmy would never run out of space.
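Checking that claim against the figures in this thread (the ~0.3 GB/day rate cited upthread, and a hypothetical home server in the hundreds-of-terabytes range; both numbers are rough, not measurements):

```python
# Napkin math using figures from this thread (not measured values).
GB_PER_DAY = 0.3       # Lemmy-wide growth rate cited upthread
CAPACITY_TB = 384      # hypothetical large home server, in terabytes

capacity_gb = CAPACITY_TB * 1024
years = capacity_gb / GB_PER_DAY / 365
print(f"{years:,.0f} years of retention")  # roughly 3,600 years
```

So the "3,500 years" figure checks out for a server in that size class.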

[โ€“] [email protected] 4 points 1 year ago* (last edited 1 year ago) (1 children)

Considering Lemmy is absolutely tiny compared to Reddit, today's numbers don't tell us much. Every single instance needs to mirror data, and I still don't understand how this is supposed to scale to even a fraction of Reddit's size unless the federation ends up as just a few enormous instances that can afford it. It's not like everyone pitches in what they can -- every instance individually needs to be able to store the entire dataset (or at least the portion its users have subscribed to) and handle the associated synchronization traffic.

[โ€“] bizzwell 1 points 1 year ago (1 children)

What if there are a couple large archive mirrors and the posts on other servers have a life expectancy maybe based on time, but also engagement? Crucial posts could be stickied, but I don't see the need for everyone to hold onto everything forever. Even in the event of a catastrophic loss of the archives, the communities could still live on and rebuild.

[โ€“] [email protected] 1 points 1 year ago

Yeah, that's a good point. I imagine there'd have to be some compromise like that for smaller instances. How often do users load up content older than, say, a couple of weeks or a month? It could be a hindrance to the experience; it's hard for me to estimate (even for myself) how often that happens.

[โ€“] [email protected] 1 points 1 year ago

That is just one instance and that is a small amount of users. Reddit has about 430 million monthly active users. Let's say 1% move over to Lemmy. At Lemmy.world's rate of 4 GB per 22 days for ~1k users, supporting that 1% would mean roughly 0.8 terabytes of new data every day. Of course this would be spread over a number of instances... but you can start to see where the problem lies....
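Redoing that arithmetic with the rough figures quoted in this thread (4 GB per 22 days for ~1k Lemmy.world users, 430M Reddit monthly actives, 1% migrating), the daily growth comes out just under a terabyte:

```python
# All inputs are the rough figures quoted in this thread.
GB_PER_22_DAYS_PER_1K_USERS = 4      # Lemmy.world growth cited above
REDDIT_MAU = 430_000_000
MIGRATION_RATE = 0.01                # assume 1% of Reddit moves over

users = REDDIT_MAU * MIGRATION_RATE               # 4.3 million users
gb_per_user_day = GB_PER_22_DAYS_PER_1K_USERS / 22 / 1_000
tb_per_day = users * gb_per_user_day / 1_000
print(f"~{tb_per_day:.2f} TB of new data per day")  # ~0.78 TB/day
```

And under the every-instance-mirrors-everything model described above, each full mirror would need to absorb that much every day, not a share of it.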