It's completely unsustainable though. So much new content is being generated by LLMs that other LLMs are training on it, and when that happens you get model collapse. It's already started happening in limited cases; there's an interesting paper on it here with examples. LLMs need a fresh feed of human-generated data to train on, and Reddit was arguably the world's best source. Otherwise it's just bots learning from other bots.
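For anyone curious what "bots learning from other bots" does in the limit, here's a toy sketch of the mechanism (my own illustration, not the paper's actual experiment): the "model" is just a fitted normal distribution, and each generation is retrained on synthetic samples from the previous generation instead of fresh real data. The truncation step stands in for a generator favouring its own most likely outputs, which is what kills the tails.

```python
import random
import statistics

# Toy model-collapse demo: generation 0 approximates the "real" data,
# every later generation trains only on the previous generation's output.
random.seed(0)

mu, sigma = 0.0, 1.0
stds = [sigma]
for generation in range(10):
    # sample a synthetic training set, biased toward the model's mode
    data = []
    while len(data) < 200:
        x = random.gauss(mu, sigma)
        if abs(x - mu) <= sigma:  # rare tail outputs rarely get generated
            data.append(x)
    # refit the next "model" on purely synthetic data
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    stds.append(sigma)

# variance shrinks every generation; diversity is gone within a few rounds
print(f"std per generation: {[round(s, 3) for s in stds]}")
```

Obviously a real LLM isn't a Gaussian, but the feedback loop is the same shape: train on your own output, lose the tails, converge on bland sameness.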
I 100% understand Reddit's desire to stop LLM makers from exploiting user data for free, we all should, but they've thrown the baby out with the bathwater and gone full late-stage capitalist. On one hand they say the content is the most valuable thing and restrict access to it, while simultaneously making it harder for the people who generate and moderate that content for free to keep doing so.
Yeah, this is the thing. I would have happily paid it before spez revealed himself to be an irredeemable piece of shit. Now I've no interest in filling his coffers. Policy needs to change and he needs to go, no negotiation: I don't trust him, and I don't think he's a good steward for the site.