darkevilmac

joined 1 year ago
[–] [email protected] 7 points 1 year ago (1 children)

Yes and no; I've worked on the backend for big apps before. You generally try to keep backwards compatibility as long as possible to give clients time to update. You can't just change API routes and expect all the clients to be on the latest version overnight.
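A minimal sketch of what that looks like in practice (route paths and response shapes here are hypothetical, not from any particular app): the old versioned route stays up and returns its old shape while new clients opt into the new one.

```typescript
type Handler = (body: string) => string;

// Hypothetical route table: /v1 is kept alive purely for backwards
// compatibility while clients migrate to /v2.
const routes: Record<string, Handler> = {
  // Old clients still call this; removing it would break them overnight.
  "/v1/users": (body) => JSON.stringify({ name: body }),
  // New response shape; new clients opt in by calling /v2 explicitly.
  "/v2/users": (body) => JSON.stringify({ user: { displayName: body } }),
};

function handle(path: string, body: string): string {
  const handler = routes[path];
  if (!handler) return JSON.stringify({ error: "unknown route" });
  return handler(body);
}
```

Only once telemetry shows essentially no traffic on `/v1` can the old route be retired safely.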

[–] [email protected] 11 points 1 year ago

Excel, is that you?

[–] [email protected] 38 points 1 year ago (2 children)

Nice try dentist

[–] [email protected] 2 points 1 year ago

Rendering with JS definitely makes a difference; it's part of the reason SSR is such a big deal for SEO.
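A toy illustration of the difference (the markup here is made up for the example): a server-rendered response already contains the content, while a client-rendered page ships an empty shell that only fills in once JS runs, which a crawler may never execute.

```typescript
// Server-side rendering: the crawler sees the real content in the
// initial HTML response.
function renderServerSide(title: string, body: string): string {
  return `<html><body><h1>${title}</h1><p>${body}</p></body></html>`;
}

// Client-side rendering: a crawler that doesn't run JS sees only an
// empty mount point and a script tag.
function renderClientSide(): string {
  return `<html><body><div id="root"></div><script src="app.js"></script></body></html>`;
}
```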

[–] [email protected] 30 points 1 year ago* (last edited 1 year ago) (1 children)

It's all well and good to have a revolution, but if nobody knows you're having one then nothing really changes. There are still benefits to centralised services, one of which is scale. Effectively indexing that much data requires scale, which is why smaller search engines tend to be just white labels of things like Bing.

[–] [email protected] 3 points 1 year ago

Maybe, but I'm a bit more optimistic. I think even if they just ran something like a read-only service that pulls from other federated sources, the way their web crawlers do for regular sites, they'd basically be done.

The only concern there would be instances trying to block them, like everyone has been doing to Meta.

[–] [email protected] 74 points 1 year ago (26 children)

I feel like Google is going to have to find a way to effectively index federated content at some point. The only way to really get human-written information is from sites like Reddit and Twitter, and both of those platforms seem dedicated to completely imploding at the moment.

[–] [email protected] 17 points 1 year ago (2 children)

Don't forget poetry!