this post was submitted on 31 Jul 2023
60 points (98.4% liked)

Linux for Leftists


A community for all leftists who want to join and be part of a community that talks about Linux, Unix, and the Free Software community

founded 3 years ago
top 15 comments
[–] [email protected] 17 points 1 year ago* (last edited 1 year ago) (2 children)

The original Google document was so vaguely written that I just assumed the smartasses were pretending they had invented SSL certificates. Only now did I get that it's for the server to verify the client, not the other way around, and that's on a whole new level of absurd. Boy, I sure love the idea of having anti-cheat for my browser. The comments on GitHub are fun to read, at the very least.
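
For context, here's roughly what that flow would look like from a page's point of view, going by the explainer. Since WEI never shipped, `getEnvironmentIntegrity()` and the `/wei/nonce` endpoint below are hypothetical stand-ins; the point is just that the signed verdict is about *your* environment and gets consumed by the server:

```typescript
// Rough sketch of the client side of the proposed Web Environment Integrity
// flow, pieced together from the public explainer. The API never shipped, so
// getEnvironmentIntegrity() and the /wei/nonce endpoint are hypothetical.
async function fetchWithAttestation(url: string): Promise<Response> {
  // 1. The page fetches a challenge ("content binding") from its own server.
  const nonce = await fetch("/wei/nonce").then((r) => r.text());

  // 2. The browser asks a platform attester to sign a verdict about the
  //    "integrity" of the client environment, bound to that nonce.
  const getIntegrity = (navigator as any).getEnvironmentIntegrity as (
    contentBinding: string,
  ) => Promise<unknown>;
  const token = await getIntegrity.call(navigator, nonce);

  // 3. The signed token goes back to the server, which verifies the attester's
  //    signature and decides whether this client gets the real content.
  return fetch(url, {
    headers: { "X-Environment-Integrity": String(token) },
  });
}
```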

That, coupled with Manifest V3 crippling adblockers, is way more reason than one needs to completely write off Google as anything more than the video hosting service for Invidious.

From a github user kescherCode:

I believe that whenever possible, we shall implement this spec into our own websites - but reversed. Whenever the attestation passes, let users not access your site. When it fails or this proposed API is unavailable - let users access your site.

Edit: the mad lad actually started implementing it here.
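
A minimal sketch of that reversed gate, assuming the API shape from the proposal (`getEnvironmentIntegrity()` is hypothetical here too): browsers that can't attest get in, browsers that attest successfully get turned away.

```typescript
// Minimal sketch of kescherCode's reversed gate: block clients whose browser
// can produce a WEI attestation, welcome everyone else. Hypothetical, since
// getEnvironmentIntegrity() was only ever a proposal and never shipped.
async function reverseIntegrityGate(): Promise<void> {
  const getIntegrity = (navigator as any).getEnvironmentIntegrity as
    | ((contentBinding: string) => Promise<unknown>)
    | undefined;

  // API absent: the browser doesn't implement WEI, so let the visitor in.
  if (typeof getIntegrity !== "function") return;

  try {
    // API present and attestation succeeds: this is exactly the "trusted"
    // environment the spec wants, so refuse to serve the page.
    await getIntegrity.call(navigator, crypto.randomUUID());
    document.body.innerHTML =
      "<p>Your browser attested itself. This site does not serve attested browsers.</p>";
  } catch {
    // Attestation failed: treat the visitor like everyone else and let them in.
  }
}

reverseIntegrityGate();
```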

[–] [email protected] 7 points 1 year ago

Haha what a hero! Hats off to him, that's great XD

[–] [email protected] 6 points 1 year ago

WebKit also won’t comment, because the Web Environment Integrity spec lives in a personal repo rather than under a standards group

[–] doppelgangmember 7 points 1 year ago (3 children)

Just made the switch back to DuckDuckGo

The Google results are atrocious. Fkin 7 "sponsored" results. Almost two pages of fluff

[–] [email protected] 10 points 1 year ago* (last edited 1 year ago) (1 children)

DuckDuckGo is only a metasearch engine; they don't have their own index. They use Bing's index and thus have very similar results, which are more often than not ~~terrible~~ ~~crap~~ absolute dogshit: SEO-manipulated, completely unhelpful, and you need to sift through them.

I would know, I use it on the daily and am currently looking for an alternative.

SearX looks the most promising, but I haven't really dived into it. Well... ever.

And DuckDuckGo still has me somewhat won over, because html.duckduckgo.com exists, along with something I can only call peak web design (no joke, I truly love it): lite.duckduckgo.com

[–] [email protected] 4 points 1 year ago

SearX, or even better, SearXNG, is probably the best option right now. But it is still just a metasearch engine. Unfortunately, there is currently no usable FOSS option with its own index.

[–] [email protected] 7 points 1 year ago (1 children)

Unless they reversed it, DuckDuckGo censors Russian news now. It's why I moved to Brave, but it lacks many features, so honestly I'm currently without any satisfactory search engine.

[–] doppelgangmember 6 points 1 year ago* (last edited 1 year ago) (1 children)

I used to like Brave,

then I realized they're a private, for-profit company that gets revenue from ads and selling data

Brave Browser Under Fire For Alleged Sale Of Copyrighted Data

[–] [email protected] 5 points 1 year ago (3 children)
[–] doppelgangmember 5 points 1 year ago

interesting

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

LibreX works similarly, and several instances allow Tor and I2P.
It runs with no JavaScript!

[–] [email protected] 3 points 1 year ago

nice, I wasn't aware of that one

[–] [email protected] 3 points 1 year ago

Does anyone else have problems with SearXNG/LibreX instances? I can’t get them to show results from the search engines lately.

[–] [email protected] 7 points 1 year ago (1 children)

I dropped Google almost entirely late last year, when every search of mine returned a dozen "Untitled" fake pages with spam text and possibly malware, thanks to some WordPress bug. How can the main product of a trillion-dollar company fail so badly against such basic search engine manipulation?

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Because it's not "failing"; the Internet is fragmenting into large factions. Step 1 has been capitalism commodifying information, resulting in most information taking one of two forms: siloed off into private databases, or SEO sites designed to funnel people into those silos. Step 2 is now taking the form of political factions as the world becomes multipolar again (China, the USA, and Russia, essentially). Large superpowers push their dominance in the form of information, each claiming they're right so you don't need to worry about all that other stuff.

The concept of the Internet being global is dead. The concept of the Internet being content created by regular people for regular people is dead. Search engines are just showing us this with their results, which is why sites like DuckDuckGo aren't even enough at this point. We are, in effect, entering a "dark age" of the Internet, where we can expect whole sections to simply not be accessible to us anymore.

To build an entirely new search engine, with broad indexing that shows us relevant, real results again, you'd need massive amounts of capital. You can't get that capital from private investment, because there's no profit in doing this. You can't get it from public funds, because it's not in the interest of governments to go against their chosen hegemony. You'd have to grass-roots the effort, which would be a very precarious endeavor that, even if it launched (which would be a miracle), would be difficult to maintain. Even models like Wikimedia aren't actually sustainable. Wikipedia ultimately caters to the Western world, as its moderators and editors largely share the same hegemonic viewpoint. And Wikimedia doesn't have to index the entire web, which, at minimum, requires substantially more compute than they currently use.

EDIT: It's also worth mentioning at this point that a project to "fix the Internet" is probably low on the list of things to "fix" right now. So any leftists taking on the challenge should be aware that they're spending gobs of resources and time on something that only the upper tier of individuals in this world care about. Time and money would be better spent on humanitarian aid as people are displaced by climate change, or on funding revolutions in areas of the world that are ready for it. By no means am I suggesting we should do a Google rewrite. I'm simply pointing out that everything we should have predicted capitalism would do to information is coming to pass, and while things are shitty, they'll definitely get shittier and we shouldn't be surprised. Nor should we really care much, IMO, because the core issue is just the contradictions within capitalism manifesting themselves in the Internet. Trying to fix the Internet is working backwards and, frankly, will probably be a fruitless endeavor. We Marxian-read leftists should know better.