this post was submitted on 22 May 2024
349 points (91.4% liked)

[–] [email protected] 247 points 7 months ago (7 children)

The moment word got out that Reddit (and now Stack Overflow) were tightening their APIs so they could sell our conversations to AI companies was when the game was given away. And I'm sure there were moments or clues before that.

This was when the "you're the product if it's free" arrangement metastasised into "you're a data-farming serf for a feudal digital overlord whether you pay or not".

Google search transitioning from "good search engine for the internet" -> "bad search engine serving SEO crap and ads" -> "just use our AI and forget about the internet" is more of the same. That their search engine is dominated by SEO and ads is part of it ... the internet, i.e. other people's content, isn't valuable any more, not with any sovereignty or dignity, least of all the kind envisioned in the ideals of the internet.

The goal now is to be the new internet, where you can bet your ass that there will not be any Tim Berners-Lee open sourcing this. Instead, the internet that we all made is now a feudal landscape on which we all technically "live" and in which we all technically produce content, but which is now all owned, governed and consumed by big tech for their own profits.


I recall back around the start of YouTube, which IIRC was the first hype moment for the internet after the dotcom crash, there was talk about what structures would emerge on the internet ... whether new structures would be created or whether older economic structures would impose themselves and colonise the space. I wasn't thinking too hard at the time, but it seemed intuitive to me that older structures would at least try very hard to impose themselves.

But I never thought anything like this would happen. That the cloud, search/google, mega platforms and AI would swallow the whole thing up.

[–] [email protected] 47 points 7 months ago

Well that's a happy note on which to end this day

(Well written though, thank you)

[–] [email protected] 27 points 7 months ago (3 children)

Especially coming from Google, who was one of the good guys pushing open standards and interoperability.

[–] [email protected] 37 points 7 months ago

Power corrupts. Decentralize.

[–] [email protected] 10 points 7 months ago

Eh, open-sourcing is just good business, the only reason every big tech company doesn't is that loads of executives are stuck in the past. Of course having random people on the internet do labor for you for free is something Google would want. They get the advantage of tens of thousands of extra eyes on their code pointing out potential security vulnerabilities and they can just put all the really shady shit in proprietary blobs like Google Play Services, they're getting the best of both worlds as far as they're concerned.

Large publicly-traded companies do not do anything for the good of anyone but themselves, they are literally Legally Obligated to make the most profitable decisions for themselves at all times. If they're open-sourcing things it's to make money, not because they were "good guys".

[–] Hoxton 21 points 7 months ago (6 children)

Well said! I'm still wondering what happens when the inevitable ouroboros of AI content referencing AI content referencing AI content makes the whole internet a self-perpetuating mess of unreadable content and renders anything of value these companies once gained basically useless.

Would that eventually result in fresh, actual human created content only coming from social media? I guess clauses about using your likeness will be popping up in TikTok at some point (if they aren’t already)

[–] [email protected] 10 points 7 months ago* (last edited 7 months ago) (2 children)

I dunno, my feeling is that even if the hype dies down we’re not going back. Like a real transition has happened just like when Facebook took off.

Humans will still be in the loop through their prompts and various other bits and pieces and platforms (Reddit is still huge) … while we may just adjust to the new standard in the same way that many reported an inability to do deep reading after becoming regular internet users.

[–] [email protected] 8 points 7 months ago (4 children)

I think it'll end up like Facebook (the social media platform, not the company). Eventually you'll hit model collapse for new models trained off uncurated internet data once a critical portion of all online posts are made by AI, and it'll become Much more expensive to create quality, up-to-date datasets for new models. Older/less tech literate people will stay on the big, AI-dominated platforms getting their brains melted by increasingly compelling, individually-tailored AI propaganda and everyone else will move to newer, less enshittified platforms until the cycle repeats.
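Here's a toy sketch of that feedback loop, just a Gaussian re-fit on its own samples rather than anything like real LLM training, and with made-up numbers, but it shows how the spread of the data collapses once each generation mostly learns from the previous generation's output:

```python
# Toy sketch of "model collapse": each generation fits a Gaussian to the
# previous generation's output and then samples its own training data from
# that fit. With a small sample, estimation error compounds and the fitted
# spread drifts toward zero -- the tails of the original "human" data vanish.
# (Illustrative only; real LLM collapse is messier.)
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=0.0, scale=1.0, size=50)   # small "human" dataset

for generation in range(1, 101):
    mu, sigma = data.mean(), data.std()          # "train" on current data
    data = rng.normal(mu, sigma, size=50)        # next gen trains on model output
    if generation % 20 == 0:
        print(f"gen {generation:3d}: fitted std = {sigma:.3f}")
```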

Maybe we'll see an increase in discord/matrix style chatroom type social media, since it's easier to curate those and be relatively confident everyone in a particular server is human. I also think most current fediverse platforms are also marginally more resistant to AI bots because individual servers can have an application process that verifies your humanity, and then defederate from instances that don't do that.

Basically anything that can segment the Unceasing Firehose of traffic on the big social media platforms into smaller chunks that can be more effectively moderated, ideally by volunteers because a large tech company would probably just automate moderation and then you're back at square 1.

[–] NeoNachtwaechter 210 points 7 months ago* (last edited 7 months ago) (14 children)

"AGI is going to create tremendous wealth. And if that wealth is distributed—even if it’s not equitably distributed, but the closer it is to equitable distribution, it’s going to make everyone incredibly wealthy.”

So delusional.

Do they think that their AI will actually dig the cobalt from the mines, or will the AI simply be the one who sends the children in there to do the digging?

[–] lanolinoil 35 points 7 months ago (38 children)

It will design the machines that build the autonomous robots that mine the cobalt ... doing the jobs of several companies at once, and either freeing up several people to pursue leisure or the arts, or leaving them to starve to death after being abandoned by society.

[–] riodoro1 70 points 7 months ago (1 children)

Have you seen the real fucking world?

It’s gonna make the rich richer and the poor poorer. At least until the gilded age passes.

[–] lanolinoil 14 points 7 months ago

I agree and I gave that option as the last one in the list.

[–] [email protected] 11 points 7 months ago (24 children)

AI absolutely will not design machines.

It may be used within strict parameters to speed up the theoretical testing of types of bearings or hinges or alloys, predicting which ones would perform best under stress testing, prior to actual testing, to eliminate low-hanging fruit. But it absolutely will not generate a new idea for a machine, because it can't generate new ideas.
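For what it's worth, here's a rough sketch of that kind of constrained use: a surrogate model ranking hypothetical hinge designs by predicted failure load so only the most promising ones go to real stress tests. All the feature names and numbers are invented for illustration; the point is that it only interpolates within the already-tested design space, it doesn't invent a new mechanism.

```python
# Hypothetical sketch: rank candidate hinge designs with a regression
# surrogate trained on past physical tests, so only the top few candidates
# get sent to real stress testing.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Past physical tests: [thickness_mm, alloy_hardness, pivot_diameter_mm] -> failure load (kN)
X_tested = rng.uniform([1.0, 100, 3.0], [5.0, 400, 10.0], size=(200, 3))
y_tested = (2.0 * X_tested[:, 0] + 0.01 * X_tested[:, 1]
            + 0.5 * X_tested[:, 2] + rng.normal(0, 0.3, 200))  # stand-in for measurements

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tested, y_tested)

# Screen new candidate designs; only the strongest-looking ones go to the lab
candidates = rng.uniform([1.0, 100, 3.0], [5.0, 400, 10.0], size=(1000, 3))
predicted = surrogate.predict(candidates)
top = np.argsort(predicted)[::-1][:10]
print("candidates to stress-test for real:", top)
```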

[–] Passerby6497 33 points 7 months ago

if

This word is like Atlas, holding up the world's shittiest argument that anyone with 3 working braincells can see through.

[–] [email protected] 24 points 7 months ago

It isn't delusional, it is a lie.

[–] foggy 15 points 7 months ago

It's a big year in robotics, so, the former.

[–] [email protected] 9 points 7 months ago

By "create" they just mean "steal from the weaker ones".

It's the psychology of advertising a Ponzi scheme.

They say "we are going to rob someone, and if you participate you'll get a cut", but phrase it so that each person thinks they understand the scheme and that someone else will be the fool who gets robbed. Then the people who considered themselves smart find out that, well, they're the ones who've been robbed.

Humans are very eager to participate in that when they think it's all legal and they won't get caught.

The idea here is that the "AI" will help some people own others and it's better to be on the side of companies doing it.

I generally dislike our timeline in that, while dishonorable people are weaker than honorable people in the long term, it really sucks to live near a lot of dishonorable people who want to test that in the most direct way. It sucks even more when the whole world is in that situation.

[–] elias_griffin 78 points 7 months ago (7 children)

Quote from the subtitle of the article: "and you can't stop it."

Don't ever let life-deprived, perspective-bubble-wearing, uncompassionate, power-hungry manipulators ("News" people) tell you what you can and cannot do. It doesn't even pass the smell test.

My advice: if a media outlet tries to groom you into thinking that nothing you do matters, don't ever read it again.

[–] [email protected] 18 points 7 months ago

Closed it as soon as I saw the paywall anyway

[–] fukurthumz420 14 points 7 months ago

god, i love this statement. it's so true. people have to understand our collective power. even if the only tool we have is a hammer, we can still beat their doors down and crush them with it. all it takes is organization and willingness.

[–] [email protected] 70 points 7 months ago (3 children)

Any paywall that lets you read that much of the article before revealing itself can burn in hell, and has no hope of getting my business, purely out of spite.

[–] Rolando 15 points 7 months ago (5 children)

FWIW if you turn off scripts you can see the whole article.
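Something like this also works, a minimal sketch that assumes the article text is in the server-rendered HTML and the paywall overlay is added by client-side script (the URL below is a placeholder, not the actual article):

```python
# Fetch the server-rendered HTML directly; the paywall overlay comes from
# client-side JavaScript, which never runs here. Only works if the article
# text is actually present in the initial HTML.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/some-article"  # placeholder URL
html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text

soup = BeautifulSoup(html, "html.parser")
for p in soup.find_all("p"):
    print(p.get_text(strip=True))
```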

[–] [email protected] 59 points 7 months ago (3 children)

"We need you to reconsider... because we already did it and we're just looking for your stamp of approval after the fact."

[–] DarkCloud 29 points 7 months ago (1 children)

AI has barely started infecting things, it's still avoidable... Yet even at this early stage it's obvious these companies have no morality and are willing to break laws and violate social norms.

It's obvious they're evil and they've barely just begun.

[–] [email protected] 14 points 7 months ago

Corporations are as callous and mechanical as they have always been, with an ever expanding range of tools to exploit. They will do anything and everything they can unless it is less profitable to do it.

[–] fukurthumz420 53 points 7 months ago (2 children)

our collective time would be better spent destroying capitalism than trying to stop AI. AI is wonderful in the right social system.

[–] jj4211 12 points 7 months ago (6 children)

On the other hand, if the social system isn't the right one, a fully realized AI could hypothetically make it even more unreasonable and more tightly stuck the way it is.

[–] pixxelkick 37 points 7 months ago (8 children)

I mean, that's just how it has always worked, this isn't actually special to AI.

Tom Hanks does the voice for Woody in the Toy Story movies, but his brother Jim Hanks has a very similar voice, and since he isn't Tom Hanks he commands a lower salary.

So many video games and whatnot use Jim's voice for Woody instead to save a bunch of money, and/or because Tom is typically busy filming movies.

This isn't an abnormal situation, voice actors constantly have "sound alikes" that impersonate them and get paid literally because they sound similar.

OpenAI clearly did this.

It's hilarious, because normally fans are foaming at the mouth if a studio hires a new actor who sounds even a little bit different from the prior actor, yet no one bats an eye at studios' efforts to try really hard to find a new actor who sounds as close as possible.

Scarlett declined the offer and now she's malding that OpenAI went and found some other woman who sounds similar.

Them's the breaks; that's an incredibly common thing that happens in voice acting across the board, in video games, TV shows, movies, you name it.

OpenAI almost certainly would have won the court case if they were able to produce who they actually hired and said person could demo that their voice sounds the same as Gippity's.

If they did that, Scarlett wouldn't have a leg to stand on in court; she can't sue someone for having a similar voice to hers, lol.

[–] xhieron 50 points 7 months ago (13 children)

She sure can't. Sounds like all OpenAI has to do is produce the voice actor they used.

So where is she? ...

Right.

[–] NeoNachtwaechter 11 points 7 months ago (1 children)

Get real. They have made it like her deliberately. Not anybody "nearly alike". They even admitted it.

[–] [email protected] 20 points 7 months ago* (last edited 7 months ago) (5 children)

Yes but also no: the whole appeal is tied to her brand (her public image x her character in Her), unlike Woody, who is an original creation.

It's like doing a commercial using a lookalike dressed like the original guy and pretending that's a completely different actor.

[–] [email protected] 13 points 7 months ago (7 children)

The difference is that apparently they asked ScarJo first and she said no. When they ask Tom Hanks (or really his agent, I assume) the answer is "he's too busy with movies, try Jim".

[–] BrianTheeBiscuiteer 10 points 7 months ago

Well, in the "soundalike" situation you describe people were getting paid to voice things. Now it's just an AI model that's not getting paid and the people that made the model probably got paid even less than a soundalike voice actor would. It's just more money going to the top.

[–] athairmor 10 points 7 months ago

Scarlett actually would have a good case if she can show the court that people think it’s her. Tom Waits won a case against Frito Lay for “voice misappropriation” when they had someone imitate his voice for a commercial.

[–] Alpha71 35 points 7 months ago

"Yeah, let's go up against the woman who sued Disney and won What could go wrong!?"

[–] homesweethomeMrL 28 points 7 months ago

"The Johansson scandal is merely a reminder of AI's manifest-destiny philosophy: This is happening, whether you like it or not."

It's just so fitting that Microsoft is the company most fervently wallowing in it.

[–] FlyingSquid 15 points 7 months ago (2 children)

I hate that I have to keep saying this: no one seems to be talking about the fact that by giving their AI a human-like voice with simulated emotions, they inherently make it seem more trustworthy, and that will get more people to believe its hallucinations are true. And then there will be the people convinced it's really alive. This is fucking dangerous.

[–] suction 10 points 7 months ago (3 children)

It's still just an LLM, and therefore just autocomplete.
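For anyone who hasn't seen the "autocomplete" framing spelled out, here's a toy sketch. It's a word-level bigram counter rather than a neural net over tokens, but the loop is the same shape: predict the next item from what came before, append it, repeat.

```python
# Toy "autocomplete": pick the next word based on counts of what followed
# the current word in some text. An LLM does the analogous thing with a
# neural net over tokens and a much longer context window.
import random
from collections import Counter, defaultdict

text = "the model predicts the next word and the next word after that".split()
following = defaultdict(Counter)
for a, b in zip(text, text[1:]):
    following[a][b] += 1

word, out = "the", ["the"]
for _ in range(6):
    choices = following.get(word)
    if not choices:
        break
    word = random.choices(list(choices.keys()), weights=list(choices.values()))[0]
    out.append(word)
print(" ".join(out))
```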
