this post was submitted on 20 Nov 2023
1514 points (98.5% liked)

Technology

[–] [email protected] 241 points 1 year ago (7 children)

I'd like to know why exactly the board fired Altman before I pass judgment one way or the other, especially given the mad rush by the investor class to reinstate him. It makes me especially curious that the employees are sticking up for him. My initial intuition was that MSFT convinced Altman to cross bridges that he shouldn't have (for $$$$), but I doubt that a little more now that the employees are sticking up for him. Something fucking weird is going on, and I'm dying to know what it is.

[–] scarabic 85 points 1 year ago* (last edited 1 year ago) (1 children)

Wanting to know why is reasonable, but it’s sus that we don’t already know. Why haven’t they made that clear? How did they think they could do this without a solid explanation? Why hasn’t one been delivered to put the rumors to rest?

It stinks of incompetence, or petty personal drama. Otherwise we’d know by now the very good reason they had.

[–] Tangent5280 43 points 1 year ago (2 children)

If there was something illegal going on, then all parties involved would have incentive to keep it under wraps.

[–] [email protected] 28 points 1 year ago

And possibly legal orders not to discuss it in public.

[–] scarabic 3 points 1 year ago

If this circus is what they consider “under wraps,” then I don’t know what wouldn’t be.

[–] [email protected] 51 points 1 year ago* (last edited 1 year ago) (1 children)

Altman wanted profit. The board (rightfully, and true to their mission) prioritized responsible, non-profit stewardship of AI. Employees now side with Altman out of greed and view the board as denying them their mega payday. Microsoft is dangling jobs for employees who want to jump ship and make as much money as possible. This whole thing seems pretty simple: greed (Altman, Microsoft, employees) vs the original non-profit mission (the board).

Edit: spelling

[–] [email protected] 9 points 1 year ago (1 children)

That's what I thought it was at first too. But regular employees aren't usually all that interested in their company being profit driven. Especially AI researchers. Most of those that I know are extremely passionate about ethics in AI.

But do they know things we don't know? They certainly might. Or it might just be bandwagoning or the like.

[–] [email protected] 8 points 1 year ago

But regular employees aren't usually all that interested in their company being profit driven. Especially AI researchers. Most of those that I know are extremely passionate about ethics in AI.

I would have thought so too of the employees, but threatening a move to Microsoft kinda says the opposite. That or they are just all-in on Altman as a person.

[–] [email protected] 34 points 1 year ago

The only explanation I can come up with is that the workers and Altman both agree on monetizing AI as much as possible. They're worried that if the board doesn't resign, the company will remain a non-profit that's more conservative about selling its products, so they won't get their share of the money that could be made.

[–] [email protected] 10 points 1 year ago (1 children)

Yeah, the speed at which MS snapped him up makes me think of Zampella and West from Infinity Ward.

[–] Chocrates 2 points 1 year ago (1 children)

Microsoft stock dropped 2% on the announcement; hiring him was just to stop the hemorrhaging while they figure out what to do.

[–] [email protected] 3 points 1 year ago

Isn't that more because MS owns a lot of OpenAI stock? But then 2% is neither here nor there anyway. More background noise than anything.

[–] [email protected] 5 points 1 year ago (7 children)
[–] [email protected] 38 points 1 year ago

The tone of the blog post is so amateurish I feel like I'm reading a reddit post on r/Cryptocurrency.

[–] I_Clean_Here 22 points 1 year ago

Don't get me wrong, this move from the board reeks of some grade A bullshit, but this article is absolute crap. Is this supposed to be serious journalism?

[–] [email protected] 16 points 1 year ago (1 children)

Thanks for sharing. That is... weird in ways I didn't anticipate. "Weird cult of pseudointellectuals upending the biggest name in Silicon Valley" wasn't on my bingo card.

[–] [email protected] 14 points 1 year ago (2 children)

IMO there are some good reasons to be concerned about AI, but those reasons are along the lines of "it's going to be massively disruptive to the economy and we need to prepare for that to ensure it's a net positive", not "it's going to take over our minds and turn us into paperclips."

[–] SpruceBringsteen 7 points 1 year ago

Social media already did that.

Not the paperclips part, that might actually be of some use.

[–] [email protected] 2 points 1 year ago

The author did a poor job of explaining that. He’s referencing the paperclip-maximizer thought experiment: a businessman instructs a superhumanly effective AI to make paperclips. Given a terse enough objective and an effective enough AI, one can imagine a scenario in which the businessman, and in fact the whole world, are turned into paperclips. This is obviously not the businessman’s goal, but it was the instruction he gave the AI. The implication of the thought experiment is that AI needs guardrails, perhaps even ethics, or else it can unintentionally result in a doomsday scenario.
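A toy sketch (mine, not from the article) of why a bare objective goes wrong: an optimizer scored only on paperclip count has no reason to leave anything unconverted, because nothing in its objective distinguishes feedstock it may use from things we care about. All names here are made up for illustration.

```python
def maximize_paperclips(resources):
    """Greedily convert every available resource into paperclips.

    `resources` maps resource name -> units of raw material.
    Returns (paperclips_made, remaining_resources).
    """
    paperclips = 0
    for name in list(resources):
        # The objective never distinguishes "steel stockpile" from
        # "things humans need"; both are just feedstock to consume.
        paperclips += resources.pop(name)
    return paperclips, resources

world = {"steel": 1000, "factories": 50, "everything_else": 10**6}
clips, leftover = maximize_paperclips(world)
# The agent met its stated objective perfectly -- and `world` is now empty.
```

The point of the thought experiment is exactly this gap: the code is "correct" with respect to the objective it was given, and catastrophic with respect to the objective we meant.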

[–] [email protected] 6 points 1 year ago

I don't know a lot about the background but this article feels super biased against one side.

[–] [email protected] 2 points 1 year ago (1 children)

Can somebody explain the following quote in the article for me please?

Rationalists’ chronic inability to talk like regular humans may even explain the statement calling Altman a liar.

[–] [email protected] 1 points 1 year ago

Imagine "Roko's basilisk", but extended into an entire philosophy. It's the idea that "we" need to do anything and everything to create the inevitable ultimate super-AI, as fast as possible. Climate change, wars, exploitation, suffering? None of that matters compared to the benefits humanity stands to gain when the ultimate super-AI goes online.

[–] [email protected] 1 points 1 year ago

A duel between hucksters and the delusional makes sense. The delusional rely on the hucksters for funding, though, whether they want to or not. No heroes.

[–] CrayonRosary 1 points 1 year ago* (last edited 1 year ago)

That was an entertaining read. Thank you.

Even better, though, was this linked article about humans running AI behind the curtain.

https://amycastor.com/2023/09/12/pivot-to-ai-pay-no-attention-to-the-man-behind-the-curtain/

[–] [email protected] 1 points 1 year ago (1 children)

I don't think MSFT convinced him with money, but rather with opportunity. He clearly still wants to work with AI, and the second-best place for that after OpenAI is Microsoft.

[–] [email protected] 1 points 1 year ago (1 children)

Second best would be Google, but for him it's Microsoft, because he's probably getting a sweetheart deal and stays in control of his destiny (not really, but at least for a short while).

[–] [email protected] 0 points 1 year ago (1 children)

Microsoft has access to a lot of OpenAI's code, weights, etc., and he's already been working with them. It would be much better for him than joining some other company he has no experience with.

[–] [email protected] 5 points 1 year ago

He's not the guy who writes code; he's a VC or management guy. You might say he has good ideas, as the ChatGPT interface is attributed to him, but he didn't build it.