submitted 2 months ago by [email protected] to c/[email protected]
[-] [email protected] 8 points 2 months ago* (last edited 2 months ago)

I like how the takeaway is not "Once AI exceeds our ability to understand or compete with it, humans will not be the ones building or controlling it anymore. We're in for a very very different future than what currently exists, so we better be pretty fuckin responsible with what trajectory is dialed in at the moment that that happens, because once the line is crossed there's no going back. I mean, we could maybe have a conversation about whether doing this is even a good idea in the first place, but the possibility of preventing it seems more vanishingly remote with every passing year, so at least we could make an emergency crash priority out of AI safety, like a couple of years ago ideally but definitely right now."

No, the takeaway is "Hey guys here's some career advice for the short term. I will not be taking questions concerning anything after that. Hey we made a new chip BTW."

[-] Anticorp 2 points 2 months ago* (last edited 2 months ago)

The underlying message is "buy our stock because our GPUs power AI"

[-] [email protected] 2 points 2 months ago

I agree with you wholeheartedly. Powerful CEOs are not the almighty visionaries they want people to believe they are.

Sometimes it's just hyperbole meant to prop up their business.

this post was submitted on 28 Feb 2024