ShakingMyHead

joined 5 months ago
[–] [email protected] 8 points 1 month ago

Newsom also signed AB 1008, which clarifies that any personal data fed to an AI model retains the same privacy rights it would otherwise — including the consumer’s right to correct and delete their personal information. That’ll be a fun one to implement.

I think what it actually clarified is that personal information generated by an AI model is now covered under the law, instead of just what is used as training data.

[–] [email protected] 5 points 1 month ago

And, of course, we know that once one company uses those chats, it becomes literally impossible for anyone else to use them.

[–] [email protected] 11 points 1 month ago (2 children)

Investors demand growth. The problem is that Microsoft has basically won Capitalism and has no real area left to grow into.

[–] [email protected] 5 points 1 month ago (1 children)

I don't think they're planning to build the 5GW datacentres on TMI, though.

[–] [email protected] 4 points 1 month ago (3 children)

The theoretical Three Mile Island datacentre is probably the end-point of all of this. I think that what they build there (should it actually happen) has to be AGI. Like, unambiguously AGI. Where even the most hardline AI skeptics will react with "Holy shit they actually did it."

Anything less and it'll be nearly impossible to justify building 5 nuclear reactors just to power one of these datacentres.

So even if it is happening, it's still not happening.

[–] [email protected] 8 points 1 month ago (6 children)

Building 5-7 5GW facilities full of GPUs is going to require an extremely large amount of silicon. Not to mention the 25-35 nuclear power plants they apparently want to build to power them.
So on the list of things not happening...

[–] [email protected] 4 points 2 months ago* (last edited 2 months ago)

I believe the future is going to be so bright that no one can do it justice by trying to write about it now.

Uh

[–] [email protected] 6 points 2 months ago

Finally, of course, it is very much not just rationalists who believe that AI represents an existential risk. We just got there twenty years early.

This one?

[–] [email protected] 8 points 2 months ago

We could also just fluoridate the water supply, which also massively reduces cavities.

[–] [email protected] 11 points 2 months ago (2 children)

Obligatory note that, speaking as a rationalist-tribe member, to a first approximation nobody in the community is actually interested in the Basilisk and hasn’t been for at least a decade.

Sure, but that doesn't change the fact that the head EA guy wrote an op-ed for Time magazine arguing that a nuclear holocaust is preferable to a world that has GPT-5 in it.
