this post was submitted on 02 Sep 2023
17 points (90.5% liked)
SneerClub
983 readers
28 users here now
Hurling ordure at the TREACLES, especially those closely related to LessWrong.
AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)
This is sneer club, not debate club. Unless it's amusing debate.
[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
founded 1 year ago
you are viewing a single comment's thread
The cool thing to note here is how badly Yud misunderstands what a normal person means when they say they have "100% certainty" in something. We're not fucking infinitely precise Bayesian machines; 100% means exactly the same thing as 99.99%. It means exactly the same thing as "really really really sure." A conversation between the two might go like this:
Eddielazer remains seated, triumphant in believing (epistemic status: 98.403% certainty) he has added something useful to the conversation. The sheeple walks away, having changed exactly nothing about his opinion.
Mr Yudkowsky is supposedly able to understand advanced maths, but doesn't know what rounding is. I think we did rounding in 3rd or 4th grade...
You might be comfortable using a single significant digit for any probabilities you pull out of your ass, but Yud's methods are free of experimental error. He does Aristotelian science, deriving everything from pure reason, the method that brought you bangers like "men have more teeth than women". In Yud's case, most of his ideas are unfalsifiable to begin with, so why not have seventeen nines' worth of certainty in them? They literally can't be false! Not even AWS would promise these kinds of SLAs!
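For scale, here's what uptime "nines" work out to in allowed downtime per year (a quick sketch; `downtime_per_year` is a made-up helper, and note that `Fraction` is used because seventeen nines is already below what a plain float can distinguish from 1.0):

```python
from fractions import Fraction

SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31_536_000, ignoring leap years

def downtime_per_year(nines: int) -> Fraction:
    """Allowed downtime (seconds/year) for an uptime SLA of `nines` nines."""
    unavailability = Fraction(1, 10 ** nines)  # e.g. 5 nines -> 1/100_000
    return SECONDS_PER_YEAR * unavailability

# A "five nines" SLA, the kind real clouds brag about:
print(float(downtime_per_year(5)))   # ~315.36 seconds (about 5 minutes) a year

# Seventeen nines of certainty:
print(float(downtime_per_year(17)))  # ~3.15e-10 seconds, a third of a nanosecond
```

For comparison, AWS's published compute SLAs top out around 99.99%; nobody sells seventeen nines.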
Big Yud was homeschooled, unlike the unwashed sheeple with their affordable public education.