this post was submitted on 04 Mar 2024
47 points (92.7% liked)

SneerClub

Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]


content warning: Zack Davis. so of course this is merely the intro to Zack's unquenchable outrage at Yudkowsky using the pronouns that someone wants to be called by

29 comments
[email protected] 7 points 8 months ago (1 children)

Quickly recapping my Whole Dumb Story so far: ever since puberty, I've had this obsessive sexual fantasy about being magically transformed into a woman, which got contextualized by these life-changing Sequences of blog posts by Eliezer Yudkowsky that taught me (amongst many other things) how fundamentally disconnected from reality my fantasy was. So it came as a huge surprise when, around 2016, the "rationalist" community that had formed around the Sequences seemingly unanimously decided that guys like me might actually be women in some unspecified metaphysical sense.

Goddamn, what an opening. I really hope this is an elaborate troll and not someone driven insane by internet transphobia and the Sequences.

Not gonna read all that, though.

[email protected] 3 points 8 months ago

I really hope this is an elaborate troll

i am sorry to tell you

[email protected] 7 points 8 months ago (12 children)

While the writer is wrong, the post itself is actually quite interesting and made me think more about epistemic luck. I think Zack correctly points out cases where I would say rationalists got epistemically lucky, although his own view of those cases seems entirely different. This quote is a good microcosm of the post:

The Times's insinuation that Scott Alexander is a racist like Charles Murray seems like a "Gettier attack": the charge is essentially correct, even though the evidence used to prosecute the charge before a jury of distracted New York Times readers is completely bogus.

A "Gettier attack" is a very interesting concept I will keep in my back pocket, but he clearly doesn't know what a Gettier problem is. With a Gettier case a belief is both true and justified, but still not knowledge because the usually solid justification fails unexpectedly. The classic example is looking at your watch and seeing it's 7:00, believing it's 7:00, and it actually is 7:00, but it isn't knowledge because the usually solid justification of "my watch tells the time" failed unexpectedly when your watch broke when it reached 7:00 the last time and has been stuck on 7:00 ever since. You got epistemically lucky.

So while this isn't a "Gettier attack", Zack did get at least a partial dose of epistemic luck. He believes the charge is true but unjustified, which is why he calls it a Gettier attack; but a Gettier case requires the belief to be justified, and here it is, so he got some epistemic luck while writing about epistemic luck. This is what a good chunk of the post feels like.

[email protected] 5 points 8 months ago (last edited 8 months ago) (1 children)

Is Zack a big cheese in the community or just some random internet person who spams people with emails?

[email protected] 5 points 8 months ago

he's one of the rationalists, firmly in the subculture, not a drive-by
