this post was submitted on 23 Aug 2023

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

[–] [email protected] 17 points 1 year ago* (last edited 1 year ago) (2 children)

Just sneering at a couple of comments, mostly the first.

This situation is best modeled by conflict theory, not mistake theory.

I thought rationalists were supposed to be strict mistake theorists (in their own terms). Seeing someone here essentially say, "Their opposition to us can't be resolved simply, just like how issues in the world are complex and not simple mistakes," when they actually believe (as any good liberal/nrx would) that any societal issue is a simple mistake to be corrected, is... weird.

Since that does not seem likely to be the sort of answer you’re looking for though, if I wanted to bridge the inferential gap with a hypothetical Sneer Clubber who genuinely cared about truth, or indeed about anything other than status (which they do not)

This is the finest copium. Pure, uncut. Yes, I'm here to "boost my status" by collecting internet points. Everyone knows my name and keeps track of how cool I am. I don't sleep in a hotel and I own triples of every classic car. Triples makes it safe.

If you think that the conventional way to approach the world is usually right, the rationalist community will seem unusually stupid. We ignore all this free wisdom lying around and try to reinvent the wheel! If the conventional wisdom is correct, then concerns about the world changing, whether due to AI or any other reason, are pointless. If they were important, conventional wisdom would already be talking about them.

Hey, don't try to position yourselves as the plucky underdog/maverick here. That's a culture war move, and you aren't allowed to do that!

/r/SneerClub users are not the sort of entities with whom you can have that conversation. You might as well ask a group of chimpanzees why they're throwing shit at you.

LW talking to us would be more like this: a group of chimpanzees is throwing shit at some LWers. The LWers ask the chimps why. The chimps explain, using everyday language and concepts, that they think the worldview of the LWers is wrong and skewed in weird directions, and that any time someone tries to explain this, the chimps are met with condescension and the accusation that they can't understand the LWers because they are chimps. So, the chimps explain, they throw shit in protest. The LWers shrug and say they can't understand what the chimps are saying, because they are chimps and chimps can't speak human language. The chimps continue to throw shit.

I think Sneer Club understands the Less Wrong worldview well enough. They just happen to reject it.

Least wrong LWer.

[–] [email protected] 18 points 1 year ago (1 children)

I think Sneer Club understands the Less Wrong worldview well enough. They just happen to reject it.

Wow, someone gets it.

[–] [email protected] 6 points 1 year ago

I heard if you post "sneerclub could be right [about $x]" over there, your LW membership card instantly catches fire and big yud kicks a future-hypothetical puppy

[–] [email protected] 16 points 1 year ago (5 children)

I would also tell them that it’s possible to actually understand things. Most people seem to go through life on rote, seemingly not recognizing when something doesn’t make sense because they don’t expect anything to make sense.

Oh my god could you sound any more flushable.

Just fuck yourself. Everybody is thinking all of the fucking time, everybody has blind spots, stop being so insufferable.

They're so obsessed with this narrative that keeps them ahead of the curve. Guess what, ex-gifted kids? Other people catch up; you're not competing with 4-year-olds anymore.

[–] [email protected] 11 points 1 year ago

I would also tell them that it’s possible to actually understand things.

The most perfect set-up to date for a joke about how the Thing Understander has logged on

[–] elmtonic 9 points 1 year ago* (last edited 1 year ago) (3 children)

https://xkcd.com/610/

I think a lot of rats have this idea that they arrived at their views and values solely by thinking really hard (and being really, really smart). Which means that anyone who doesn't share their basic views is simply a mouthbreathing NPC who doesn't have any curiosity about "the way the world works" - when in reality, people just have a lot of other shit on their minds, and tend to care about less abstract problems than [insert sci-fi trope here].

It's funny that the commenter talks so much about how people should just try to understand things, and in the same breath fails to try to empathize with people who think differently.

[–] [email protected] 11 points 1 year ago

Not to mention that even people who do think about those abstract problems can come to different conclusions than them.

A lot of the stuff at the tip of the rationalist iceberg, the obsession with intelligence and logic, the quasi-dogmatic faith in exponentially improving technology, even dipping my toes into eugenics territory and getting burned: these are all things I remember from my tween years. I have been in that headspace, thinking about all the same questions every single self-appointed child genius LessWronger is obsessed with. And yet, I came out of it all as a leftist SJW with a raging disdain for the techno-financial-industrial complex.

What happened is that I grew up, had my views challenged, met and befriended more diverse groups of people, learned about alternative views and synthesized them with my own, gaining a more nuanced understanding of issues I thought I had "solved" in my head by the age of 15. The worst thing that could have happened to me would have been convincing myself that everyone who disagrees with me is an NPC and walling myself into a bubble full of other insecure logic bros, all firmly locked in the adolescent mindset of having figured everything out forever.

[–] [email protected] 4 points 1 year ago

Aumann's Agreement Theorem says that I'm right so therefore you're wrong

[–] [email protected] 3 points 1 year ago (1 children)

I mean, it seems to me these people just copy what they think a smart person on the internet says, and there's little actual thought about things.

[–] [email protected] 7 points 1 year ago

Yeah, except instead of copying what a smart person says, they're copying Eliezer Yudkowsky.

[–] [email protected] 5 points 1 year ago

Sneerious rephrasing:

I would also tell them that the rationality cargo culting is the only way to portray understanding things,
fake it till you make it, you can solve this insecurity that I have about myself in you!
Other people are unthinking NPCs, they don't see the cracks in the matrix (and PC hellscape) like I do.

[–] [email protected] 5 points 1 year ago

Hey, don't lump us "gifted" folks in with LW~ I survived Talented & Gifted; I stayed in school, studied, and learned about the world. Yud's contention is that I should have dropped out and read sci-fi books all day.

I do read sci-fi all day, though... Maybe we're not so different...

[–] [email protected] 4 points 1 year ago

@naevaTheRat @dgerard People criticizing rationalism aren't suggesting we can't understand things, but we do often suggest we can't understand everything, or even some of the specific things rationalists claim to understand.

[–] [email protected] 16 points 1 year ago* (last edited 1 year ago)

What would you say to them, to plant the seed of changing their mindset from their current one?

making this - aka pilling - your specific goal, rather than something you would hope for as a side-effect of your sentiment, is a rookie mistake.

[–] [email protected] 12 points 1 year ago* (last edited 1 year ago)

first comment,

If the conventional wisdom is correct, Bayesianism is potentially wrong (it’s not part of the Standard Approach to Life), and [certainly useless] [...]

what was actually said:

the abandonment of interpretation in favor of a naïve approach to statistical [analysis] certainly skews the game from the outset in favor of a belief that data is intrinsically quantitative—self-evident, value neutral, and observer-independent. This [belief excludes] the possibilities of conceiving data as qualitative, co-dependently constituted. (Drucker, Johanna. 2011. “Humanities Approaches to Graphical Display.”)

the latter isn't even claiming that the bayesian approach (statistical analysis) is "useless", but that it "skews the game [...] in favor of a belief". the very framing is a misconstrual of the nature of the debate.

[–] [email protected] 8 points 1 year ago (1 children)

My god, some spectacularly bad takes on here. The top-rated one is just bananas-level stupid:

While there may be a substantial worldview gap, I suspect the much larger difference is that most Sneer Clubbers are looking to boost their status by trying to bully anyone who looks like a vulnerable target, and being different, as LessWrong is, is enough to qualify.

Yes, that's right. I spend my time boosting my status by (checks notes) commenting on the internet.

[–] [email protected] 8 points 1 year ago (1 children)

karma points on reddit are worth so much more than karma points on lesswrong

and the ones on awful systems, MY GOD

[–] [email protected] 5 points 1 year ago

what did we decide the exchange rate on definitely real karma points for a harrier jet was again?

[–] [email protected] 7 points 1 year ago

Also, lol @ this exchange:

SneerClubMod:

I mod the subreddit, I'm aware of the history of the sidebar quote
Your interpretation of why we did that is incorrect

LWer:

Then why did you do that?

SneerClubMod:

Because it’s a funny example of Yudkowsky’s persecution complex, and therefore an amusing ironic self-appellation

How do they not read this and self-reflect? Oh, that's right, they're in a cult.

[–] [email protected] 5 points 1 year ago

If the conventional wisdom is correct, Bayesianism is potentially wrong (it’s not part of the Standard Approach to Life), and certainly useless: why try to learn through probability theory when tradition can tell you everything you need to know much faster?

Oh, you say you're a Bayesian? Name all 46,656 varieties.

[–] [email protected] 3 points 9 months ago

I think Sneer Club understands the Less Wrong worldview well enough. They just happen to reject it.

One of the few comments that actually make sense to me here.

[–] [email protected] 3 points 1 year ago

When I was a teenager, I read every novel by Isaac Asimov, including those that I could only find in second-hand bookshops (A Whiff of Death, Murder at the ABA and The End of Eternity). I read most of his short fiction, too; I didn't hunt down the ephemera that had never been anthologized, but I did visit the archive at the Boston University Library and find the movie plot outline that he wrote at the request of Paul McCartney. On the nonfiction side, to mention only the thickest books: I read his Chronology of Science and Discovery in sixth grade, and I followed it up with Asimov's Chronology of the World and his two-volume guides to Shakespeare and the Bible both.

It's not that I fail to understand where LessWrong is coming from. It's that I actually grew up to become a scientist.