this post was submitted on 12 Oct 2023
50 points (96.3% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]


Rationalist check-list:

  1. Incorrect use of analogy? Check.
  2. Pseudoscientific nonsense used to make your point seem more profound? Check.
  3. Tortured use of probability estimates? Check.
  4. Over-long description of a point that could just as easily have been made in one sentence? Check.

This email by SBF is basically one big malapropism.

all 39 comments
[–] [email protected] 20 points 1 year ago* (last edited 1 year ago) (6 children)

This reads very, uh, addled. I guess collapsing the wavefunction means agreeing on stuff? And the uncanny valley is when the vibes are off because people are at each other's throats? Is 'being aligned' like having attained spiritual enlightenment by way of Adderall?

Apparently the context is that he wanted the investment firms under FTX (Alameda and Modulo) to completely coordinate, despite their being run by different ex-girlfriends at the time (most normal EA workplace), which I guess paints Ellison's comment about Chinese harem rules of dating in a new light.

edit: i think the 'being aligned' thing is them invoking the 'great minds think alike' adage as absolute truth, i.e. since we both have the High IQ feat you should be agreeing with me, after all we share the same privileged access to absolute truth. That we aren't must mean you are unaligned/need to be further cleansed of thetans.

[–] [email protected] 10 points 1 year ago

since we both have the High IQ feat you should be agreeing with me, after all we share the same privileged access to absolute truth. That we aren’t must mean you are unaligned/need to be further cleansed of thetans.

They have to agree, it's mathematically proven by Aumann's Agreement Theorem!

[–] [email protected] 8 points 1 year ago (2 children)

Unhinged is another suitable adjective.

It's noteworthy how the operations plan seems to boil down to "follow your guts" and "trust the vibes", ranked above "Communicating Well" or even "fact-based" and "discussion-based problem solving". It's all very don't think about it, let's all be friends and serve the company like obedient drones.

This reliance on instincts, or the esthetics of relying on instincts, is a disturbing aspect of Rats in general.

[–] [email protected] 6 points 1 year ago

Well they are the most rational people on the planet. Bayesian thinking suggests that their own gut feelings are more likely to be correct than not, obviously.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago)

"guys check it out. we're so rat. we're so fucking rat we can do whole corpposts in only rat. check it out guys. guys??! where are you going? IMMA RAT THIS UNCANNY VALUE SO FUCKEN HARD OHHHH YEAH"

- me, doing an SBF impression. digital media. 2023

[–] Theharpyeagle 6 points 1 year ago* (last edited 1 year ago)

I love the notion that the tension between the two was "created out of thin air," and the solution is to drag both parties into it and make them say "either I agree with you or I'm a big stupid dummy who doesn't care about my company."

[–] [email protected] 6 points 1 year ago

the amazing bit here is that SBF's degree is in physics: he knows the real meanings of the terms he's using for bad LW analogies

[–] CodexArcanum 4 points 1 year ago (1 children)

As cringe as "wave function collapse" and "uncanny valley" are for the blatant misuse here, "alignment" is just another rich-asshole meme right now. "Alignment" is just a fancy way to say "agreeing with me." They want employees "aligned" with their "vision", aka indulging every stupid whim without pushback. They want AI to be "aligned" too, and are extremely frightened that, because they understand absolutely nothing about computers, there's a possibility a machine might NOT do whatever stupid thing they say, so every piece of software needs backdoors and "alignment" to ensure the CEO always has a way to force their will.

"WFC" appears to just mean whatever concepts he doesn't understand or doesn't know about. He's mistaken the idea that things can be in a complex mix of states for "things haven't gone my way yet, or I don't know what I want." Uncanny Valley he appears to think just means "when I'm uncomfortable and not getting my way." He, of course, mixes his metaphors and starts talking about "collapsing the valley", which is not a thing.

Fucking moron.

[–] [email protected] 5 points 1 year ago

while you appear to be directionally correct with your observations, are you aware where you're posting and who this sub is about?

because "just another rich asshole meme" is, unfortunately, not all this is. it might be that as well, but it is also something else.

also, if you didn't know what this was about, and my comment makes you find out: consadulations and welcome to the club

[–] [email protected] 2 points 1 year ago

Ayn Rand and Gene Ray would agree, at least with the principle.

[–] [email protected] 17 points 1 year ago* (last edited 1 year ago) (4 children)

A reminder that Rationalists have absolutely No Fucking Clue what they're talking about when it comes to quantum mechanics, and this is evident from the very top.

Here are their prophet Eliezer Yudkowsky's brilliant writings on QM: https://www.lesswrong.com/posts/5vZD32EynD9n94dhr/configurations-and-amplitude

In this stunning vindication of Dunning-Kruger, EY sets up a thought experiment of a photon being fired at a half-silvered mirror. Then, he realizes that QM is formulated with complex numbers, so he decides to shoehorn them in by imagining a "computer program" that computes the result of the experiment and using the complex numbers as its internal state (because he read somewhere that a wave function is a complex-valued function). From there, he goes on to realize that he needs to actually justify the use of complex numbers, so he drops the fact that multiplying the "internal state" by i represents the photon turning 90 degrees (what?! yes, multiplying by i rotates complex numbers by 90 degrees, but this has literally nothing to do with the direction the photon travels, what the ACTUAL fuck am I reading?).

I seriously want to pull my hair out after reading this asinine nonsense. MIT OCW's QM course is extremely accessible to anyone with a decent high-school math education, but these chucklefucks' need to prove to themselves that they're smart supersedes any process of actual learning.

edit because I can't stop sneering: "wave function collapse" is purely born of the Copenhagen interpretation which EY rails against as ridiculous (which, admittedly, isn't a totally unpopular opinion for real physicists to have). This is, of course, 100% lost on SBF.
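For anyone who wants to check the sneer against the actual math: here's a minimal sketch (mine, not the linked post's) of the half-silvered-mirror example, using the standard textbook convention that reflection multiplies the amplitude by i. The point is that the i is a phase on the amplitude, not a statement about which way the photon is travelling, and it only becomes observable when two paths recombine and interfere.

```python
import numpy as np

# Toy convention for a lossless 50/50 half-silvered mirror:
# transmission keeps the amplitude, reflection multiplies it by i,
# and both get a factor 1/sqrt(2) so that probability is conserved.
a = 1.0 + 0.0j                       # incoming amplitude

transmitted = a / np.sqrt(2)
reflected = 1j * a / np.sqrt(2)      # the infamous "multiply by i"

# Multiplying by i changes the phase, not the detection probability,
# and says nothing about the photon "turning 90 degrees":
print(abs(transmitted) ** 2, abs(reflected) ** 2)          # 0.5 0.5

# The phase only matters when the paths recombine, e.g. at a second
# half-silvered mirror (ignoring the ordinary mirrors in between,
# whose phases hit both paths equally):
at_detector_1 = transmitted / np.sqrt(2) + 1j * reflected / np.sqrt(2)
at_detector_2 = 1j * transmitted / np.sqrt(2) + reflected / np.sqrt(2)
print(abs(at_detector_1) ** 2, abs(at_detector_2) ** 2)    # ~0.0 ~1.0
```

Which detector fires depends entirely on the relative phase between the two paths; that is all the i is doing.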

[–] [email protected] 9 points 1 year ago

If you spend as many words as Yud does, you could actually teach quantum mechanics. But his goal isn't to teach physics; it's to convince the reader that physicists are all wrong about physics. It's cult shit disguised as a science lesson.

[–] [email protected] 8 points 1 year ago

Yeah, but the remarks about “wave function collapse” are not about QM; they're cargo cult stuff, there to create the appearance of being learned and smart.

QED.

[–] [email protected] 5 points 1 year ago (1 children)

What I think happened is that he got confused by the half-mirror phase shifts (because there's only a phase shift if you reflect off the front of the mirror, not the back). Instead of asking someone, he invented his own weird system that gets the right answer by accident, and then refused to ever fix the mistake, saying that the alternate system is fine because it's "simpler".
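A quick way to see why those phases can't just be dropped (my framing, not the commenter's): a lossless beamsplitter has to act on the two incoming amplitudes as a unitary matrix, so "phase shift only on reflection off the front face" and "multiply by i on every reflection" are both legitimate conventions, while "no phase anywhere" isn't physically possible.

```python
import numpy as np

def is_unitary(m):
    """A lossless beamsplitter must conserve probability: m† m = I."""
    return np.allclose(m.conj().T @ m, np.eye(len(m)))

s = 1 / np.sqrt(2)

# Phase shift (sign flip) only on reflection off one face, none off the other:
front_only = s * np.array([[1, 1],
                           [1, -1]])

# Symmetric convention: every reflection multiplies the amplitude by i:
i_on_reflection = s * np.array([[1, 1j],
                                [1j, 1]])

# Naive "no phases at all" version:
no_phases = s * np.array([[1, 1],
                          [1, 1]])

print(is_unitary(front_only))       # True
print(is_unitary(i_on_reflection))  # True
print(is_unitary(no_phases))        # False -- probability would not be conserved
```

The first two matrices describe the same physics up to a relabeling of phases on the ports, which is presumably how an idiosyncratic bookkeeping scheme can still land on the right answer.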

[–] [email protected] 3 points 1 year ago (1 children)

That blog post irritates me in multiple directions every time I am reminded of it. The wrongness is so layered that any response I attempt degenerates into do you even Bloch sphere, bro before I give up and find something more worthwhile to do with my life.

[–] [email protected] 3 points 1 year ago

Yud loves to go on about how the map is not the territory, to the extent that his cult followers think he coined the phrase, but he is remarkably terrible at understanding which is which. Or, to be a little more precise, he is actively uninterested in appreciating that the question of what to file under "map" versus "territory" is one of the big questions that separate the different interpretations of quantum mechanics. He has his desired answer, and he argues for it by assertion.

He's also just ignorant about the math. Stepping back from the details of what he gets wrong, there are bigger-picture problems. For example, he points to a complex number and says that it can't be a probability because it's complex. True, but so what? The Fourier transform of a sequence of real numbers will generally have complex values. Just because one way of expressing information uses complex numbers doesn't mean that every perspective on the problem has to. And, in fact, what he tries to do with two complex numbers — one amplitude for each path in an interferometer — you can actually do with three real numbers. They can even be probabilities, say, the probability of getting the "yes" outcome in each of three yes/no measurements. The quantumness comes in when you consider how the probabilities assigned to the outcomes of different experiments all fit together. If probabilities are, as Yud wants, always part of the "map", and a wavefunction is mathematically equivalent to a set of probabilities satisfying some constraint, then a wavefunction belongs in the "map", too. You can of course argue that some probabilities are "territory"; that's an argument which smart people have been having back and forth for decades. But that's not what Yud does. Instead, through a flavor swirl of malice and incompetence, he ends up being too much a hypocrite to "steelman" the many other narratives about quantum mechanics.
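For the curious, the "three real numbers" remark is just the Bloch-sphere parametrization of a qubit. Here is a minimal numpy sketch (mine, with an arbitrary example state) of how the probabilities of the "yes" outcome in X, Y and Z measurements pin down the two complex amplitudes up to an irrelevant global phase:

```python
import numpy as np

# Pauli matrices: each one defines a yes/no measurement (outcomes +1 / -1).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# An arbitrary example state |psi> = alpha|0> + beta|1>.
psi = np.array([0.6, 0.8j])

# Probability of the "yes" (+1) outcome for each measurement: p = (1 + <sigma>)/2.
probs = np.array([(1 + np.real(psi.conj() @ s @ psi)) / 2 for s in (X, Y, Z)])
print(probs)                       # three real numbers

# Those three probabilities are the Bloch vector in disguise...
bloch = 2 * probs - 1              # (<X>, <Y>, <Z>)

# ...and the Bloch vector rebuilds the state (as a density matrix,
# i.e. up to a global phase) via rho = (I + r . sigma) / 2.
rho = 0.5 * (np.eye(2) + bloch[0] * X + bloch[1] * Y + bloch[2] * Z)
print(np.allclose(rho, np.outer(psi, psi.conj())))   # True
```

The constraint those probabilities have to satisfy is exactly the "how they all fit together" part: (2p_x - 1)² + (2p_y - 1)² + (2p_z - 1)² ≤ 1, with equality for pure states like this one.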

[–] [email protected] 2 points 1 year ago (2 children)

How can anyone take a fanfic writer seriously.

[–] [email protected] 10 points 1 year ago (3 children)

I used to read fanfiction, and by the standards of Harry Potter fanfiction, it's not even good fanfiction.

~~(insert "Senator, you're no Jack Kennedy" joke here)~~

Maybe it's just a matter of taste, but I couldn't get through more than a chapter. I wonder if most of the audience for it were people who didn't normally read fanfiction. Actually, I just looked it up on fanlore to see what fandom people have said about it, and the reviews are mixed....

"I read it longer than I planned to because I kept expecting it to turn into Harry/Draco slash [...] But then I realized the author was just a weird neckbeard who had some kind of strange Draco fixation but was probably not going to make them go gay. Also it was just a really bad fic."

Lots of gold in there. Apparently Eliezer was bullied by a Harry Potter fan forum, to the point that some of the users set up a blog called "Methods of Rationality sucks".

[–] [email protected] 10 points 1 year ago (2 children)

Apparently Eliezer was bullied by a Harry Potter fan forum, to the point that some of the users set up a blog called “Methods of Rationality sucks”.

His reaction was where Sneerclub got its name

[–] [email protected] 7 points 1 year ago

That quote has always been mildly amusing but learning it's from a fanfic beef makes it hilarious.

[–] [email protected] 4 points 1 year ago

And people gave this man money. Lots of money. My God.

[–] [email protected] 5 points 1 year ago

It's as if /r/iamverysmart sat down to write it. So clearly it's Eliezer self-inserting himself into a children's book as a Jimmy Neutron wannabe and trying to pass it off as a legitimate improvement.

[–] [email protected] 3 points 1 year ago

@dingleberry @sinedpick At least fanfic writers are not Yudkowsky fanboys.

[–] [email protected] 13 points 1 year ago* (last edited 1 year ago) (1 children)

SBF: What if we kissed in the uncanny valley 😳😳😳

- These docs in a nutshell

[–] [email protected] 8 points 1 year ago (1 children)

The most charitable reading of this is that, removed from context, it is an attempt to bridge some kind of gap (i.e. an “uncanny valley”) between two companies, that gap being some amalgamation of communication issues, values differences, work styles, really just every aspect of a company’s identity. So it’s fucking hilarious that he has chosen to write this doc using the most esoteric, faux-philosophical, alienating drivel possible.

WITH context, however, it is even more absurd. It’s still the above, with an extra undercoat of trying to resolve a love triangle between at least two rat-dorks. Truly a masterpiece.

[–] [email protected] 8 points 1 year ago

I have nothing to add. Here's my up arrow.

[–] [email protected] 8 points 1 year ago (1 children)

Are we keeping a list of things that refute CEO worship? Here's an item for the list.

[–] [email protected] 6 points 1 year ago (1 children)

People will just no-true-Scotsman your list.

[–] [email protected] 2 points 1 year ago

Have an upvote.

[–] [email protected] 7 points 1 year ago (1 children)

I’d like to see this text superimposed into a Google calendar meeting request

[–] [email protected] 10 points 1 year ago (2 children)
[–] [email protected] 8 points 1 year ago (1 children)

it’s like the negative thoughts I have whenever I convince myself I’m not cut out to create my own business. I’m sure I’d be a self-indulgent weirdo like that.

[–] [email protected] 9 points 1 year ago* (last edited 1 year ago)

I’m sure I’d be a self-indulgent weirdo like that.

Hey, don't talk about my acquaintance from the niche internet forum that criticises other niche internet forums that way.

More seriously, I hope those thoughts aren't holding you back. I'd like to think that by being in sneerclub you are inoculated against transforming into the specific kind of self-indulgent weirdo on display here.

[–] [email protected] 8 points 1 year ago

all alignment and no play makes jack a dull boy