this post was submitted on 02 Dec 2023

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

founded 1 year ago

At various points, on Twitter, Jezos has defined effective accelerationism as “a memetic optimism virus,” “a meta-religion,” “a hypercognitive biohack,” “a form of spirituality,” and “not a cult.” ...

When he’s not tweeting about e/acc, Verdon runs Extropic, which he started in 2022. Some of his startup capital came from a side NFT business, which he started while still working at Google’s moonshot lab X. The project began as an April Fools joke, but when it started making real money, he kept going: “It's like it was meta-ironic and then became post-ironic.” ...

On Twitter, Jezos described the company as an “AI Manhattan Project” and once quipped, “If you knew what I was building, you’d try to ban it.”

[–] [email protected] 21 points 8 months ago (2 children)

the wonderful thing about this story is the effort Forbes went to to dox a nazi

[–] [email protected] 12 points 8 months ago

I spent way too much time arguing that NYT didn't dox Slatescott.

[–] [email protected] 12 points 8 months ago* (last edited 8 months ago) (1 children)

I highly suspect the voice analysis thing was just to confirm what they already knew, otherwise it would have been like looking for a needle in a haystack.

People on twitter have been speculating that someone who knew him simply ratted him out.

[–] [email protected] 14 points 8 months ago (2 children)

i mean, probably. but also, nazis are just dogshit at opsec.

[–] [email protected] 14 points 8 months ago

I still find it amusing that Siskind complained about being "doxxed" when he used his real first and middle name.

[–] [email protected] 9 points 8 months ago* (last edited 8 months ago) (2 children)

update: Verdon is now accusing another AI researcher of exposing him: https://twitter.com/GillVerd/status/1730796306535514472

[–] [email protected] 17 points 8 months ago (3 children)

posting a screenshot to preserve the cringe in its most potent form:

yeah BasedBeffJezos is just an ironic fascist persona that has nothing to do with who I am, that’s why I’m gonna threaten anyone who associates me with BasedBeffJezos

[–] [email protected] 11 points 8 months ago (1 children)

Whoever said that all twitter bluechecks talk like anime villains was spot on

[–] [email protected] 10 points 8 months ago

you might not like the consequences for exposing me as BasedEyesWHITEdragon yu-gi-boy!!!

[–] [email protected] 9 points 8 months ago

Dude doxxed protest too much.

[–] [email protected] 9 points 8 months ago

Reading his timeline since the revelation is weird and creepy. It's full of SV investors robotically pledging their money (and fealty) to his future efforts. If anyone still needs evidence that SV is a hive mind of distorted and dangerous group-think, this is it.

[–] [email protected] 20 points 8 months ago

He noted that Jezos doesn’t reflect his IRL personality. “The memetics and the sort-of bombastic personality, it's what gets algorithmically amplified,” he said, but in real life, “I’m just a gentle Canadian.”

uwu im just a smollbean canadian

[–] [email protected] 19 points 9 months ago (4 children)

Jezos has defined effective accelerationism as “a memetic optimism virus,” “a meta-religion,” “a hypercognitive biohack,” “a form of spirituality,” and “not a cult.”

“It’s like it was meta-ironic and then became post-ironic.”

Jezos described the company as an “AI Manhattan Project” and once quipped, “If you knew what I was building, you’d try to ban it.”

“Our goal is really to increase the scope and scale of civilization as measured in terms of its energy production and consumption,” he said. Of the Jezos persona, he said: “If you're going to create an ideology in the time of social media, you’ve got to engineer it to be viral.”

Guillaume “BasedBeffJezos” Verdon appears, by all accounts, to be an utterly insufferable shithead with no redeeming qualities

[–] [email protected] 23 points 9 months ago* (last edited 9 months ago)

“Our goal is really to increase the scope and scale of civilization as measured in terms of its energy production and consumption,” he said.

old and busted: paperclip maximizer

new hotness: entropy maximizer

[–] [email protected] 17 points 9 months ago* (last edited 9 months ago)

you’ve got to engineer it to be viral.

All this attention including the whole Andreessen thing and he doesn't go above 50k followers. As far as virality goes that is pretty bad.

Also from his twitter: "We literally became a sufficient threat to the system that they felt compelled to attempt to neutralize me.

The thing is, I am a man of belief. You can take everything from me. I don't care. I am going to keep going until I'm dead.

You cannot stop acceleration."

Sure buddy the system did that. Keep riding the wave, in five years we will all gather on a steep hill in Las Vegas and look west.

[–] [email protected] 14 points 9 months ago* (last edited 9 months ago) (2 children)

imagine huffing your own farts this hard cause you came up with BasedBeffJezos and posted garbage on Twitter so out of touch that every monstrous billionaire instantly agreed with it

[–] [email protected] 16 points 9 months ago (1 children)

oh also, meta-ironic cult is also a term used by Remilia and some of the other Thiel death cults to describe themselves

[–] [email protected] 11 points 9 months ago

"to any feds reading my feed: jk jk"

[–] [email protected] 9 points 9 months ago

A fart-huff that hard qualifies as an Alvistime miracle.

[–] [email protected] 9 points 8 months ago (1 children)

“As measured in terms of its energy production and consumption”

That’s so extremely fucking insane, jesus. We are already dealing with those issues at the “low” end and they want to fucking accelerate it? Christ these people are the fucking worst

[–] [email protected] 10 points 8 months ago (3 children)

I had kind of the same thought. Woah, maximize long term energy production??? How novel, let's get our best people right on that, thanks for mentioning it, gosh it didn't occur to anyone.

I wonder when it finally occurs to them that the monetary system is literally a proxy for energy production and consumption, and their entire philosophy might as well read: "make more $$$." I'll have to ask the stupid question again, what material difference is there between e/acc, ea, and delusion?

[–] [email protected] 10 points 8 months ago

There's a subtype of goldbugs that want "hard money" to be represented by something more universal than gold, like energy. It's why they convince themselves Bitcoin is worth something. Maybe this joker is one of them.

[–] [email protected] 16 points 9 months ago (1 children)

In its reaction against both EA and AI safety advocates, e/acc also explicitly pays tribute to another longtime Silicon Valley idea. “This is very traditional libertarian right-wing hostility to regulation," said Benjamin Noys, a professor of critical theory at the University of Chichester and scholar of accelerationism. Jezos calls it the “libertarian e/acc path.”

At least the Italian futurists were up front about their agenda.

“We’re trying to solve culture by engineering,” Verdon said. “When you're an entrepreneur, you engineer ways to incentivize certain behaviors via gradients and reward, and you can program a civilizational system."

Reading Nudge to engineer the 'Volksschädling' to board the trains voluntarily. Dusting off the old state eugenics compensation programs.

[–] [email protected] 15 points 8 months ago (2 children)

The fuck do they mean "solve culture"? Is culture a problem to be solved? Actually don't answer that.

[–] [email protected] 10 points 8 months ago (2 children)

even more horrifying — they see culture as a system of equations they can use AI to generate solutions for, and the correct set of solutions will give them absolute control over culture. they apply this to all aspects of society. these assholes didn’t understand hitchhiker’s guide to the galaxy or any of the other sci fi they cribbed these ideas from, and it shows

[–] [email protected] 10 points 8 months ago (1 children)

It's like pickup artistry on a societal scale.

It really does illustrate the way they see culture not as, like, a beautiful evolving dynamic system that makes life worth living, but instead as a stupid game to be won or a nuisance getting in the way of their world domination efforts

[–] [email protected] 8 points 8 months ago (1 children)

remember that Yudkowsky's CEV idea was literally to analytically solve ethics

[–] [email protected] 8 points 8 months ago (3 children)

In an essay that somehow manages to offhandedly mention both evolutionary psychology and hentai anime in the same paragraph.

[–] [email protected] 7 points 8 months ago* (last edited 8 months ago) (1 children)

It's like when he wore a fedora and started talking about 4chan greentexts in his first major interview. He just cannot help himself.

P.S. The New York Times recently listed "internet philosopher" Eliezer Yudkowsky as one of the major figures in the modern AI movement; this is the picture they chose to use.

[–] [email protected] 8 points 8 months ago (2 children)

this is the picture they chose to use.

You may not like it, but this is what peak rationality looks like.

[–] [email protected] 9 points 8 months ago (1 children)

The ultimate STEMlord misunderstanding of culture; something absolutely rife in the Silicon Valley tech-sphere.

[–] [email protected] 10 points 8 months ago (2 children)

These dudes wouldn't recognize culture if it unsafed its Browning and shot them in the kneecaps.

[–] [email protected] 9 points 8 months ago (1 children)

Don’t have to have Culture War when you can just systemically deploy the exact culture you want right from the comfort of your prompt, amirite?!

(This is a shitpost idea but it’s probably halfway accurate, maybe modulo the prompt (but there will definitely be someone also trying that))

[–] [email protected] 8 points 8 months ago (1 children)
[–] [email protected] 8 points 8 months ago* (last edited 8 months ago)

The best use case for Urbit is marking its proponents as first up against the wall when the revolution comes.

[–] [email protected] 16 points 8 months ago (3 children)

HN discovers this article, almost a day later (laggards): https://news.ycombinator.com/item?id=38500192

A voice analysis conducted by Catalin Grigoras, Director of the National Center for Media Forensics, compared audio recordings of Jezos and talks given by Verdon

A particularly creepy doxxing by Forbes...

Oh no, are the tools developed by SV startups being used for stuff you don't like? How sad HN.

[–] [email protected] 13 points 8 months ago (1 children)

I don't want to libel the author by claiming the piece was planted by Beff himself, but what's more likely, this writer who mostly covers what's trending on tiktok gets a big scoop using voice recognition on a twitter space and then writes a glowing dossier? Or beff got on the phone with a publicist and conjured a big reveal with a softball interview all while namedropping his new startup?

extremely funny that they think this article makes him look good. also extremely funny that they think this is a big scoop

[–] [email protected] 8 points 8 months ago

@sc_griffith @gerikson
Buried in that piece is the probable typo but certainly pointed "On X, the platform formally known as Twitter"

[–] [email protected] 7 points 8 months ago

Oh no, are the tools developed by SV startups being used for stuff you don't like? How sad HN.

tormenting the torment nexus. heh, how the turntables

[–] [email protected] 15 points 8 months ago (1 children)

former Google engineer.

Of course. At this point whenever I read something with the phrase former Google engineer I'm just gonna assume they're doing something terrible.

[–] [email protected] 7 points 8 months ago (2 children)

q: how do you know if someone's a former google engineer?

xooglers everywhere, a: AT GOOGLE WE USED TO HAVE A WAY TO...

[–] [email protected] 8 points 8 months ago (1 children)

Ah fuck as a xoogler I do this. Everything I do is terrible (see my advent of code snippets) and I frequently refer to things that existed in the google ecosystem.

[–] [email protected] 8 points 8 months ago (3 children)

@swlabr As a ten year veteran of the SRE mines I’ve always tried really hard not to do this, but I did once leave a job partly as a result of the CTO justifying a decision with “But it says here in the SRE book that that’s the way they do this at Google!” and completely ignoring my protestations that god no, that certainly wasn’t how we did it at least in my bit of SRE.

[–] [email protected] 7 points 8 months ago (1 children)

Oh GOD that is kafkaesque. It's been too few jobs since leaving the G for that to happen to me yet, but I'm sure I'll get there one day.

[–] [email protected] 15 points 9 months ago

my "not a cult" T-shirt has raised many questions, etc.

[–] [email protected] 11 points 8 months ago

That not a cult quote strategically placed after all the cultish babble quotes mwah perfect journalism

[–] rip_art_bell 9 points 8 months ago

Nerd rapture goofiness
