blakestacey

joined 2 years ago
[–] [email protected] 4 points 1 month ago* (last edited 1 month ago)

Throwback Thursday: Atlas Shrugged: The Cobra Commander Dialogues

(Based on blog posts now available here.)

[–] [email protected] 12 points 1 month ago (2 children)

From page 17:

Rather than encouraging critical thinking, in core EA the injunction to take unusual ideas seriously means taking one very specific set of unusual ideas seriously, and then providing increasingly convoluted philosophical justifications for why those particular ideas matter most.

ding ding ding

[–] [email protected] 9 points 1 month ago* (last edited 1 month ago) (3 children)

Abstract: This paper presents some of the initial empirical findings from a larger forthcoming study about Effective Altruism (EA). The purpose of presenting these findings disarticulated from the main study is to address a common misunderstanding in the public and academic consciousness about EA, recently pushed to the fore with the publication of EA movement co-founder Will MacAskill’s latest book, What We Owe the Future (WWOTF). Most people in the general public, media, and academia believe EA focuses on reducing global poverty through effective giving, and are struggling to understand EA’s seemingly sudden embrace of ‘longtermism’, futurism, artificial intelligence (AI), biotechnology, and ‘x-risk’ reduction. However, this agenda has been present in EA since its inception, where it was hidden in plain sight. From the very beginning, EA discourse operated on two levels, one for the general public and new recruits (focused on global poverty) and one for the core EA community (focused on the transhumanist agenda articulated by Nick Bostrom, Eliezer Yudkowsky, and others, centered on AI-safety/x-risk, now lumped under the banner of ‘longtermism’). The article’s aim is narrowly focused on presenting rich qualitative data to make legible the distinction between public-facing EA and core EA.

[–] [email protected] 18 points 1 month ago* (last edited 1 month ago) (4 children)

From the linked Andrew Molitor item:

Why Extropic insists on talking about thermodynamics at all is a mystery, especially since “thermodynamic computing” is an established term that means something quite different from what Extropic is trying to do. This is one of several red flags.

I have a feeling this is related to wanking about physics in the e/acc holy gospels. They invoke thermodynamics the way that people trying to sell you healing crystals for your chakras invoke quantum mechanics.

[–] [email protected] 11 points 1 month ago (1 children)

They take a theory that is supposed to be about updating one's beliefs in the face of new evidence, and they use it as an excuse to never change what they think.

[–] [email protected] 10 points 1 month ago (1 children)

oof

The existence of a Wikipedia page for dinosaur erotica must prove that back in the days when humans co-existed with stegosaurs, the ones who fucked them lived better.

[–] [email protected] 6 points 1 month ago

Erin go Bleagh.

[–] [email protected] 17 points 1 month ago (2 children)

"Consider it from the perspective of someone who does not exist and therefore has no preferences. Who would they pick?"

[–] [email protected] 9 points 1 month ago

The idea that formalist experimentation and deliberately pushing the boundaries of a medium are only one of several goals to which art can strive is apparently too sophisticated for Scott Adderall. He also takes a leap from "influential" to "meaningful", an elision so hackish it's trite.

[–] [email protected] 8 points 1 month ago

It's a very "steampunk (derogatory)" picture, like something I would have found on sale in the artist room of the science-fiction convention that convinced me I don't like science-fiction conventions very much.

[–] [email protected] 3 points 1 month ago (3 children)

Looks like a good start, at least.
