this post was submitted on 09 Sep 2024
36 points (100.0% liked)
TechTakes
Honestly I'm kind of reminded of some of the philosophy around semiotics and authorship. Like, when reading a story part of the interpretation comes from constructing a mental image of the author talking to a mental image of the audience, and the way those mental images get created can color the interpretation and how we read and understand the text.
In that sense, the tendency to construct a mental image of a person talking through ChatGPT or Eliza makes much more sense. I've been following the Alex Jones interviews of ChatGPT, and the illusion is much weaker when listening to the conversation than when it's mediated through text, which is probably a good sign for those of us who like actual people. Even when interactive, chatting through text is impersonal enough that it's easy to fill in all the extra humanity, though as we see from Alex himself in those interviews, it is definitely not impossible to get fooled through other media.
But that's at the ground level of interaction, and it's probably noteworthy that the press releases for all these policies are not being written by a bot. This tendency to fill in a human being definitely lines up with the tech-authoritarian tendency, which OP has discussed elsewhere, to dehumanize both their victims and, more significantly, themselves. I think the way they talk about themselves and the people who work on their "side" is, if anything, more alarming than the way they talk about their victims.