this post was submitted on 05 Feb 2024
117 points (89.8% liked)


The New Luddites Aren’t Backing Down::Activists are organizing to combat generative AI and other technologies—and reclaiming a misunderstood label in the process.

[–] [email protected] 3 points 9 months ago (1 children)

However, I’m not afraid of it taking my job, because someone still needs to tell it what to do.

Why couldn't it do that part too, purely based on a simple high-level objective that anyone can formulate? Which part exactly do you think is AI-resistant?

I'm not talking about today's models, but more like 5-10 years into the future.

[–] anlumo 1 points 9 months ago (1 children)

That’s what I’ve been arguing with a fellow programmer recently. Right now you have to tell these programmer LLMs what to do on a function-by-function basis, because they don’t have enough capacity to think at the project level. However, that’s exactly what can be improved by scaling the neural network up. Right now the LLMs are limited by hardware, but they’re still running on off-the-shelf GPUs that were designed for a completely different use case. The accelerators designed specifically for AI are currently in the preproduction phase, very close to being deployed in AI data centers.

[–] [email protected] 4 points 9 months ago (1 children)

Yeah, I've seen a lot of weird takes on AI. It all seems to come down to ego guarding: "it can't take my job," "it just regurgitates combinations of what it was taught, unlike me," "only humans can be creative," "who wants coffee made by a machine," "well, you still need a person to do things in the physical world," etc. It really highlights how difficult it is for people to think about change, especially a change that might not end with a place for them.

[–] anlumo 3 points 9 months ago (1 children)

The creativity argument I don't get at all. Being creative these days means taking a bunch of known ideas and mashing them up, and that's exactly what an LLM does. Very few people can really think outside the box.

I've had a few cases where it was actually the other way around. I run a lot of TTRPGs, and my storylines are always pretty bland because I'm not that creative. I've started using ChatGPT-4 to give me a few ideas for stories, and it helps me break out of that box by suggesting completely different things than what I'd have thought of.

[–] [email protected] 2 points 9 months ago

I'll argue it's always been that way. It's just that the pool of data people are pulling from these days is more homogeneous. It used to be that people had a lot more unique, personal experiences that weren't known to the world. But today everything is shared and given a label by our culture. So if you come up with an idea, it's much more likely that someone who has had similar experiences to you has already thought of it. People say there are no more new ideas. Maybe that's true in a sense, but I'd argue nothing's changed except that people know about all the ideas.