this post was submitted on 24 Feb 2025
[–] [email protected] 7 points 1 day ago (2 children)

This might be a little off-topic, but I've noticed what seems to be a trend of anti-AI discourse ignoring programmers. Protect artists, writers, animators, actors, voice-actors... programmers, who? No idea if it's because they're partly to blame, or people are simply unaware code is also stolen by AI companies—still waiting on that GitHub Copilot lawsuit—but the end result appears to be a general lack of care about GenAI in coding.

[–] [email protected] 1 points 8 hours ago

LLMs are going to make senior devs indispensable. So far, from what I've seen, they're not great at solving unusual cases; they shine most at boilerplate and generic problems.

So juniors are never going to learn to code, and then companies will have to pay for experienced people.

Juniors rarely think hard about unionizing, and seniors will have job security and therefore little motivation.

I hope devs will unionize in any case, LLMs or not, like any other specialization.

[–] [email protected] 2 points 1 day ago (3 children)

I think it's because most programmers use and appreciate the tool. This might change once programmers start to blame gen AI for not having a job anymore.

[–] [email protected] 2 points 12 hours ago* (last edited 12 hours ago)

And programmers retain complete control of the output - it's just a bit of text that you can adapt as needed. Same as looking up snippets from Stack Overflow. Programmers are used to finding a snippet, checking whether it actually works, and then adapting it to the rest of their code, so it doesn't feel like introducing media you didn't create, but like a faster version of what everyone was already doing.

[–] [email protected] 3 points 22 hours ago

I've noticed a bad trend with my colleagues who use Copilot, ChatGPT, etc. They don't just use them to write code; they also trust their generally poor design decisions.

Another thing is that those people also hate working on existing code, claiming it is complicated and offering to write their own (which also ends up complicated) version of it. I suspect it's because Copilot doesn't help as much when the code is more mature.

[–] [email protected] 6 points 1 day ago* (last edited 1 day ago) (1 children)

There remains a significant enclave that rejects it, but yeah, it's definitely smaller than equivalent groups in other mentioned professions. Hopefully things won't get that far. I think the tech is amazing, but it's an immense shame that so many of my/our peers don't give a flying fuck about ethics.

[–] [email protected] 6 points 1 day ago

There remains a significant enclave that rejects it, but yeah, it's definitely smaller than equivalent groups in other mentioned professions.

Reporting in.

I think the tech is amazing, but it's an immense shame that so many of my/our peers don't give a flying fuck about ethics.

Yup. Very much agreed here. There are some uses that are acceptable, but it's a bit hard to say that any are ethical, given the ethically bankrupt foundations of its training data.