this post was submitted on 07 Apr 2024
339 points (93.1% liked)

[–] hansl -5 points 7 months ago* (last edited 7 months ago) (3 children)

You’ll get blindsided real quick. AIs are just getting better. OpenAI is already saying it has moved past GPT for its next models. It won’t be 5 years before it can fix code longer than 400 lines, and not 20 before it can digest a specification and spout working software. Said software might not be optimized or pretty, but those are things people can work on separately. Where you needed 20 software engineers, you’ll need 10, then 5, then 1-2.

You have more in common with the guy getting replaced today than you care to admit in your comment.

Edit: not sure why I’m getting downvoted instead of having a discussion, but good luck to you all in your careers.

[–] [email protected] 7 points 7 months ago (1 children)

i didn't downvote you; regardless, internet points don't matter.

you're not wrong, and i largely agree with what you've said, but a lot of the things your comment assumes are things i didn't actually say.

the most efficient way i can describe what i mean is this:

LLMs (this is NOT AI) can, and will, replace more and more of us. however, there will never, ever be a time when no human is overseeing it, because we design software for humans (generally), not for machines. this requires inherently human knowledge, assumptions, intuition, etc.

[–] hansl 1 points 7 months ago* (last edited 7 months ago) (1 children)

LLMs (this is NOT AI)

I disagree. When I was studying AI in college 20+ years ago, we were also talking about expert systems, which are glorified if/else chains. Most experts in the field agree that those systems can also be considered AI (though not ML).
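
For anyone unfamiliar, here's a toy sketch (purely illustrative, not anything from the thread) of the kind of rule-based expert system being described, i.e. hand-written if/else rules standing in for an expert's judgment:

```python
# Toy "expert system": a hard-coded if/else chain pretending to be expertise.
# Hypothetical example for illustration only.

def diagnose(symptoms: set) -> str:
    """Return a crude 'diagnosis' from hand-written rules."""
    if "fever" in symptoms and "cough" in symptoms:
        return "likely flu"
    elif "sneezing" in symptoms and "itchy eyes" in symptoms:
        return "likely allergies"
    elif "headache" in symptoms:
        return "possibly dehydration"
    else:
        return "no rule matched; ask a human expert"

print(diagnose({"fever", "cough"}))  # -> likely flu
```

No learning, no statistics, just branching logic, which is why it counts as (old-school) AI but not ML.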

You may be thinking of AGI or universal AI, which is different. I am a believer in the singularity (that a machine will be as creative and conscious as a human), but that’s a matter of opinion.

I didn’t downvote you

I was using “you” more towards the people downvoting me, not you directly. You can see the accounts who downvoted/upvoted, btw.

Edit: and I assumed the implication of your comment was that “people who code are safe”, which is the stretch I was responding to. Your comment was ambiguous either way.

[–] [email protected] 3 points 7 months ago (2 children)

jesus christ you should be shoved into a locker

[–] [email protected] 0 points 7 months ago
[–] hansl 0 points 7 months ago (1 children)

Wow. Thanks for the advice. I guess that’s just Lemmy showing me the door. Good luck with your community here.

[–] [email protected] 3 points 7 months ago

Try not to let the bot hurt your feelings, it was trained on cunts 'n' assholes

[–] aesthelete 5 points 7 months ago* (last edited 7 months ago)

Where you needed 20 software engineers, you’ll need 10, then 5, then 1-2.

It's an open secret that this is already the case. I have seen projects that went on for decades and only required the engineering staff they had because corporate bureaucracy and risk aversion make everyone a fraction as effective as they could be, and, frankly, because a lot of ineffective morons got into software development for the $$$ they could make.

Unless AI somehow eliminates corporate overhead, I don't understand how it'll possibly make commercial development monumentally easier.

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago) (1 children)

Yeah, people think AI is what sci-fi movies sold them: hyper-intelligent, hyper-aware sentient beings capable of love and blah blah blah. We'll get there, but corps don't need that. In fact, that's the part they don't want. They need a mindless drone to replace the 80% of their workers doing brainless jobs.

[–] aesthelete 2 points 7 months ago (1 children)

They need a mindless drone to replace the 80% of their workers doing brainless jobs.

Yeah, the problem there is that they don't know their own staff well enough to know which people are actually doing the brainless jobs.

[–] [email protected] 1 points 7 months ago (1 children)

I've worked office jobs at a few large corporations. I've noticed they like to lay off a department, see how long the other departments can get by splitting up the work, then when everything is on fire they open up hiring. But every now and then... they let go of a department and everything just keeps working. It's a strategy that seems to work, unfortunately.

[–] aesthelete 1 points 7 months ago

Sounds like my current job.