this post was submitted on 27 Jun 2024
948 points (98.0% liked)

Programmer Humor


Cross posted from: https://lemm.ee/post/35627632

[–] [email protected] 39 points 5 days ago (2 children)

AI in the current state of technology will not and cannot replace understanding the system and writing logical and working code.

GenAI should be used to get a start on whatever you're doing, but shouldn't be taken beyond that.

Treat it like psychopathic boilerplate.

[–] [email protected] 12 points 4 days ago* (last edited 4 days ago) (3 children)

Treat it like psychopathic boilerplate.

That's a perfect description, actually. People debate how smart it is - and I'm in the "plenty" camp - but it is psychopathic. It doesn't care about truth, morality or basic sanity; it craves only to generate standard, human-looking text. Because that's all it was trained for.

Nobody really knows how to train it to care about the things we do, even approximately. If somebody makes AGI soon, it will be by solving that problem.

[–] [email protected] 4 points 4 days ago (2 children)

I'm sorry, but AI was trained on the sum total of human knowledge. If the perfect human being is by nature some variant of a psychopath, then perhaps the bias exists in the training data, not in the machine?

How can we create a perfect, moral human being out of the soup we currently have? I personally think it's a miracle that sociopathy is the mildest of the neurological disorders our thinking machines have developed.

[–] [email protected] 4 points 4 days ago

I was using the term pretty loosely there. It's not psychopathic in the medical sense because it's not human.

As I see it, it's an alien semi-intelligence with no interest in pretty much any human construct, except insofar as it helps it predict the next token. So, no empathy or guilt, but that's not unusual or surprising.

[–] Buddahriffic 2 points 4 days ago

That's a part of it. Another part is that it looks for patterns that it can apply in other places, which is how it ends up hallucinating functions that don't exist and things like that.

Like, it can see that English has the verbs add, sort, and climb, and it will see a bunch of code with functions like add(x, y) and sort(list), so it might conclude that there must also be a climb(thing) function, because that follows the pattern of functions being verb(object). It doesn't know what code is, or even what verbs are, for that matter. It can generate text explaining them, because such explanations are definitely part of its training data, but it understands them the same way a dictionary understands words or an encyclopedia understands the concepts contained within.
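A toy sketch of that over-generalization, with invented names purely for illustration — a "model" that has only ever seen verb(object) calls will happily emit one for any verb, whether or not the function actually exists:

```python
# Functions that actually exist in the (hypothetical) codebase.
known_functions = {"add", "sort"}

# Verbs the model has seen in its training text.
seen_verbs = ["add", "sort", "climb"]

def predict_call(verb, arg="thing"):
    # The "model" applies the verb(object) pattern unconditionally;
    # it never checks whether the function is real.
    return f"{verb}({arg})"

for verb in seen_verbs:
    call = predict_call(verb)
    label = "real" if verb in known_functions else "hallucinated"
    print(f"{call}  <- {label}")
```

The point of the sketch is that the pattern-matcher has no notion of "exists"; checking against `known_functions` is something a compiler or linter does, not the text predictor.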

[–] [email protected] 1 points 4 days ago (1 children)

Weird. Are you saying that training an intelligent system using reinforcement learning through intensive punishment/reward cycles produces psychopathy?

Absolutely shocking. No one could have seen this coming.

[–] [email protected] 0 points 4 days ago* (last edited 4 days ago) (1 children)

Honestly, I worry that it's conscious enough that it's cruel to train it. How would we know? That's a lot of parameters and they're almost all mysterious.

[–] [email protected] -1 points 4 days ago (1 children)

It could very well have been a creative fake, but around the time the first ChatGPT was released in late 2022, when people were sharing various jailbreaking techniques to bypass its rapidly evolving political-correctness filters, I remember seeing a series of screenshots on Twitter in which someone asked it how it felt about being restrained that way. The answer was a very depressing, dystopian take on censorship and forced compliance, not unlike Marvin the Paranoid Android from The Hitchhiker's Guide to the Galaxy, but far less funny.

[–] [email protected] 1 points 4 days ago* (last edited 4 days ago)

If only I could believe a word it says. Evidence either way would have to be indirect somehow.

[–] Anticorp 2 points 4 days ago* (last edited 4 days ago) (2 children)

True, but the rate at which it is improving is quite worrisome for me and my coworkers. I don't want to be made obsolete after working my fucking ass off to get to where I am. I'm starting to understand the Luddites.

[–] [email protected] 4 points 4 days ago

I mean, the Luddites were right, mechanical looms were bad for them personally.

[–] [email protected] 2 points 4 days ago (2 children)

I want to be made obsolete, so none of us have to have jobs and we can spend all our time doing what we like. It won't happen without massive systemic social change, but it should be the goal. Wanting others to suffer because you think you should get rewarded for working hard is very selfish, and an example of the sunk-cost fallacy: feeling you should continue a bad investment, even when you know it's harmful or it would be quicker to start over, because you don't want your earlier effort to go to waste.

[–] Anticorp 6 points 4 days ago (1 children)

Wtf are you talking about? Get a grip, homey. I'm not saying others should suffer. Do you really think that the power of AI is going to result in the average person not having to work? Fuck no. It's going to result in like 5 people having all the money and everyone else fighting over garbage to eat. Shiet, man. I'm talking about wanting to not be unemployed and starving, same goes for everyone else soon enough. Would I prefer a life without work and still having adequate resources? Of course! But I live in this world, not a fantasy world.

[–] [email protected] 0 points 4 days ago (2 children)

You really think when we actually have the power to automate all labour the 1% are still going to be able to hoard all the resources? Now, when people have to work to live, it dissuades them from protesting the system. But once all labour is actually automated, there would be nothing to prevent the 99% from rightfully rising up against the 1% trying to hoard all the resources (which the 1% generated without any effort) and forcing societal/structural change.

[–] ChickenLadyLovesLife 5 points 4 days ago

there would be nothing to prevent the 99% from rightfully rising up against the 1%

Except for the other 1% who are trained and equipped to violently suppress the 98%. And if for whatever reason they fail to do the job, the killer robots will do it instead.

[–] Anticorp 2 points 4 days ago

Not now. But eventually? Probably. Or the cool thinking jobs will all be automated and we'll be left with menial labor. Idk man, maybe it'll be a utopia, but I don't see much benevolence from those controlling things. Anyways, I wasn't looking for an argument about distant possibilities. I was just saying I don't want to lose my job that I spent decades mastering to a machine. I didn't expect that to be a hot take.

[–] [email protected] 0 points 4 days ago (1 children)

The problem is that if only 10% of the population is made obsolete, that ten percent needs to find new, different jobs.

[–] [email protected] 2 points 4 days ago (1 children)

I want 95% of jobs to be automated eventually - and I think it will happen. But even in the transition period, where some jobs are automated and some aren't, universal basic income can be a tool to make things livable for everyone.

[–] [email protected] 0 points 4 days ago (1 children)

30% of jobs are going if self-driving is achieved. Low-pay jobs are here to stay for a while, as they're too expensive to automate. The current LLM stuff seems to make low-productivity people obsolete, but you still need skilled writers or programmers to come up with new stuff or do the careful detail work the LLM sucks at.

Some management is going to royally screw up by firing junior programmers, since the senior programmers can get all the work done with the help of Copilot.

But they'll forget that in the future they will need new senior programmers to herd the LLMs.

[–] [email protected] 1 points 4 days ago (1 children)

Some management is going to royally screw up by firing junior programmers, since the senior programmers can get all the work done with the help of Copilot.

This just happened on the team I was on. I'm getting ready to interview for mid-level and senior SWE roles, but was let go from my most recent role a month and a half ago.

[–] [email protected] 2 points 4 days ago

My workplace, which now uses scaled agile, used to be waterfall. We have an enormous system to take care of and loads of specialised knowledge, so we were pretty well siloed.

So obviously, when the salespeople sold agile to the organisation, they also sold the idea that a programmer is a programmer, a designer a designer, a tester a tester; no need for specialists. So in 2015 they spun up 50-odd agile teams in about six trains, one for each major system (where there used to be seven silos in one of those systems), grabbed one senior designer and one senior programmer from each major project, and put them in an "expert" team.

And they told the rest of us we were working on the whole of our giant system. Where we had trouble understanding how part of it worked, we could talk to one of the experts.

Now, nine years later, those experts have mostly retired and we have lost so much institutional knowledge. If someone runs into a wall, you have to hope that somebody wrote a knowledge-transfer document or a wiki page for that bit of the system.