this post was submitted on 17 Mar 2025
1361 points (99.7% liked)

Programmer Humor


Post funny things about programming here! (Or just rant about your favourite programming language.)

[–] [email protected] 12 points 1 day ago (1 children)

Ok, but what did they try to do as a SaaS?

[–] [email protected] 4 points 1 day ago (2 children)

Devil's advocate, not my actual opinion: if you can make a Thing that people will pay to use, easily and without domain-specific knowledge, why wouldn't you? It may hit issues at some point, but by then you've already got ARR and might be able to sell it.

[–] [email protected] 17 points 1 day ago

If you started from first principles and made a car or, in this case, told a flailing intelligence precursor to make a car, how long would it take for it to create ABS? Seatbelts? Airbags? Reinforced fuel tanks? Firewalls? Collision avoidance? OBD ports? Handsfree kits? Side impact bars? Cupholders? Those are things created in response to problems that Karl Benz couldn't have conceived of, let alone solved.

Experts don't just have skills, they have experience. The more esoteric the challenge, the more important that experience is. Without it you'll very quickly find your product failing due to long-solved problems, leaving you - and your customers - exposed to dangers that a reasonable person would conclude shouldn't exist.

[–] [email protected] 3 points 1 day ago* (last edited 1 day ago)

Yeh, arguably and to a limited extent, the problems he's having now aren't the result of the decision to use AI to make his product so much as the decision to tell people about that, and people then deliberately attempting to sabotage it. I'm careful to qualify that, though, because the self-evident flaw in his plan - even if it only surfaced in a rather extreme scenario - is that he lacks the domain-specific knowledge to actually make his product work as soon as anything becomes more complicated than just collecting the money. Evidently there was more to this venture than just building the software; that was necessary for it to be a viable service.

It's much like if you considered yourself the ideas man, paid a programmer to engineer the product for you, fired them straight after without hiring anyone to maintain it, keep the infrastructure going, or provide support for your clients, and then claimed you 'built' the product. You'd be in a similar scenario not long after your first paying customer finds out the hard way that you don't actually know anything about your own service that you willingly took money for. He's discovering he can't actually provide the service part of the Software as a Service he's selling.

[–] dojan 27 points 2 days ago (1 children)

I was listening to my go-to podcast during morning walkies with my dog. They brought up an example where some couple was using ShatGPT as a couples therapist, and what a great idea that was, talking about how one of the podcasters has more of a friend-like relationship with "their" GPT.

I usually find this podcast quite entertaining, but this just got me depressed.

ChatGPT is made by the same company that stole Scarlett Johansson's voice - the same vein of companies that think it's perfectly okay to pirate 81 terabytes of books despite definitely being able to afford to pay the authors. I don't see a reality where it's ethical, or indicative of good judgement, to trust a product from any of these companies with information.

[–] Bazoogle 10 points 2 days ago (1 children)

I agree with you, but I do wish a lot of conservatives used ChatGPT or other AIs more. It will, at the very least, tell them that the batshit stuff they believe is wrong and clear up a lot of the blatant misinformation. With time, will more batshit AIs be released to reinforce their current ideas? Yea. But ChatGPT is trained on enough (granted, stolen) data that it isn't prone to retelling conspiracy theories. Sure, it will lie to you and make shit up when you get into niche technical subjects, or when you ask it to do basic counting, but it certainly wouldn't say Ukraine started the war.

[–] Treczoks 48 points 2 days ago (1 children)

That is the future of AI written code: Broken beyond comprehension.

[–] [email protected] 17 points 2 days ago* (last edited 2 days ago)

Ooh is that job security I hear????

[–] [email protected] 36 points 2 days ago (1 children)

I took a web dev boot camp. If I were to use AI I would use it as a tool and not the motherfucking builder! AI gets even basic math equations wrong!

[–] [email protected] 5 points 1 day ago (1 children)

Can't expect predictive text to be able to do math. You can get it to use a programming language to do it, though. If you ask in a programmatic way it'll generate and run its own code. That's the only way I got it to count the number of r's in strawrbrerry.
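That trick works because the model stops predicting an answer token-by-token and hands the counting off to an interpreter. A minimal sketch of the kind of Python a code-interpreter session might generate for that prompt (illustrative only, not any model's actual output):

```python
# Count how many times "r" appears in the (deliberately mangled) word.
word = "strawrbrerry"
r_count = word.count("r")
print(f"{word} has {r_count} r's")  # prints: strawrbrerry has 5 r's
```

Trivial for actual code; famously unreliable for pure next-token prediction.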

[–] [email protected] 4 points 1 day ago (2 children)

I love strawrbrerry mllilkshakes.

[–] [email protected] 1 points 1 day ago

At the liberry!

[–] [email protected] 1 points 1 day ago

Way better than wrasprbrerry

[–] bitjunkie 26 points 2 days ago

AI can be incredibly useful, but you still need someone with the expertise to verify its output.

[–] [email protected] 290 points 3 days ago (2 children)

Bonus points if the attackers use ai to script their attacks, too. We can fully automate the SaaS cycle!

[–] [email protected] 117 points 3 days ago (4 children)

That is the real dead Internet theory: everything from production to malicious actors to end users is just AI scripts wasting electricity and hardware resources for the benefit of no human.

[–] [email protected] 47 points 3 days ago (10 children)

Seems like a fitting end to the internet, imo. Or the recipe for the Singularity.

[–] [email protected] 138 points 2 days ago (3 children)

Hilarious and true.

Last week some new up-and-coming coder was showing me their tons and tons of sites made with the help of ChatGPT. They all looked great on the front end. So I tried to use one. Error. Tried to use another. Error. I mentioned the errors and they brushed them off. I am 99% sure they do not have the coding experience to fix them. I politely disconnected from them at that point.

What's worse is when a non-coder asks me, a coder, to look over and fix their AI-generated code. My response is "no, but if you set aside an hour I will teach you how HTML works so you can fix it yourself." Not one of these kids asking AI to code things has ever accepted, which, to me, means they aren't worth my time. Don't let them use you like that. You aren't another tool they can combine with AI to generate things correctly without having to learn anything themselves.

[–] [email protected] 61 points 2 days ago

100% this. I've gotten to where, when people try to rope me into their new million-dollar app idea, I tell them there are fantastic resources online to teach yourself to do everything they need. I offer to help them find those resources and even to help when they get stuck. I've probably done this dozens of times by now. No bites yet. All those millions wasted...

[–] [email protected] 1 points 1 day ago

Coder? You haven't been to university, right?

[–] [email protected] 28 points 2 days ago

I've been a professional full-stack dev for 15 years and dabbled for years before that - I can absolutely code and know what I'm doing (and I have used Cursor, and just deleted most of what it made for me when I let it run).

But my frontends have never looked better.

[–] [email protected] 37 points 2 days ago (1 children)

This feels like the modern version of those people who gave out the numbers on their credit cards back in the 2000s and would freak out when their bank accounts got drained.

[–] [email protected] 109 points 2 days ago (2 children)

Ha, you fools still pay for doors and locks? My house is now 100% done with fake locks and doors; they are so much lighter and easier to install.

Wait! Why am I always getting robbed lately? It can't be my fake locks and doors! It has to be weirdos online following what I do.

[–] MisterFrog 5 points 1 day ago

The difference is that locks on doors truly are just security theatre in most cases.

Unless it's the BiLock, which takes the LockPickingLawyer three minutes to pick open.

https://m.youtube.com/watch?v=f5uk6C1iDkQ

[–] [email protected] 165 points 3 days ago (6 children)

AI is yet another technology that enables morons to think they can cut out the middleman of programming staff, only to very quickly realise that we're more than just monkeys with typewriters.

[–] [email protected] 73 points 3 days ago (4 children)

Yeah! I have two typewriters!

[–] [email protected] 49 points 2 days ago (1 children)

Yes, yes, there are weird people out there. That's the whole point of having humans who understand the code: they can correct it.

[–] [email protected] 33 points 2 days ago (2 children)

ChatGPT, make this code secure against weird people trying to crash and exploit it.

[–] Little8Lost 23 points 2 days ago* (last edited 2 days ago)

beep boop
fixed 3 bugs
added 2 known vulnerabilities
added 3 race conditions
boop beeb

[–] [email protected] 37 points 2 days ago

I hope this is satire 😭

[–] [email protected] 57 points 2 days ago (17 children)

The fact that “AI” hallucinates so extensively and gratuitously just means that the only way it can benefit software development is as a gaggle of coked-up juniors that leaves a senior incapable of working on their own stuff because they’re constantly in janitorial mode.

[–] [email protected] 16 points 2 days ago* (last edited 2 days ago) (1 children)

Plenty of good programmers use AI extensively while working. Me included.

Mostly as an advanced autocomplete, template builder, or documentation parser.

You obviously need to be good at it so you can see at a glance whether the written code is good or bullshit. But if you are good, it can really speed things up without any risk, as you will only copy code that you know is good and discard the bullshit.

Obviously you cannot develop without programming knowledge, but with programming knowledge it's just another tool.

[–] Nalivai 12 points 2 days ago (2 children)

I maintain a strong conviction that when a good programmer uses an LLM in their work, they just add more work for themselves, and when a less-than-good one does, they add new, exciting, difficult-to-find bugs while maintaining false confidence in their code and themselves.
I have seen so much code that looks good on first, second, and third glance but is actually full of shit, and I was only able to find that shit through external validation - talking to the dev, brainstorming ways to test it - the things you categorically cannot do with an unreliable random-word generator.

[–] [email protected] 25 points 2 days ago (1 children)

So no change to how it was before then

[–] rtxn 105 points 3 days ago

"If you don't have organic intelligence at home, store-bought is fine." - leo (probably)

[–] electric 64 points 3 days ago (7 children)

Is the implication that he made a super insecure program and left the token for his AI thing in the code as well? Or is he actually being hacked because others are coping?

[–] grue 157 points 3 days ago (3 children)

Nobody knows. Literally nobody, including him, because he doesn't understand the code!

[–] [email protected] 46 points 3 days ago (3 children)

Nah the people doing the pro bono pen testing know. At least for the frontend side and maybe some of the backend.

[–] Charlxmagne 32 points 2 days ago (1 children)

This is what happens when you don't know what your own code does: you lose the ability to manage it. That is precisely why AI won't take programmers' jobs.

[–] [email protected] 34 points 2 days ago (3 children)

I don’t need ai to not know what my code does

[–] [email protected] 11 points 2 days ago

Holy crap, it’s real!

[–] satans_methpipe 22 points 2 days ago

Eat my SaaS

[–] Takumidesh 32 points 2 days ago (18 children)

This is satire / trolling for sure.

LLMs aren't really at the point where they can spit out an entire program - including handling deployment, environments, etc. - without human intervention.

If this person is 'not technical' they wouldn't have been able to successfully deploy and interconnect all of the pieces needed.

The AI may have been able to spit out snippets, and those snippets may be very useful, but as it stands, it's just not going to write the software, stand up the DB, and deploy all of the needed services with no human supervision or overrides. With human guidance, sure, but without someone holding the AI's hand it just won't happen (remember, this person is 'not technical').

[–] [email protected] 29 points 2 days ago (1 children)

idk, I've seen some crazy complicated stuff woven together by people who can't code. I've got a friend who has no job and is trying to make a living off coding while, for 15+ years, being totally unable to learn coding. Some of the things they make are surprisingly complex. Though also - and the person mentioned here may do similarly - they don't ONLY use AI. They use GitHub a lot too. They make nearly nothing themselves, but go through GitHub and basically combine large chunks of code others have made with AI-generated code. Somehow they do it well enough to have done things with servers, cryptocurrency, etc., all while not knowing any coding language.
