this post was submitted on 30 Dec 2024
Programmer Humor
People thought COBOL would let managers write code.
What.
It's by design very verbose and "English"-like: instead of x = y * z you would write MULTIPLY y BY z GIVING x. The idea was that it would read almost like natural language, so that non-technical staff could understand it.
Except that it's not the syntax that makes programming hard, it's the thought process, right?
Yes. COBOL can be excused because it was the first time anyone was going down that path. Everything that comes later, less so.
Yeah. And a lot of non-programmers became programmers thanks to COBOL.
I think we're seeing this effect with AI code copilots, as well. It can't replace a programmer, but it can elevate a potential programmer to proficient (in at least some tasks) faster than was possible before.
I know it theoretically means I earn less than I might have, but for my whole career there's been so much more work to do than there are people to do it, anyway.
Yeah. They really need to get off my lawn with this nonsense. We've seen this enough times to know that it'll be great, but still won't solve all our problems.
I mean, syntax is part of it, but it can only help you so much. No matter how you talk about mathematics, you still have to understand what multiplication is, but it certainly does help to write "5x3" rather than "5+5+5".
And even once you know what multiplication is, you still might not know how to use it to make audio louder! (You might say "well, just add to the loudness!", or, if you know it's not that easy, "just multiply it by 2!". But the computer doesn't simply take "audio"; it takes some form of bytes in an array encoding said audio, let's say PCM to keep it simple, and you still need to know how to loop over that and multiply every sample by 2 to double the amplitude.)
Oh, but don't forget clipping. And since you've now increased the signal's dynamic range, the 10¢ tinny speakers at the checkout can't reproduce it, so now you have to account for perceived loudness and normalize toward speech frequencies. And when you get to the shop to install the new firmware, a granny wearing glasses asks "what does the self-checkout menu say?", and now you have a new problem.
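For the curious, the "multiply every sample by 2, then deal with clipping" step might look something like this (a minimal sketch, assuming 16-bit signed PCM in the machine's native byte order; the function name is made up for illustration):

```python
import array

def double_amplitude(pcm_bytes: bytes) -> bytes:
    """Double the amplitude of raw 16-bit signed PCM, clamping at the limits."""
    samples = array.array("h")  # 'h' = signed 16-bit integers
    samples.frombytes(pcm_bytes)
    for i, s in enumerate(samples):
        # Multiply each sample by 2, clamping to the int16 range so
        # loud passages clip instead of wrapping around and crackling.
        samples[i] = max(-32768, min(32767, s * 2))
    return samples.tobytes()
```

Quiet samples double cleanly; anything past half scale pins at the clip point, which is exactly the problem the comment above is joking about.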
Man alive, don't get the managers working with audio. "Doubling the stream" might work if you're using a signed audio format rather than an unsigned one, and if the samples are in the same endianness as the host machine. Neither of which is guaranteed when working with audio.
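To illustrate the signed/unsigned trap: in unsigned 8-bit PCM, silence is the midpoint value 128 rather than 0, so naively doubling the raw byte values mangles even silence. A sketch (assuming unsigned 8-bit mono samples; both function names are illustrative):

```python
def double_u8_wrong(samples):
    # Naive: treats 128 (silence) as signal and blows it up to full scale.
    return [min(255, s * 2) for s in samples]

def double_u8_right(samples):
    # Correct: remove the 128 bias, scale, clamp, then re-add the bias.
    out = []
    for s in samples:
        v = (s - 128) * 2
        v = max(-128, min(127, v))
        out.append(v + 128)
    return out
```

The naive version turns pure silence into a maxed-out DC offset; the bias-aware version leaves silence alone and scales only the deviation from the midpoint.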
But of course, the ear perceives loudness in a logarithmic way (the decibel scale), so for it to be perceived as "twice as loud", it generally needs an exponential increase. Very high and low frequencies need more, since we're less sensitive to them and don't perceive increases so well.
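As a rough illustration of that: amplitude gain relates to decibels by 10^(dB/20), and a common rule of thumb is that about +10 dB is perceived as "twice as loud", which is roughly 3.16x the amplitude, while doubling the samples only buys about +6 dB. A quick sketch:

```python
def db_to_amplitude_gain(db: float) -> float:
    # Amplitude ratio corresponding to a change of `db` decibels.
    return 10 ** (db / 20)

# Doubling the samples is ~+6 dB, but "twice as loud" is commonly
# taken to be about +10 dB, i.e. roughly 3.16x the amplitude.
```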
Exactly!
And, of course, AI doesn't help with the thought process at all, but it did make the syntax much simpler to deal with, once again.
So - once again - people who don't understand what you just pointed out, now believe we don't need programmers anymore. Just like the last several times that we "didn't need programmers anymore", for basically the same reason.
I understand that we rinse and repeat the same nonsense for networking, systems administration, etc., every few years. Some people genuinely believe that computers are someday going to magically start understanding themselves for us.