this post was submitted on 27 May 2024

FreeAssembly


this is FreeAssembly, a non-toxic design, programming, and art collective. post your share-alike (CC SA, GPL, BSD, or similar) projects here! collaboration is welcome, and mutual education is too.

in brief, this community is the awful.systems answer to Hacker News. read this article for a solid summary of why having a less toxic collaborative community is important from a technical standpoint in addition to a social one.

some posting guidelines apply in addition to the typical awful.systems stuff:

(logo credit, with modifications by @[email protected])

founded 7 months ago
this is AI but it felt a lot more guy with broken gear

top 16 comments
[–] [email protected] 18 points 5 months ago (4 children)

While I agree mostly with the brunt of the thesis - 80% of the job is reading bad code and unfucking it, and ChatGPT sucks in all the ways - I disagree with the conclusions.

First, gen AI shifting us towards analysing more bad code to unfuck is not a good thing. It's quite specifically bad. We really don't need more bad code generators. What we need are good docs, slapping genAI as a band-aid for badly documented libraries will do more harm than good. The absolute last thing I want is genAI feeding me with more bullshit to deal with.

Second, this all comes across as an industrialist view on education. I'm sure Big Tech would very much like people to just be good at fixing and maintaining their legacy software, or shipping new bland products as quick as possible, but that's not why we should be giving people a CS education. You already need investigation skills to debug your own code. That 90% of industry work is not creative building of new amazing software doesn't at all mean education should lean that way. 90% of industry jobs don't require novel applications of algebra or analytical geometry either, and people have been complaining that "school teaches you useless things like algebra or trigonometry" for ages.

This infiltration of industry into academia is always a deleterious influence, and genAI is a great illustration of that. We now have Big Tech weirdos giving keynotes on CS conferences about how everyone should work in AI because it's The Future™. Because education is perpetually underfunded, it heavily depends on industry money. But the tech industry is an infinite growth machine; it doesn't care about any philosophical considerations with regards to education; it doesn't care about science in any way other than as a product to be packaged and shipped ASAP to grow revenue, doesn't matter if it's actually good, useful, sustainable, or anything like that. They invested billions into growing a specialised sector of CS with novel hardware and all (see TPUs) to be able to multiply matrices really fast, and the chief uses of that are Facebook's ad recommendation system and now ChatGPT.

This central conclusion just sucks from my perspective:

It’s how human programmers, increasingly, add value.

“Figure out why the code we already have isn’t doing the thing, or is doing the weird thing, and how to bring the code more into line with the things we want it to do.”

While yes, this is why even a "run-of-the-mill" job as a programmer is not likely to be outsourced to an ML model, that's definitely not what we should aspire the added value to be. People add value because they are creative builders! You don't need a higher education to be able to patch up garbage codebases all week, the same way you don't need any algebra or trigonometry to work at a random paper-pushing job. What you do need it for is to become the person that writes the existing code in the first place. There's a reason these are Computer Science programmes and not "Programming @ Big Tech" programmes.

[–] [email protected] 10 points 5 months ago (1 children)

It didn't read to me like she was a fan of this shit at all, but like she was despairing of it and looking for ways to teach actual competence despite it.

[–] [email protected] 7 points 5 months ago* (last edited 5 months ago) (2 children)

I'm probably projecting the baggage of dozens of conversations with people who unironically argue that a CS university should prepare you for working in industry as a programmer, but that's because I can't really discern the author's perspective on this from the text.

In either case,

to teach actual competence despite it

I think my point is that a "competent programmer" as viewed by the industry is a vastly different thing than a "competent computer scientist" in a philosophical sense. Computer science really struggles with this because many things require being both a good engineer and a good scientist. For an analogy, an electrical engineer and a physicist specialising in electrical circuits are two vastly different professions, and you don't need to know what an electron is to do the former. Whereas in computer science, like, you can't build a compiler without knowing your shit both around software engineering and theoretical concepts.

Let me also add that I think I never wrote a post where I would more like people to come and disagree with me. I might be very well talking some bullshit based on my vibes here, since all of this is basically vibes from mingling around with both industry and academia people...

[–] [email protected] 10 points 5 months ago* (last edited 5 months ago)

I mean, that's a problem with the field. Most of your work will be dredging the maintenance sewers, but also you will need to know the computer science, at least to be able to spot an O(n^2) in the wild.

(the sweet spot of algorithmic complexity, so easy to get away with when n is small so you fill your codebase with them, and so certain to fuck you up the moment n gets large)
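A hypothetical Python sketch of that sweet spot (the deduplication task and names are my own illustration, not from the thread): a list-membership check inside a loop is invisible while n is small and quadratic once n grows.

```python
def dedupe_quadratic(items):
    """Accidental O(n^2): 'in' on a list scans it, once per element."""
    seen = []
    out = []
    for x in items:
        if x not in seen:     # O(n) scan inside an O(n) loop
            seen.append(x)
            out.append(x)
    return out

def dedupe_linear(items):
    """Same result, but a set makes membership O(1) on average."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both return identical results on every input, which is exactly why the quadratic one survives review until production n gets large.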

[–] [email protected] 9 points 5 months ago (1 children)

If you keep in mind the original angst of the students (“I have to learn how to use LLMs or I’ll get left behind”), they themselves have a vocational understanding of their degree. And it is sensible to address those concerns practically (though as stated in another comment, I don’t believe in accepting the default use of generative tools).

On a more philosophical note I think STEM fields (and any really general well-rounded education) would benefit from delving (!) deeper in library science/archival science/philosophy and their application to history, and that coincidentally that would make a lot of people better at troubleshooting and legacy code untangling.

[–] [email protected] 7 points 5 months ago (1 children)

would benefit from delving (!) deeper in library science/archival science/philosophy and their application to history

Ooh, would you say more about this? I have opinions, but that’s because I’m a programmer now but formerly a librarian & archivist (on the digital side, it’s more common to go back and forth between them; it’s the same degree).

[–] [email protected] 7 points 5 months ago* (last edited 5 months ago)

I'm afraid my thoughts on the matter aren't that deep or well informed ^^.

In no particular order:

  • I grew up in France, and in my (probably biased) view, it tends a bit more towards teaching "literary" subjects, including for engineering students. I think in general this does indeed develop literacy and critical thinking.
  • France has "professeurs documentalistes" (librarian-teachers) and we call our school libraries "Centres for Documentation and Information" from middle school up, with a few (very) introductory courses on using thesauri, bibliographies, and digital index card tools (this may have become enshittified by the availability of Google since my time there)
  • I have a small Lexicography hobby.
  • I have a small reading old sources hobby.
  • I think more "Traditional" digital search is still incredibly valuable
  • I think principles predating the digital age are still incredibly valuable
  • The way STEM fields are taught is often focused on "one correct answer", and I don't remember much focus being put on where the sources come from, comparing differing sources, or even any emphasis on how we can be certain a given source has been accurately transmitted to the present age in history.
  • I think information retrieval is a vital skill (especially with the enshittification of Google) that practitioners in all fields would benefit from being more comfortable with (though of course it's still its own job).
  • I think software engineers in particular, during their education, would be well served by practical examples of reconciling conflicting or uncertain sources, and I think history is a good lens for that (less abstract than software).

I'd be interested in your perspective!

[–] [email protected] 9 points 5 months ago (1 children)

I didn't get the vibe she agreed with it; I got the sense she was exasperated but practical about it. Her students are career-driven, in a world that told them until two years ago that this expensive credentialing was the key to becoming Silicon Valley rich.

Separately, it's a well-established point of concern that a computer science degree is inapplicable to the work of the vast majority of people who become working, non-academic software engineers, and that while there are valuable things an academic program could teach pre-professional developers that too few engineers understand, that's not the focus of CS. The reality (in the US at least) is that a CS degree is sold as a vocational program by the universities, and many jobs list a CS degree as a requirement or a desired skill. The author's students paid almost $7000 for her course alone. Whether those facts should be true is up for debate, but that's the reality in which the author is teaching.

The author is open that she became a programmer for financial stability, which is the world most of us live in. I enjoy writing code and being creative, but I work in software development to eat.

[–] [email protected] 8 points 5 months ago* (last edited 5 months ago) (1 children)

The reality (in the US at least) is that a CS degree is sold as a vocational program by the universities, and many jobs list a CS degree as a requirement or a desired skill. The author’s students paid almost $7000 for her course alone.

Well, it's very hard for me to have a discussion about philosophical merits of education when the context is the USA where education is so fundamentally fucked. It might as well be that the best course of action for the well-being of students is to make sure they at least get bang for their buck, but that's a systemic problem one level below what I'm talking about even. I don't want to discount this as a reality for actual people on the ground - I think then the correct position is not my waxing philosophical about contents of courses, but rather nailing everyone against free public education in the US government to a fucking wall.

and many jobs list a CS degree as a requirement or a desired skill

This is, I think, a symptom of this push-and-pull between industry and academia. The industry would want to have a CS degree mean that they're getting engineers ready to patch up their legacy code, because they would much rather have the state (or the students themselves in the USA case) pay for that training than having to train their employees themselves. But I suggest that the correct default response to industry's wants is "NO." unless they have some really good points. Google can pay for their employees to learn C++, but they won't pay a dime to teach you something they don't need for their profit margins. Which is precisely the point of public education, teaching you stuff because it's philosophically justified to have a population that knows things, not because they lead to $$$.

[–] [email protected] 5 points 5 months ago* (last edited 5 months ago)

Yeah, that’s a huge problem with private education. If it’s expensive to the student, they want a profit. If the uni is expensive to run and privately funded, they want rich alumni. (And sadly, even in public universities in the US, the funders have a horrifically profit motivated view: the purpose of public education is to produce a highly trained body of workers. The crisis in American higher ed is deep right now; lawmakers and academic administrators fundamentally don’t believe in the humanities.)

Still, part of this is CS’s fault as a field. You mentioned to David the difference between engineering and physics, and in most places, those are different academic fields of study. Both valuable, but different. Why shouldn’t CS do the same?

I’ve found that most of the best working application programmers I’ve worked with have a liberal arts background with a humanities focus, because the training leads to a more holistic view of complex systems, and a better ability to work with potential user needs, and for programming closer to the user in a chaotic system, that can be more useful than understanding NP completeness and context free grammars.

Tl;dr I think we’re violently agreeing with one another. US universities shouldn’t be so aggressively focused on turning out graduates who will become productive, rich worker bees, and using an academic field of study to do so is corrupting the academic field & not ideal for the students.

[–] [email protected] 7 points 5 months ago* (last edited 5 months ago)

From the pov of a slightly exhausted prof who just wants a short-ish answer for her students, the conclusion sorta makes sense, I guess. The students want to convince themselves they aren't wasting their time with genAI and she's not in a position to convince them otherwise, so the next best thing is showing them what industrial life with genAI will be like.

"The future you're dreaming of sucks, so get used to it." isn't a satisfying answer, but it's a forced perspective.

[–] [email protected] 5 points 5 months ago

The article certainly feels blasé ^^. I think the most objectionable part is:

Large language models shift even more of that time into investigation, because the moment the team gets a chance to build, they turn around and ask ChatGPT (or Copilot, or Devin, or Gemini) to do it. When we learn that we need to integrate with google cloud storage, or spaCy, or SQS Queue, or Firebase? Same thing: turn around and ask the LLM to draft the integration.

Now clearly (to me) the author isn't happy about this, but I think they are giving up hope on the direction of the profession too soon. There are still plenty of people happy enough to implement things themselves.

[–] [email protected] 9 points 5 months ago (1 children)

This is a good piece, both on the gap between how gen AI is sold and what it does, and on the reality of what professional programming is.

[–] dohpaz42 9 points 5 months ago (2 children)

… by the time you’ve spent four hours tearing your hair out, … the code … to fix your problem is one, single line.

This, I feel, sums up the reality of professional programming in a nutshell. 🤣

[–] [email protected] 14 points 5 months ago* (last edited 5 months ago)

OK sorry this is rambly but I gotta get these programmer feelings off my chest.... If anything 4 hours is an understatement.


Back in university I once spent an entire week tracking down a latent bug in my program after the professor changed the project requirements a week before the due date. It was an accidental use of = instead of a copy in Java. We're talking every waking moment both in and out of class (I was not the best at debugging back then...).
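A hypothetical sketch of that class of bug, translated to Python (the original was in Java, and the names here are my own illustration): = binds a second name to the same object, so mutations leak through, while an actual copy stays independent.

```python
import copy

original = {"grades": [90, 85]}

alias = original                  # '=' copies the reference, not the data
alias["grades"].append(0)         # silently mutates what 'original' sees too

fresh = copy.deepcopy(original)   # an actual independent copy
fresh["grades"].append(100)       # 'original' is unaffected this time
```

In Java the same trap is assigning a reference where a clone or copy constructor was intended; the program compiles fine and only misbehaves later.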

Now in the working world there are bugs, but they're not just my bugs anymore. Rather, there are decades of bugs piled on top of bugs. The code has dozens of authors, most of whom quit long ago. The ones that remain often have no memory of the code.

Just last week I did a code review of a co-worker's bugfix for a bug introduced in 2008. The fix was non-trivial due to:

  1. The code being a tangled mass of overlapping state and (more importantly)
  2. No one actually remembering anything about the code, or where it is called, or why it is there in the first place, or what the implications of changing it are. Except that it's causing problems (an O(n^2) slowdown harming production) now in 2024.
  3. The original design doc was in the personal folder of the original author (no longer at the company), which was garbage collected years ago.

So reviewing the code involved comparing every iteration of the code: from the initial commit, up to where the bug was introduced, up to the state it was in today before my coworker's fix, and finally my coworker's fix itself. It turns out he got it wrong, and I can't exactly blame him because there is no right in this sort of environment. Fortunately the wrongness was caught by me and whatever meager unit tests were written for it.

This all took maybe half a day for me and a day for my coworker, for 1.5 days of work between the two of us. All to fix a condition which was accidentally negated from what it should have been.
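As an illustration of how small that kind of fix can be (hypothetical; the thread doesn't show the actual code), here is a one-character negation bug of the sort described, sketched in Python:

```python
def should_retry_buggy(status, attempts, max_attempts=3):
    # Accidentally negated condition: only "retries" once attempts are exhausted,
    # so in practice it never retries at all.
    return status >= 500 and attempts >= max_attempts

def should_retry_fixed(status, attempts, max_attempts=3):
    # Intended behaviour: retry server errors while attempts remain.
    return status >= 500 and attempts < max_attempts
```

The diff is a single comparison operator, but without the original design doc, finding out which direction is "intended" is exactly the archaeology described above.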


And this is indeed what LLM for code enthusiasts miss.

Even if the LLM saves some time with writing boilerplate code, it'll inevitably mess up in subtle ways, or programmers will think the LLM can do more than they actually can. Either way they'll end up introducing subtle bugs; so you have a situation where someone saving 20 seconds here or there leads to hours of debugging effort, or worse, at an unpredictable point in the future.

At least with human-written code you can go back and ask them what they were thinking, or read the design doc, or read comments and discussion. Even the most amateurish human-authored code has the spark of life to it. It was in essence a manifestation of someone's wish.

On the other hand with code that's just statistical noise there's no way to tell what it was trying to do in the first place. There is no will / soul / ego in the code, so there is no understanding, so there is no way to debug it short of reverting the whole change and starting over.

[–] [email protected] 11 points 5 months ago* (last edited 5 months ago)

I wish it were always that easy; few things in legacy code maintenance bring me more joy than deleting a single line of code, but the solution is sadly often more involved.

The reality is sometimes more like fighting a hydra spaghetti ball, where felling one bug uncovers/spawns two more.