this post was submitted on 25 Feb 2024
203 points (83.7% liked)

Technology


Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

"Don't learn to code," advises Jensen Huang of Nvidia. Thanks to AI, everybody will soon become a capable programmer simply by using human language.

top 50 comments
[–] [email protected] 258 points 10 months ago (16 children)

Founder of company which makes major revenue by selling GPUs for machine learning says machine learning is good.

[–] muntedcrocodile 63 points 10 months ago (2 children)

I worry for the future generations that can't debug because they don't know how to program and just use AI.

[–] NocturnalEngineer 18 points 10 months ago

Don't worry, they'll have AI animated stick figures telling them what to do instead...

[–] ThePowerOfGeek 58 points 10 months ago (5 children)

Having used ChatGPT to try to find solutions to software development challenges, I don't think programmers will be at much risk from AI for at least a decade.

Generative AI is great at many things, including assistance with basic software development tasks (like spinning up blueprints for unit tests). And it can be helpful at filling in code gaps when provided with a very specific prompt... sometimes. But it is not great at figuring out the nuances of even mildly complex business logic.
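
To make the "blueprints for unit tests" case concrete, here's a minimal sketch of the kind of scaffold such a tool can reliably spin up (the `apply_discount` function and its tier rules are hypothetical, invented for illustration, not from the thread):

```python
import unittest

# Hypothetical business logic -- the part the comment above says an
# LLM still gets subtly wrong: a tiered loyalty discount.
def apply_discount(price: float, tier: str) -> float:
    rates = {"gold": 0.10, "silver": 0.05}
    return round(price * (1 - rates.get(tier, 0.0)), 2)

# The kind of blueprint an LLM spins up well: one test per branch,
# with names and structure ready for a human to add edge cases.
class TestApplyDiscount(unittest.TestCase):
    def test_gold_tier(self):
        self.assertEqual(apply_discount(100.0, "gold"), 90.0)

    def test_silver_tier(self):
        self.assertEqual(apply_discount(100.0, "silver"), 95.0)

    def test_unknown_tier_pays_full_price(self):
        self.assertEqual(apply_discount(100.0, "bronze"), 100.0)
```

Run with `python -m unittest`. The point isn't that the tests are clever; it's that this boilerplate is exactly the kind of work worth delegating, while the discount logic itself still needs a human who understands the business rules.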

[–] DacoTaco 23 points 10 months ago* (last edited 10 months ago) (1 children)

This.
I got a GitHub Copilot subscription at work, and it's useful for suggesting code in small parts, but I would never let it decide what design pattern to use to tackle the problem we are solving. Once I know the solution, I can use AI and verify its output before it goes into the code.

[–] fidodo 58 points 10 months ago (2 children)

As a developer building on top of LLMs, my advice is to learn programming architecture. There's a shit ton of work that needs to be done to get this unpredictable, non-deterministic tech to work safely and accurately. This is like saying "get out of tech" right before the Internet boom. The hardest part of programming isn't writing low-level functions, it's architecting complex systems while keeping them robust, maintainable, and expandable. By the time an AI can do that, all office jobs are obsolete. AIs will be able to replace CEOs before they can replace system architects.

Programmers won't go away, they'll just have less busywork to do and will instead need to work at a higher level. But the complexity of those higher-level requirements is about to explode, and we will need LLMs to do the simpler tasks, with our oversight to make sure everything gets integrated correctly.
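
The "work to make non-deterministic tech safe and accurate" can be sketched concretely. Below is a hypothetical guardrail layer: the `call_llm` callable and the `item`/`quantity` schema are stand-ins invented for illustration, not any real API:

```python
import json

def extract_order(call_llm, prompt: str, retries: int = 3) -> dict:
    """Validate non-deterministic LLM output and retry on failure.

    `call_llm` is any callable taking a prompt string and returning
    the model's raw text -- injected so the guardrail stays testable.
    """
    for _ in range(retries):
        raw = call_llm(prompt)
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # model returned non-JSON; ask again
        # Schema check: reject output missing the required fields.
        if isinstance(data, dict) and {"item", "quantity"} <= data.keys():
            return data
    raise ValueError("no valid output after %d attempts" % retries)

# Usage with a fake model that fails once, then succeeds:
responses = iter(["not json", '{"item": "widget", "quantity": 2}'])
order = extract_order(lambda p: next(responses), "order two widgets")
# order == {"item": "widget", "quantity": 2}
```

None of this is the model's job; it's the surrounding architecture the comment is talking about, and it's ordinary engineering work.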

I also recommend still learning the fundamentals, just maybe not as deeply as you used to need to. Knowing how things work under the hood still helps immensely with debugging and with creating better, more efficient architectures, even at a high level.

I will say, I do know developers who specialized in algorithms and are feeling pretty lost right now. But they're perfectly capable of adapting their skills to the new paradigm; their issue is more the personal one of deciding what they want to do, since they were passionate about algorithms.

[–] Wooki 53 points 10 months ago* (last edited 10 months ago) (2 children)

This overglorified snake oil salesman is scared.

Anyone who understands how these models work can see plain as day that we have reached peak LLM. It's enshittifying on itself, and we are seeing its decline in real time in the quality of generated content. Don't believe me? Go follow some senior engineers.

[–] [email protected] 20 points 10 months ago (2 children)

Any recommendations whom to follow? On Mastodon?

[–] thirteene 16 points 10 months ago (5 children)

There is a reason they didn't offer specific examples. LLMs can still scale by size, logical optimization, training optimization, and, more importantly, integration. The current implementation is reaching its limits, but the pace of growth is also very fast. AI reduces workload, but it is likely going to require designers and validators for a long time.

[–] Animated_beans 12 points 10 months ago (1 children)

Why do you think we've reached peak LLM? There are so many areas with room for improvement

[–] [email protected] 51 points 10 months ago

Lmao, do the opposite of whatever this guy says; he only wants his 2-trillion-dollar stock market bubble not to burst.

[–] kescusay 48 points 10 months ago (1 children)

Well. That's stupid.

Large language models are amazingly useful coding tools. They help developers write code more quickly.

They are nowhere near being able to actually replace developers. They can't know when their code doesn't make sense (which is frequently). They can't know where to integrate new code into an existing application. They can't debug themselves.

Try to replace developers with an MBA using a large language model AI, and once the MBA fails, you'll be hiring developers again - if your business still exists.

Every few years, something comes along that makes bean counters who are desperate to cut costs, and scammers who are desperate for a few bucks, declare that programming is over. Code will self-write! No-code editors will replace developers! LLMs can do it all!

No. No, they can't. They're just another tool in the developer toolbox.

[–] paf0 12 points 10 months ago (8 children)

I've been a developer for over 20 years, and when I see AutoGen generate code, decide to execute that code, and then fix errors by deciding to install dependencies, I can tell you I'm concerned. LLMs are a tool, but a tool that might evolve to replace us. I expect a lot of software roles in ten years to look more like an MBA orchestrating AI agents to complete a task. Coding skills will still matter, but not as much as soft skills will.

[–] eager_eagle 48 points 10 months ago* (last edited 10 months ago) (2 children)

the day programming is fully automated, other jobs will be too.

maybe it'd make more sense if he suggested becoming a blue collar worker instead.

[–] Ghostalmedia 20 points 10 months ago (4 children)

Humans can probably still look forward to back-breaking careers of manual labor consisting of complex, varied movements!

[–] filister 47 points 10 months ago* (last edited 10 months ago) (5 children)

Remember when everyone was predicting that we were a couple of years away from fully self-driving cars? I think we are now a full decade past those couple of years, and I don't see any fully self-driving cars on the road taking over from human drivers.

We are now in the honeymoon phase of AI, and I can only assume there will be a huge downward correction in some AI stocks that are overvalued and overhyped, like NVIDIA. They are like crypto stocks: on the moon now, back to Earth tomorrow.

[–] SlopppyEngineer 18 points 10 months ago* (last edited 10 months ago)

Two decades. The DARPA Grand Challenge was in 2004.

Yeah, everybody always forgets the hype cycle and the peak of inflated expectations.

[–] paf0 13 points 10 months ago (1 children)

Waymo exists and is now moving passengers around in three major cities. It's not taking over yet, but it's here and growing. The timeframe didn't meet the hype, but the technology is there.

[–] filister 17 points 10 months ago* (last edited 10 months ago)

Yes, the technology is there, but it is not Level 5; it is 3.5-4 at best.

The point with a fully self-driving car is that complexity increases exponentially once you reach 98-99%, and the last 1-2% is extremely difficult to crack, because there are so many corner cases you can't really predict, and you need to make a car that drives more safely than humans if you really want to commercialize this service.

Same with generative AI: the leap at first was huge, but the jump from GPT-3.5 to 4, or even 3 to 4, wasn't so great. And I can only assume that from now on, achieving progress will get exponentially harder; it will require as-yet-unknown algorithms and models, and advances will be a lot more modest.

And I don't know about you, but ChatGPT isn't 100% correct, especially on niche questions or more complex queries; it often hallucinates, and sometimes those hallucinations sound extremely plausible.

[–] madcaesar 37 points 10 months ago

This seems as wise as Bill Gates supposedly claiming 4MB of RAM is all you'll ever need, back in '98 🙄

[–] gornius 37 points 10 months ago (3 children)

It's just as crazy as saying "We don't need math, because every problem can be described using human language".

In other words, that might only be true as long as your problem is simple enough to be fully described using human language.

You want to solve a real problem? It's way more complex, with so many moving parts that you can't just throw an LLM at it, because solving it takes an actual understanding of the problem.

[–] Fandangalo 11 points 10 months ago (1 children)

Maybe more apt for me would be, “We don’t need to teach math, because we have calculators.” Like…yeah, maybe a lot of people won’t need the vast amount of domain knowledge that exists in programming, but all this stuff originates from human knowledge. If it breaks, what do you do then?

I think someone else in the thread said good programming is about the architecture (maintainable, scalable, robust, secure). Many LLMs are legit black boxes, and it takes humans to understand what's coming out, why, and whether it's valid.

Even if we have a fancy calculator doing things, there still needs to be people who do math and can check. I’ve worked more with analytics than LLMs, and more times than I can count, the data was bad. You have to validate before everything else, otherwise garbage in, garbage out.
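
The "validate before everything else" step from analytics work might look like this minimal sketch (the row fields and thresholds are hypothetical, chosen for illustration):

```python
def validate_rows(rows):
    """Drop garbage rows before any analysis touches them."""
    clean = []
    for row in rows:
        if row.get("user_id") is None:            # missing key field
            continue
        if not (0 <= row.get("age", -1) <= 120):  # impossible value
            continue
        clean.append(row)
    return clean

rows = [
    {"user_id": 1, "age": 34},
    {"user_id": None, "age": 50},   # no identifier -- rejected
    {"user_id": 2, "age": 999},     # impossible age -- rejected
]
clean = validate_rows(rows)
# Only the first row survives: garbage in, nothing out.
```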

It sounds like a poignant quote, but it also feels superficial. Like something a smart person would say to a crowd to make them go "Ahh!" but that doesn't hold water for long.

[–] [email protected] 34 points 10 months ago (2 children)

I don't think he's seen the absolute fucking drivel that most developers have been given as software specs before now.

Most people don't even know what they want, let alone how to describe it. I've often been given a mountain of stuff, only to go back and forth with the customer to figure out what problem they're actually trying to solve, and then solve it in like 3 lines of code, in a way that doesn't break everything else or tie a maintenance albatross around my neck for the next ten years.

[–] [email protected] 34 points 10 months ago (5 children)

I don't see how it would be possible to completely replace programmers. The reason we have programming languages instead of using natural language is that the latter has ambiguities. If you start having to describe your software's behaviour in natural language, then one of three things can happen:

  1. either this new natural programming language has to make assumptions about what you intend, and thus will only be capable of outputting a certain class of software (i.e. you can't actually create anything new),
  2. or you need to learn a new way of describing things unambiguously, and now you're back to programming but with a new language,
  3. or you spend forever going back and forth with the generator until it gives you the output you want, and this would take a lot longer to do than just having an experienced programmer write it.
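
The ambiguity in point 1 shows up even in toy requests; here's a sketch (example mine, not from the comment) where the same English sentence admits two different programs:

```python
# "Give me the top three scores" -- which of these did the user mean?
scores = [70, 95, 95, 88, 95]

# Interpretation 1: the three highest entries, duplicates included.
top_entries = sorted(scores, reverse=True)[:3]        # [95, 95, 95]

# Interpretation 2: the three highest distinct values.
top_distinct = sorted(set(scores), reverse=True)[:3]  # [95, 88, 70]

# A programming language forces the choice; English quietly doesn't.
```
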
[–] ReplicaFox 16 points 10 months ago

And if you don't know how to code, how do you even know if it gave you the output you want until it fails in production?

[–] [email protected] 31 points 10 months ago (2 children)

I think this is bullshit regarding LLMs, but making and using generative tools that are more and more high-level and understandable for users is a good thing.

Like various visual programming tools, where you sketch something working via connected blocks (like Pure Data for sound), or MATLAB, where I think one can use such constructors to generate code for specific controllers involved in the scheme, or LabVIEW.

Or like HyperCard.

Not that anybody should stop learning anything. There's a niche for every way to do things.

I just like that class of programs.

[–] [email protected] 27 points 10 months ago (1 children)

Jensen fucking Huang is a piece of shit and chock-full of it too.

Actually, AI can replace this dick at a fraction of the cost, instead of replacing developers. Bring out the guillotine, mfs.

[–] gaifux 19 points 10 months ago (1 children)

Your vulgarity and call to violence are quite convincing, sir. Mayhaps you moonlight as a bard?

[–] Evotech 25 points 10 months ago (1 children)
[–] SuckMyWang 16 points 10 months ago

Why would he lie? Other than to pump the company's shares.

[–] [email protected] 24 points 10 months ago

I can kind of see his point, but the things he is suggesting instead (biology, chemistry, finance) don't make sense for several reasons.

Besides the obvious "why couldn't AI just replace those people too?" (even though it may take an extra few years), there is also the question of how many people can actually develop deep enough expertise to make meaningful contributions there, if we're talking about a massive increase in the number of people going into those fields.

[–] JeeBaiChow 22 points 10 months ago

I mean, why have a CS degree when an AI subscription costs $30/month?

/s

[–] [email protected] 16 points 10 months ago

After using Copilot and other AI code tools, it's easy to see their limitations. Programming is a lot more than just writing "ok" code.

[–] RagingSnarkasm 14 points 10 months ago (1 children)

There’s good money to be made in selling leather jackets.

[–] [email protected] 14 points 10 months ago (1 children)

I think the Jensen quote loosely implies we don't need to learn a programming language, but the logic was flimsy. Same goes for the author, as they backtrack a few times. Not a great article, in my opinion.

[–] [email protected] 24 points 10 months ago* (last edited 10 months ago)

Jensen's just trying to ride the AI bubble as far as it'll go; next he'll tell you to forget about driving or studying.

[–] Modern_medicine_isnt 12 points 10 months ago (1 children)

It's not really about the coding, it's about the process of solving the problem. And AI is very far away from being able to do that. The language you learn to code in is probably not the one you will use for much of your life. It will just get replaced by whichever AI you use to code.

[–] Dkarma 13 points 10 months ago

Yep. The best guy on my team isn't the best coder. He's the best at visualizing the complete solution and seeing pinch points in his head.

[–] 3volver 11 points 10 months ago

Don't tell me what to do. Going to spend more time learning to code from now on, thanks.
