this post was submitted on 11 Oct 2023
506 points (92.6% liked)

[–] MargotRobbie 65 points 1 year ago* (last edited 1 year ago) (4 children)

Oh surprise surprise, looks like generative AI isn't going to fulfill Silicon Valley and Hollywood studios' dream of replacing artists, writers, and programmers with computers to maximize value for the poor, poor shareholders. Oh no!

As I said here before, generative AIs are not the universal solution to everything that has ever existed, like they are hyped up to be, but neither are they useless. At the end of the day, they are ultimately tools. Complex, powerful, useful tools, but tools nonetheless. A good artist can create better work faster with the help of a diffusion model, the same way LLM code generation can help a good programmer finish their project faster and better. (I think.)

All of these AI models are trained on data from everyone on the Internet, which is why I think it's reasonable that everyone should have access to these generative AI models for the benefit of humanity and not profit, and not just those who took other people's work for free to train the models. In other words, these generative AI models should belong to everyone.

And here lies my distaste for Sam Altman: OpenAI was founded as a nonprofit for the benefit of humanity, but at the first chance of money he immediately started venture capitalisting and put everything from GPT-2 onwards under lock and key. Now it looks like they are being crushed under the weight of their own operating costs while groups like Facebook and Stability catch up with actual open models. I will not be sad if "Open"AI fails.

(For as much crap as I give Zuck for the other awful things Meta does, I do admire their commitment to open source.)

I have to admit, playing with these generative models is pretty fun.

[–] [email protected] 15 points 1 year ago (1 children)

Hm. I think you should zoom out a bit and try to recognize that AI isn't stagnant.

Voice recognition and translation programs took years before they were ready for real-world applications. AI is likewise going to take years before it's ready. But that time is coming. We haven't reached a ceiling for AI's capabilities.

[–] MargotRobbie 10 points 1 year ago (1 children)

Breakthrough technological development can usually be described as a sigmoid function (an S-shaped curve): there is near-exponential progress in the beginning, but it eventually hits an inflection point, slows down, and plateaus until the next breakthrough.
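To make the shape concrete, here's a toy sketch (my own illustration, not anything from the thread) of the logistic curve and its per-step gains: progress accelerates early, peaks around the midpoint, then yields diminishing returns.

```python
import math

def sigmoid(t, midpoint=0.0, steepness=1.0):
    """Logistic curve: near-exponential growth early, plateau late."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

# Gain per unit of "effort": largest near the midpoint, shrinking after.
growth = [sigmoid(t + 1) - sigmoid(t) for t in range(-5, 5)]
```

Plotting `growth` would show the hump: early steps look exponential, late steps barely move the curve, which is the plateau described above.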

There are certain problems that are not possible to solve at the current level of technology, where development has slowed to a crawl, such as level 5 autonomous driving (by the way, better public transport is a far less complex solution). And I think we are hitting the limit of what transformer-based generative AI can do, since training has become more and more expensive for smaller and smaller gains, whereas hallucination seems to be an inherent problem that is ultimately unfixable at the current level of technology.

[–] [email protected] 3 points 1 year ago (1 children)

One thing that I think could make AI deviate from that S-curve is that it can be honed against itself to magnify improvements. The better it gets, the better the next generation can get.

[–] [email protected] 9 points 1 year ago (1 children)

That is a studied, documented, surefire way to very quickly destroy your model. It just does not work that way. If you train an LLM on the output of another LLM (or itself), it will implode.
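The collapse effect is easy to see in a toy simulation (my own sketch, not from the thread): refit a Gaussian to each generation's samples, but drop the tails the way generative models under-represent rare data, and the distribution's spread shrinks generation after generation.

```python
import random
import statistics

random.seed(0)

def next_generation(samples, n_out=2000):
    # "Train" on the previous generation: fit a Gaussian, but lose the
    # tails -- a stand-in for models under-representing rare events.
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    out = []
    while len(out) < n_out:
        x = random.gauss(mu, sigma)
        if abs(x - mu) <= 2 * sigma:  # truncate the tails
            out.append(x)
    return out

data = [random.gauss(0.0, 1.0) for _ in range(2000)]
spreads = [statistics.stdev(data)]
for _ in range(5):
    data = next_generation(data)
    spreads.append(statistics.stdev(data))
# spreads shrinks each generation: the model "implodes" toward its mean
```

Real model collapse in LLMs is messier than this Gaussian cartoon, but the mechanism, each generation inheriting a narrowed version of the last, is the same.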

[–] [email protected] 3 points 1 year ago

Also, at best it's a refinement, not a new sigmoid. So are new hardware/software designs for even faster dot products, or advancements in network topology within the current framework. T3 networks would be a new sigmoid, but so far all we know is why our stuff fundamentally doesn't scale to the realm of AGI, and the wider industry (and even much of AI research going on in practice) absolutely doesn't care, as there are still refinements to be had on the current sigmoid.

[–] batmangrundies 10 points 1 year ago* (last edited 1 year ago) (1 children)

There was a smallish VFX group here that was attached to a volume screen company. They employed something like 20 people, I think? So pretty small.

But the volume screen company instead employed a guy who could do an adequate enough job with generative tools, and the VFX group folded. The larger VFX company they partnered with had 200 employees; they recently cut to 50.

In my field, a team leader in 2018 could earn about 180,000 AUD p.a. Now those jobs are advertised at 130,000 AUD, because new models can do ~80% of the analysis with human-level accuracy.

AI is already folding companies and cutting jobs. It's not in the news maybe, but as industries shift to compete with smaller firms leveraging AI it will cascade.

I had/have my own company, we were attached to Metropolis which unfortunately folded. I think that had a role to play in the job cuts as well. Luckily for me I wasn't overleveraged, but I am packing up and changing careers for sure.

[–] MargotRobbie 10 points 1 year ago (1 children)

Generative AI can make each individual artist/writer/programmer much more efficient at their job, but if the shareholders and executives get their way and only big companies have access to this technology, this increased productivity will instead be used to reduce headcount and make the remaining people do more work on tighter deadlines, instead of helping everyone work less, do better work, and be happier.

This is the reason I think democratizing generative AI via local models is important, because as your example shows, it levels the playing field between small and big players, and helps people work less while making more cool stuff.

[–] batmangrundies 8 points 1 year ago (3 children)

A big problem in Aus is the industry culture. They don't care about using technology to improve results. They only care about cutting costs, even if the final product doesn't meet the previous standard.

And we've seen that with VFX across the globe, the overall quality dropped drastically. Because studios play silly buggers to weasel out of paying VFX companies what they are due.

From what I hear, even DNEG is in trouble, and they were even before the strike.

It's a race to the bottom it seems.

My honest hope for the film industry is likely the same as yours. That we have smaller productions with access to better post due to improvements in AI-driven compositing software and so on.

But it's likely that a role that was earning $$$ before is devalued significantly. And while I'm an unabashed anti-capitalist, I think a lot of folks misunderstand what this sudden downward pressure on income can do. Cost of living increasing while wages shrink is an awful combination.

I'm 35; I left a six-figure job, I'm folding my company, and I'm starting an electrician's apprenticeship. That should give you an idea of where my views on AI sit. And of course, this is as an Australian. We have a garbage white-collar work culture anyway.

I think there will be a net improvement. But I worry that others will fail to adapt quickly. Too many are writing off AI as this thing that already came and went, but the tools have only just landed, and we don't yet have workflows that correctly implement and leverage them.

[–] [email protected] 7 points 1 year ago

This is exactly why the SAG-AFTRA and WGA strikes have been vitally important, I think. Without pressure on industry, as we've seen across the board in the US for the last near half-century, fewer and fewer things that should improve lives are allowed to do so.

[–] AdrianTheFrog 3 points 1 year ago

It's crazy that with current economic systems, tools that make people work more efficiently have such a negative impact on society.

[–] [email protected] 9 points 1 year ago

Oh surprise surprise, looks like generative AI isn't going to fulfill Silicon Valley and Hollywood studios' dream of replacing artists, writers, and programmers with computers to maximize value for the poor, poor shareholders. Oh no!

It really is incredible how much this rhymes with the crypto hype. To be fair, the technology does actually have uses but, as someone in the latter category, after I saw it in action, I quickly felt less worried about my job prospects.

Fortunately, enough people in charge of staffing seem to have listened to people with technical knowledge, so my earlier prediction (mass layoffs directly due to LLMs, followed by mass, panicked re-hirings when said LLMs ruined the business) hasn't come true. But the worry itself, along with the RTO pushes (not to mention the exploitation of contractors and H1B holders), really underscores how desperately the industry needs to get organized. Hopefully, what's going on in the games industry with IATSE gets more traction and gets more of my colleagues on the same page, but that's one area where I'm not as optimistic as I'd like to be - I'll just have to cheer on SAG, WGA, and UAW for the time being.

(For as much crap as I give Zuck for the other awful things Meta does, I do admire their commitment to open source.)

Absolutely agreed. There's a surprising amount of good in the open source world that has come from otherwise ethically devoid companies. Even Intuit donated the Argo project, which has evolved from a cool workflow tool into a toolkit with far more. There is always the danger of EEE (embrace, extend, extinguish), however, so we've got to stay vigilant.