this post was submitted on 11 Apr 2024
1226 points (95.5% liked)

Science Memes

top 50 comments
[–] NounsAndWords 102 points 7 months ago (12 children)

AI is going to destroy art the same way Photoshop, or photography, or pre-made tubes of paint destroyed art. It's a tool: it helps people take the idea in their head and put it in the world. And it lowers the barrier to entry; now you don't need years of practice in drawing technique to bring your ideas to life, you just need ideas.

If AI gets to the point that it can give us creative, original art that sparks emotion in novel ways...well, we probably also made a superintelligent AI, and our list of problems is much different than today's.

[–] [email protected] 42 points 7 months ago (3 children)

As someone who's absolutely terrible at drawing, but enjoys photography and creativity in general, having AI tools to generate my own art is opening up a whole different avenue for me to scratch my creative itch.
I've got a technical background, so figuring out the tools and modifying them for my purposes has been a lot more fun than practicing drawing.

[–] Potatos_are_not_friends 18 points 7 months ago

This is the perfect use case.

Photoshop didn't destroy jobs forever; all it did was shift how people worked, and it actually created work and different types of work.

[–] [email protected] 13 points 7 months ago (1 children)

I've only dabbled a bit with ML art, and I am by no means an artist, but it doesn't scratch that itch for me the same way that drawing or doing stuff in Blender does. It doesn't really feel like I'm watching my vision slowly take shape, no matter how precise I make the prompt. It kinda just feels like what it is: a transformer iterating over some random noise.

I'm also a very technical person, and for years I was stuck in that same mindset of "I'm a technical guy, I'm not cut out for art". I was only able to get out of this slump thanks to some of my art friends, who were really helpful in pointing me in the right direction.

Learning to draw isn't the easiest thing in the world, and trust me I'm probably as bad at it as you are, but it's fun, and it feels satisfying.

I agree that AI has a place as another artistic medium, but I also feel like it can become a trap for people like me who think they don't have an artistic bone in their body.

If you do feel like getting back into drawing, then as a fellow technical person I'd recommend learning Blender first. It taught me some of the skills I also use in drawing, like perspective, shading, and splitting complex objects into simpler shapes. It's also just plain fun.

[–] Gabu 11 points 7 months ago (3 children)

As someone who’s absolutely terrible at drawing

Then practice. Almost no artist was born knowing how to draw or paint; we dedicated countless hours to learning what works and what doesn't.

[–] disguy_ovahea 13 points 7 months ago

As a musician, I couldn't agree more. Talent really helps with initial aptitude, but it will peter out when challenged. That's when real skill development begins. Time and investment connect you to your craft until there's nothing in the world between the two; that's self-actualization.

[–] [email protected] 9 points 7 months ago

But that's not fun for them. You get really good at things you like to do.

[–] braxy29 13 points 7 months ago

i like the idea of AI as a tool artists can use, but that's not a capitalist's viewpoint, unfortunately. they will try to replace people.

[–] [email protected] 74 points 7 months ago (3 children)

Tech bros are not really techies themselves; they are really just Wall Street bros with tech as their product. Most claim they can code, but if they were coders they would be coding. They are not coders, they are businessmen through and through who just happen to sell tech.

[–] [email protected] 28 points 7 months ago

This is 100% correct. It can overlap but honestly as someone going into embedded systems I despise tech bros.

[–] [email protected] 22 points 7 months ago (1 children)

Most claim they can code, but if they were coders they would be coding

I dislike techbros as much as you, but this isn't really a valid statement.

I can code, but I can't sell a crypto scam to millions of rubes.

If I could, why would I waste my time writing code?

Many techbros are likely "good enough" coders who have better marketing skills and used their tech knowledge to leverage into business instead.

[–] [email protected] 18 points 7 months ago

That is the thing, though. The really talented tech people tend to be more in the weeds of the tech and get great enjoyment from that. The "tech bros" are more into groups, people, social structures, manipulation, controlling and such, and would go cross-eyed if they really had to code something complex, as they could never sit that long and concentrate. These are not the same people. Tech bros want you to think they are tech gurus because that is their brand, but it is a lie.

[–] rustyfish 41 points 7 months ago (14 children)

I think approximation is the right word here. It's pretty cool and all, and I'm looking forward to seeing how it will develop. But it's mostly a fun toy.

I'm stoked for the moment the tech bros understand that an AI is way better at doing their job than it is at creating art.

[–] [email protected] 16 points 7 months ago

A tech bro's job is to write bad JavaScript and fall for scams; AI has already beaten that.

[–] [email protected] 10 points 7 months ago

I think one thing you and many other people misunderstand is that the image generation aspect of AI is a sideshow, both in use and in intent.

The ability to generate images from text-based prompts is basically a side effect of the capability they are actually spending billions on, which is object detection.

[–] [email protected] 8 points 7 months ago (1 children)

It's bad at anything useful for programming too.

[–] EnderMB 40 points 7 months ago (7 children)

I work in AI. LLMs are cool and all, but I think it's mostly hype at this stage. While some jobs will be lost (voice work, content creation), my true belief is that we'll see increases in two things:

  1. The release of productivity tools that use LLMs to help automate or guide menial tasks.

  2. The failure of businesses that try to replicate skilled labour using AI.

To stop point two, I would love to see people and lawmakers really crack down on AI replacing jobs, regulating the process of replacing job roles with AI until it can sufficiently replace a person. If, for example, someone cracks self-driving vehicles, then it should be the responsibility of the owning companies and the government to provide training and compensation to allow everyone being "replaced" to find new work. This isn't just to stop people from suffering, but to stop the idiot companies that'll sack their entire HR department, automate it via AI, and then get sued into oblivion because it discriminated against someone.

[–] Donkter 12 points 7 months ago (2 children)

I've also heard that, as far as we can figure, we've basically reached the limit on certain aspects of LLMs already. Basically, LLMs need a FUCK ton of data to be good. And we've already pumped them full of the entire internet, so all we can do now is marginally improve these algorithms whose inner workings we barely understand. Think about that: the entire internet isn't enough to successfully train LLMs.

LLMs have taken some jobs already (like audio transcription, basic copyediting, and aspects of programming); we're just waiting for the industries to catch up. But we'll need to wait for a paradigm shift before they start producing pictures and books or doing complex technical jobs with few enough hallucinations that we can successfully replace people.

[–] prime_number_314159 9 points 7 months ago (1 children)

The (really, really, really) big problem with the internet is that so much of it is garbage data. The number of false and misleading claims spread endlessly on the internet is huge. To rule those beliefs out of the data set, you need something that can grasp the nuances separating published, peer-reviewed data, deliberately misleading propaganda, fringe conspiracy nuts who believe the Earth is controlled by lizards with planes and that only a spritz bottle full of vinegar can defeat them, and everything in between.

There is no person, book, journal, website, newspaper, university, or government that has reliably produced good, consistent help on questions of science, religion, popular lies, unpopular truths, programming, human behavior, economic models, and many, many other things that continuously have an influence on our understanding of the world.

We can't build an LLM that won't consistently be wrong until we can stop being consistently wrong.

[–] EnderMB 8 points 7 months ago (1 children)

My own personal belief is very close to what you've said. It's a technology that isn't new, but it had been assumed not to be as good as compositional models, because it would cost a fuck-ton to build and would result in dangerous hallucinations. It turns out that both are still true, but people don't particularly care. I also believe that one of the reasons ChatGPT has performed so well compared to other LLM initiatives is that there is a huge amount of stolen data that would get OpenAI in a LOT of trouble.

IMO, the real breakthroughs will be in academia. Now that LLMs are popular again, we'll see more research into how they can be better utilised.

[–] [email protected] 7 points 7 months ago (2 children)

I sincerely doubt AI voice-over will outperform human actors in the next 100 years by any metric, including cost or time savings.

[–] [email protected] 7 points 7 months ago

Nah, fuck HR; they're the shield companies hide behind to discriminate within margins.

I think the proper route is a labor-replacement tax to fund retraining and replacement pensions.

[–] [email protected] 40 points 7 months ago (10 children)

There are plenty of things you can shit on AI art for.

But it is neither a bad approximation, nor something a student could produce in less than a minute.

This feels like the other extreme from the tech bros.

[–] [email protected] 13 points 7 months ago

To me, this feels similar to when photography became a thing.

Realist painting took a dive. Did photos capture realism? Yes. Did they take the same amount of time and training? Hell no.

I think it will come down to what the specific consumer wants. If you want fast, you use AI. If you want the human-made aspect, you go with a manual artist. Do you prefer fast turnover, or do you prefer sentiment and effort? Do you prefer pieces from people who master their craft, or from AI?

I'm not even sorry about this. They are not the exact same, and I'm sick of people saying that AI art and handcrafted art are the exact same. Even if you argue that it takes time to finesse prompts, I can practically promise you that the difference in time needed to create the two will be drastic. Both may have their place, but they will never be the exact same.

It's the difference between a hand-knitted sweater from someone who has done it their entire life and a sweater from Walmart. It's a hand-crafted table from an expert vs. something you get from IKEA.

Yes, both check the boxes, but they are still not the exact same product. They each have their place.

On the other hand, I won't pretend the hours required to master the two methods are the same. AI also usually doesn't have to factor in materials, training, hourly rate, etc.

[–] [email protected] 36 points 7 months ago (2 children)

Art itself isn't useless; it's just incredibly replicable. There is so much good art out there that people don't need to consume crap.

It's like saying there is no money in being a footballer. Of course there is loads of money in being a footballer. But most people that play football don't make any money.

[–] SanndyTheManndy 29 points 7 months ago (7 children)

Billions were spent inventing and producing the calculator device.

Human calculators are now extinct.

Complex calculations are far more accessible.

[–] [email protected] 25 points 7 months ago (10 children)

Turing Incompleteness is a pathway to many powers the Computer Scientists would consider incalculable.

[–] [email protected] 8 points 7 months ago (2 children)

Is it possible to learn this power?

[–] [email protected] 22 points 7 months ago

I just love the idjits who think that not showing empathy to the people AI bros are trying to put out of work will save them when the algorithms come for their jobs next.

When LeopardsEatingFaces becomes your economic philosophy

[–] Gabu 19 points 7 months ago (31 children)

That's a pretty shit take. Humankind spent nearly 12 thousand years figuring out the combustion engine. It took 1 million years to figure out farming. Compared to that, less than 500 years to create general intelligence will be a blip in time.

[–] braxy29 42 points 7 months ago (9 children)

i think you're missing the point, which i took as this - what arts and humanities folks do is valuable (as evidenced by efforts to recreate it) despite common narratives to the contrary.

[–] [email protected] 12 points 7 months ago (1 children)

Really only around 80 years between the first machines we'd consider computers and today's LLMs, so I'd say that's pretty damn impressive

[–] [email protected] 18 points 7 months ago* (last edited 7 months ago) (2 children)

I propose that we treat AI as ancillas, companions, muses, or partners in creation and understanding our place in the cosmos.

While there are pitfalls in treating the current generation of LLMs and GANs as sentient, or any AI for that matter, there will come a day when we must admit that an artificial intelligence is, practically speaking, self-aware and sentient.

To me, the fundamental question about AI, that will reveal much about humanity, is philosophical as much as it is technical: if a being that is artificially created, has intelligence, and is functionally self-aware and sentient, does it have natural rights?

[–] thedeadwalking4242 17 points 7 months ago (21 children)

Honestly, people are desperately trying to automate physical labor too. The problem is that the machines don't understand the context of their work, which can cause problems. All the work on AI is a result of trying to make a machine that can. The art and humanities stuff is more of a side project.

[–] [email protected] 14 points 7 months ago* (last edited 7 months ago) (4 children)

they're misunderstanding the reasoning for spending billions.

the reason to spend all the money on approximation is so we can remove arts and humanities majors altogether. after enough approximation yields results similar to present-day chess programs, which now regularly beat humans and grandmasters, their vocation is doomed to the niche, like most of humanity, eventually.

[–] [email protected] 25 points 7 months ago (4 children)

Imagine seeing writing and art as purely functional activities.

[–] [email protected] 13 points 7 months ago* (last edited 7 months ago)

Matthew Dow Smith, whoever the fuck that is, has a sophisticated delusion about what's actually going on, and he's incorporated it into his persecution complex. Not impressed.
