Asklemmy

A loosely moderated place to ask open-ended questions

If your post meets the following criteria, it's welcome here!

  1. Open-ended question
  2. Not offensive: at this point, we do not have the bandwidth to moderate overtly political discussions. Assume best intent and be excellent to each other.
  3. Not about using or getting support for Lemmy: for that, see the list of support communities and tools for finding communities below
  4. Not ad nauseam inducing: please make sure it is a question that would be new to most members
  5. An actual topic of discussion

Looking for support?

Looking for a community?


First and foremost, this is not about AI/ML research, only about its use in generating content that you would potentially consume.

I personally won't mind automated content if/when it reaches the quality of current human-generated content. Some of that is probably achievable in the not-too-distant future, such as narrating audiobooks (though it is nowhere near human quality right now), or partially automating music/graphics with gen AI, which we have kind of accepted by now. We don't complain about low-effort, minimal, or AI-generated thumbnails or stock photos; we usually don't care about their artistic value either. But I'm highly skeptical that anything genuinely creative or insightful could be produced anytime soon, and we have already developed a good filter for slop in our brains just by dwelling on the 'net.

So what do you guys think?

Edit: Originally I asked this question thinking only about the quality aspect, but many responses consider the ethical side as well. Cool :).

We already had a derivative-work model for many-to-one intellectual works (such as a DJ playing a collection of tracks by other artists) with a practical credit and compensation mechanism. With gen AI trained on unethically (and often illegally) sourced data, we don't know which inputs produced which outputs, and there's no practical way to credit or compensate the original authors.

So maybe to reframe the question: if it is used non-commercially or via some fair-use mechanism, would you still reject content, regardless of quality, because it is AI generated? Or where is the boundary for that?

[โ€“] [email protected] 22 points 5 days ago (2 children)

There are two core issues I have with AI generated content:

  1. Ownership - All the big players are using proprietary software, weights, models, training methods, and datasets to generate these models. On top of that lack of visibility, they have farmed millions of people's data and content without their knowledge or consent. If it were up to me, all AI research and software would be 100% open source, publicly accessible, and free of copyright. That includes all theoretical work in scientific publications, all code, all the datasets, the weights, the infrastructure and training methods, absolutely everything.

  2. Lowest common denominator - AI has unleashed the ability for individuals and organizations to produce extremely low effort content at volumes that haven't been seen before. I hate how search results are becoming totally poisoned by AI slop. You just get pages and pages of sites that abuse SEO to become the top search result and are nothing more than click-farms to generate ad revenue. This is a systemic issue that stems from several things, primarily Capitalism, but also the way we cater to powerful corpos that push this sludge onto us.

I have no issue with AI tools that are actually helpful in their context. For instance, animation software that uses AI to help generate intermediate frames from your initial drawings. Screen reader software that uses AI to help sight-impaired folks with more accurate text-to-speech. AI tools that help with code completion, or debugging.

These are all legitimate uses of the technology, but sadly, all of that is being overshadowed by mountains of sludge being shoved on us at every level, because those implementations aren't going to make rich people even richer, and they aren't going to tempt investors to dump billions more into AI startups and corpo tech. Helping blind people and indie animation studios is boring and low-profit, so in a Capitalist system it gets shoved to the bottom of the stack while the high-margin slop gets pumped down our throats.

[โ€“] spongebue 5 points 5 days ago

Very well said. I think at the end of the day, the human element is too easy to overlook, and that's a problem. We have one bot, a search engine, keeping an eye out for content. SEO wants to stand out to that bot, so it demands content be created (and in a certain way) so the search engine picks it up... But that takes effort, so we have another bot creating content just to get the attention of the first. The thing a person actually wants becomes an afterthought, and dead Internet theory gets that much more real.
