this post was submitted on 19 Aug 2023
1389 points (98.4% liked)

Technology

[–] severien 4 points 2 years ago* (last edited 2 years ago) (1 children)

The intentionality is provided as a prompt by the human author.

[–] demlet 1 points 2 years ago (1 children)

Yeah, that's fair. It's hard to pinpoint what feels lacking about it, but it does feel lacking somehow to me. I guess for me there's probably a tipping point where it's no longer human enough. Like, just telling an AI to make a candy forest isn't enough. But that's a straw man argument in a way. Of course someone could put a huge amount of effort into getting an AI to render exactly what they're imagining. In the end, it could be seen as just another medium. I have no doubt people are going to find incredible ways of utilizing it.

[–] [email protected] 2 points 2 years ago

to render exactly what they’re imagining

Honestly... no. In practice it doesn't work like that, because while messing about and getting the AI to generate what you want, you look at tons of adjacent stuff the AI comes up with, which then influences what you want to see. And I bet that's something even the 4k nude stunning woman with (large breasts:1.6) faction experiences: it's practically impossible not to enter a dialogue with the tool.