This post was submitted on 01 Aug 2023
525 points (82.1% liked)
Technology
User error; be more specific.
Yeah, they forgot to add "don't change my ethnicity" to the prompt. Normal shit, right?
Yes. Or even better, just add "Asian" to the prompt. It's just a tool, and tools are flawed.
"Don't change my ethnicity" would do nothing, as these programs can not get descriptions from images, only create images from descriptions. It has no idea that the image contains a woman, never mind an Asian woman. All it does is use the image as a starting point to create a "professional photo". There absolutely is training bias and the fact that everyone defaults to pretty white people in their 20-30s is a problem. But this is also using the tool badly and getting a bad result.
It would be the same if the user wanted to preserve or highlight any other feature: simply specify what the output needs to look like. Ask for nothing but "LinkedIn professional" and you get the average LinkedIn professional.
It's like being surprised the output looks Asian when you ask it to look like a WeChat user.
If you don't want your ethnicity to change, I'd say it's vital.