Jesus Christ, I knew this was a problem with image editing software back then, but I never knew that GPU manufacturers fucked it up as well. How did this happen?
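For anyone wondering what the actual mistake looks like, here's a minimal sketch (my own illustration, using the common gamma-2.2 approximation rather than the exact sRGB piecewise curve): averaging two pixels directly on their gamma-encoded values versus converting to linear light first.

```python
# Toy illustration of gamma-incorrect blending, not anyone's actual GPU code.
# Values are normalized to 0..1; gamma 2.2 approximates the sRGB curve.

def srgb_to_linear(v):
    return v ** 2.2

def linear_to_srgb(v):
    return v ** (1 / 2.2)

black, white = 0.0, 1.0

# Buggy path: math done directly on the gamma-encoded values.
naive = (black + white) / 2

# Correct path: decode to linear light, average, re-encode.
correct = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

print(round(naive * 255))    # 128: the too-dark gray that buggy scalers produce
print(round(correct * 255))  # 186: the perceptually correct 50% blend
```

That darkening is exactly what shows up when images with fine black-and-white detail get downscaled by a gamma-unaware resizer.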
I have a good guess as to how this would actually happen:
PM: We need this
Specialist: makes this (doesn't check results)
QC: Looks good (but doesn't actually check)
Later updates may break the functionality further. And as long as the numbers aren't blatantly wrong (think 0s everywhere, for example) and nobody checks thoroughly enough, the issue will remain.
I've unfortunately been part of a story like that, haha. There are ways to counter it: in this case, either their project documentation wasn't up to par, or it wasn't used as a reference during development and testing. Either way, it's negligence.
I imagine that in the case of GPU design, there should be a bunch of image-correctness tests at some point, which would require pixel-perfect reproduction to pass.
But it's plausible that the tests were running the same incorrect math too.
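To illustrate that trap (a toy sketch with made-up function names, not anyone's real test suite): a pixel-perfect comparison is only as trustworthy as its reference. If the reference is generated with the same gamma-space math as the implementation, the test goes green and the bug ships anyway.

```python
# Sketch of a "pixel-perfect" correctness check that silently passes.
# Single-pixel-pair stand-in for a full downscale; gamma 2.2 approximation.

def srgb_to_linear(v):
    return v ** 2.2

def linear_to_srgb(v):
    return v ** (1 / 2.2)

def downscale_naive(a, b):      # what the buggy scaler does
    return (a + b) / 2

def downscale_correct(a, b):    # average in linear light
    return linear_to_srgb((srgb_to_linear(a) + srgb_to_linear(b)) / 2)

implementation_output = downscale_naive(0.0, 1.0)

# Trap: reference produced with the same incorrect math as the implementation.
reference = downscale_naive(0.0, 1.0)
assert implementation_output == reference   # pixel-perfect match, test passes

# A ground-truth reference computed in linear light would have caught it:
ground_truth = downscale_correct(0.0, 1.0)
assert abs(implementation_output - ground_truth) > 0.2   # the bug is this big
```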
"They are just human after aaaall 🎶"
Laziness
Linear is probably a lot faster?
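That's the usual explanation, yeah. A rough sketch of where the savings come from (not a benchmark of any real hardware): the gamma-correct path wraps the blend in extra pow() operations per sample, while the naive path is a single lerp on the stored values.

```python
# Per-sample cost comparison, as a sketch. Gamma 2.2 approximation.

def lerp_naive(a, b, t):
    # Blend the stored (gamma-encoded) values directly: 1 mul, 2 add/sub.
    return a + (b - a) * t

def lerp_linear_light(a, b, t, g=2.2):
    # Decode both endpoints (2 pow), blend, re-encode the result (1 pow).
    la, lb = a ** g, b ** g
    return (la + (lb - la) * t) ** (1 / g)

print(lerp_naive(0.0, 1.0, 0.5))         # 0.5
print(lerp_linear_light(0.0, 1.0, 0.5))  # ~0.73, the correct blend
```

Three transcendental ops per sample adds up fast when you're filtering every texel, so skipping the conversion was a tempting shortcut for fixed-function hardware.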