this post was submitted on 16 Feb 2024
265 points (100.0% liked)
Technology
Hundreds in this case, but millions in the long term.
I can see why Air Canada wanted to fight it: accepting liability sets a precedent that they should also accept liability for similar cases in the future.
And they SHOULD accept liability, so I'm glad Air Canada lost and were forced to!
The solution would be easy: just stop using an LLM chatbot.
But I suspect they don't want to, because someone sold them on how good and cheap and human-resource-free it was, and now they think they're too invested.
Plus just the general sentiment that you're not businessing right if you don't something something AI.
My blood boils at the very thought of people choosing to use something buzzwordy like blockchain or "AI", likely with no competent person advising them to, and then trying to wash their hands of responsibility when it misfires.
That's as if a car crash caused by drunk driving were blamed on the air, because "having fun is not a crime".
Only with computing do these people unironically think that nobody should be responsible, because everybody they respect is as clueless as they are, so "nobody knows how it works, it's a frontier, see".
Yeah, great point. Or for similar cases in the past, even.