this post was submitted on 02 Aug 2024
301 points (98.7% liked)
Technology
Yes, you probably are. Please don't forget that the currently available technology is constantly improving, and that we actually don't see many good examples of self-driving cars yet - the most prominent displays are from Tesla, and they arguably build the worst cars we've seen since Ford came up with the assembly line.
The technology used in autonomous vehicles, e.g. sensors, has been used in safety-critical applications for decades in other contexts, and a machine is capable of completely different reaction times. Also, if autonomous vehicles cooperate in traffic, sticking to their programmed behavior, observing traffic rules etc., you will get less reckless driving, with traffic flow becoming more deterministic. These benefits will only increase once self-driving cars no longer have to share the road with human drivers.
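The reaction-time point is easy to quantify with back-of-the-envelope math. Here's a minimal sketch; the speed, reaction times, and deceleration below are illustrative assumptions (a commonly cited ~1.5 s for a human, an assumed ~0.1 s sensor-to-brake latency for a machine), not measured figures:

```python
# Stopping distance = distance covered during the reaction time
# plus braking distance v^2 / (2a). All inputs are illustrative
# assumptions, not measured data.

def stopping_distance(speed_ms: float, reaction_s: float,
                      decel_ms2: float = 7.0) -> float:
    """Reaction distance + braking distance, in metres."""
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

v = 50 / 3.6  # 50 km/h converted to m/s

human = stopping_distance(v, reaction_s=1.5)    # assumed human reaction time
machine = stopping_distance(v, reaction_s=0.1)  # assumed machine latency

print(f"human:   {human:.1f} m")
print(f"machine: {machine:.1f} m")
print(f"saved:   {human - machine:.1f} m")
```

Under these assumptions, at 50 km/h the machine stops roughly 19 m shorter - the braking physics are identical, the entire difference comes from reaction time alone.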
I would always trust a well-engineered, self-driving car more than one driven by a human.
Disclaimer: I used to work on these things in a research lab. Also, we're not quite there yet, so please have a little patience.
What about things on the road that are not automated? There will be situations where a machine’s ethics might override a human driver’s ethics. It would be good for us to be able to override the system and know how to safely drive in emergencies.
It's not about everything being automated. We also have to differentiate between early incarnations of autonomous vehicles and the desired, final state.
A manual override will of course be part of early models for the foreseeable future, but the overall goal is for the machine to make better decisions than a human could.
I don't have any quarrel with life-or-death decisions being made by a machine, provided they are made according to the morals and ethics of the people who designed it, and based on the objective data available to the system at the time - which is often better than what would be available to a human in the same situation.
It's the interpretation of said data that is still not fully there yet, and we humans will have to come to terms with the fact that a machine might indeed kill another human being in a situation where acting any different might cause more harm.
I don't subscribe to the notion that a machine's decision is always worth less than the decision of another entity participating in the same situation, just because it so happens that the second entity happens to be a human being.
Not having control of a vehicle in a life or death situation is terrifying to me. I probably trust my driving more than most, and trust my decisions over those decided by a corporation beholden to rich investors.
I’m worried about the growing pains before we get to the ideal state, and that would have to be full autonomy of everything on the road so nothing that enters the space can collide with another, or if they do, it’s not dangerous.
But then guess what? People will be able to pay for the fast lane. Or a faster rate of speed. You make a whole economy out of trying to get to work, trying to go to a wedding, trying to go anywhere. I don’t trust it, but I get it.
I think you're falling into a bit of a trap here: perfect is the enemy of good. Not everything has to be automated, instead of growing pains, there can also be gains.
Remember, we are currently aiming to get these vehicles on the road, alongside regular drivers. They use sensors and computer vision to read street signs, detect people etc., all with the reaction speed of a machine. What if the in-between product is simply a better driver with faster reaction times? That is the current goal, really - no one wants to automate everything, simply because that wouldn't be feasible anytime soon.
Yes, again, we're not there yet and these things are far from perfect. But let's first just aim to get them good enough, and then maybe just a little better than your average driver.
As for your proposed business model: we have capable drivers now, so why don't these business models exist already? Why is there no fast lane that allows me to pay to get to my destination faster? What would driverless technology introduce that would suddenly enable these schemes?
You’ve misunderstood me and we’re getting off topic. The main point is that “good” is where the eggs are cracked to get to a “great” omelette.
We have toll roads today. You pay for faster travel. Automating vehicles makes it much easier to control your vehicle externally, and exposes a lot more variables that can be controlled.
Are you terrified of riding on trains or flying in planes? You don't have control of the vehicle in those cases either and those are both considered far FAR safer than you driving a car.
Not having control of the plane in a life or death situation is terrifying. Train not so much.
Driving a car can’t be compared to those two. So much more traffic and people/objects in the way. Only comparable to something like a bus in a designated bus lane.