The human does it out of self-preservation, but the car doesn’t need any instinct to preserve itself.
By getting in the car, the passengers should be aware of the risks, and that if there is an accident the car will protect pedestrians over the occupants. The pedestrians had no choice, but the passengers have the choice of not getting in the vehicle.
I feel like car manufacturers are going to favour protecting the passengers as a safety feature, and then governments will eventually legislate the other way after a series of high-profile deaths of child pedestrians.
You’re probably over-estimating the likelihood of a scenario where a self-driving car needs to make such a decision. Also take into account that if a self-driving car is a significantly better driver than a human, then it’s by definition going to be much safer for pedestrians as well, even if it’s programmed to prioritize the passengers.
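A rough back-of-the-envelope with made-up numbers (not real crash statistics) shows why the overall crash rate dominates the rare-dilemma question:

```python
# Made-up illustrative rates, not real statistics.
human_ped_deaths_per_bn_miles = 20.0
av_relative_crash_rate = 0.1  # assume the self-driving car crashes a tenth as often

# Even if the AV resolved every rare dilemma against the pedestrian,
# pedestrian deaths are bounded by how often the AV crashes at all:
av_ped_deaths_per_bn_miles = human_ped_deaths_per_bn_miles * av_relative_crash_rate

print(av_ped_deaths_per_bn_miles)  # 2.0 — far fewer pedestrian deaths overall
```

The specific numbers are invented; the point is only that a large reduction in crashes swamps the effect of how the car resolves a tiny fraction of them.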
Who would buy a car that will sacrifice the passengers in the event of an unavoidable accident? If it’s a significantly better driver than a human would be, then it’s safer for pedestrians as well.
Yes. As it should be. I’ll buy the car that chooses to mow down a sidewalk full of pregnant babies instead of mildly inconveniencing myself or my passengers. Why the hell would you even consider any other alternative?
It’s not really an issue. 99.9% of the time the passengers will already be safe and the pedestrian is the one at risk. The only time I see this being an issue is if the car is already out of control, but at that point there’s little anyone can do.
I mean, what’s the situation where a car can’t brake but has enough control that it HAS to kill a pedestrian in order to save the passengers?
Tesla on their autopilot at night. All the time, basically. There were a number of motorcycle deaths where a Tesla just mowed them down. The reason? The motorcycles had two tail lights side by side instead of one big light. The Tesla thought it was a car far away and just ran through people.
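If that account is right, the geometry error is easy to sketch. A minimal small-angle pinhole model with made-up lamp spacings (this is an illustration, not Tesla’s actual perception pipeline): the same pair of lights read as a wide-set car instead of a narrow-set motorcycle comes out several times farther away than it really is.

```python
def estimated_distance(angular_sep_rad, assumed_lamp_spacing_m):
    # Small-angle pinhole model: distance ≈ lamp spacing / angular separation.
    return assumed_lamp_spacing_m / angular_sep_rad

# A motorcycle with two lamps ~0.3 m apart, actually 20 m ahead,
# subtends an angular separation of 0.3 / 20 = 0.015 rad.
angular_sep = 0.3 / 20

# Classified correctly (motorcycle lamp spacing ~0.3 m):
print(estimated_distance(angular_sep, 0.3))   # ≈ 20 m

# Misclassified as a car (tail lights ~1.5 m apart):
print(estimated_distance(angular_sep, 1.5))   # ≈ 100 m — looks safely far away
```

Same pixels, wrong assumed vehicle width, five-fold distance overestimate.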
That’s a problem with the software. The passengers in the car were never at risk and the car could have stopped at any time, the issue was that the car didn’t know what was happening. This situation wouldn’t have engaged the autopilot in the way we are discussing.
As an aside, if what you said is true, people at Tesla should be in jail. WTF
This is also the company that promises to prioritise the vehicle occupants over pedestrians.
Linky
I mean, that’s exactly what the driver would do. I’m not sure why this is controversial.
🤔
I’d consider a 5yr old a baby https://en.m.wikipedia.org/wiki/Lina_Medina
Tesla washes their hands of any wrongdoing with terms of use where the owner agrees that they’re responsible, blah blah blah.
Here’s a related video.