Comments
-
@eleven, the only one at fault for the deaths would be the company. If their programming were sufficient, the car would be able to avoid the accident. Let's give an example: the car is going down a street with vehicles parked along both sides of the road. A kid jumps out from behind a parked car into the car's path. The car instantly calculates that it can stop in 15 feet, but the child is only 9 feet away. The impact force at 9 feet is also calculated to be enough to kill the child. What should the car do? A. Brake and hit the child. B. Don't apply the brakes; make a sharp right turn, hit the vehicle beside you, and stop in 8 feet. C. Do nothing.
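For what it's worth, here is a minimal Python sketch of the decision this implies: compute the stopping distance, see that braking alone fails, then pick the option with the lowest assumed harm. Every number and harm score below is a hypothetical illustration, not how any real car is programmed.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_harm: float  # assumed score: 0.0 = no injury, 1.0 = likely fatality

def stopping_distance_ft(speed_fps: float, decel_fps2: float) -> float:
    """Kinematic stopping distance: d = v^2 / (2 * a)."""
    return speed_fps ** 2 / (2 * decel_fps2)

# Hypothetical numbers that roughly reproduce the 15 ft figure above:
# ~30 mph (44 ft/s) with ~2g emergency braking (64 ft/s^2).
speed_fps = 44.0
decel_fps2 = 64.0
child_distance_ft = 9.0
print(stopping_distance_ft(speed_fps, decel_fps2))  # ~15.1 ft > 9 ft: braking alone fails

# Each option gets an assumed harm score; the policy just picks the minimum.
options = [
    Maneuver("A: brake straight, hit the child", expected_harm=1.0),
    Maneuver("B: swerve into the parked car, stop in 8 ft", expected_harm=0.2),
    Maneuver("C: do nothing", expected_harm=1.0),
]
best = min(options, key=lambda m: m.expected_harm)
print(best.name)  # -> option B, but only under these made-up harm scores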
-
@Cpzombie, I'm not arguing about the best course of action; I'm merely stating that one is an accident and the other is not. You can extend this example to a car and a bus full of kids. One hydroplanes, so an accident is sure to happen; the hydroplaning vehicle (the bus) cannot change course, so the car must act. It can either run into a barrier, which will flip the car but spare the bus, or it can hit the bus, saving the driver's life since the impact will be less, but the bus will flip and take severe damage. There is no right or wrong answer. The only defensible rule may be to save the most lives possible.
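If "save the most lives possible" were the rule, the comparison might look like this sketch, where the fatality probabilities and occupant counts are made-up assumptions purely for illustration:

```python
def expected_deaths(outcomes: list[tuple[float, int]]) -> float:
    """Sum of (assumed fatality probability x people at risk) per affected group."""
    return sum(p * n for p, n in outcomes)

# Option 1: car hits the barrier and flips; bus untouched. Driver alone at risk.
hit_barrier = expected_deaths([(0.3, 1)])

# Option 2: car hits the bus; driver safer, but the bus flips with 30 kids aboard.
hit_bus = expected_deaths([(0.05, 1), (0.1, 30)])

best = min([("hit the barrier", hit_barrier), ("hit the bus", hit_bus)],
           key=lambda t: t[1])
print(best)  # -> ('hit the barrier', 0.3) under these assumed numbers
```

Under these hypothetical numbers the rule sacrifices the car's occupant, which is exactly the tension the next comment raises.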
-
This is actually a really interesting moral question. Of course, most people would say that if a self-driving vehicle's options were to crash into a truck and kill its occupant or to swerve and hit two children playing by the road, then the car should sacrifice the person sleeping in the back seat. But then people wouldn't buy them. If manufacturers sold them with a choice of programming, people would choose the one that saves their own life and kills the kids, so manufacturers will make the vehicle prioritise protecting its occupants.
Would self-driving BMWs use turn signals?