Respond in a maximum of 200 words. Deadline: 3 hours.

Category: Engineering

The central human-factors and ethical issue with autonomous cars is who is at fault when an autonomous car kills someone: the owner (who may not be driving) or the manufacturer? A common case used in this discussion, likely rare but still possible, runs as follows. Imagine a driverless car with the owner in the front seat, not driving. The car's sensors detect a possible hazard ahead, and the system calculates the probabilities of different outcomes. One option is to crash head-on, killing the passenger (owner) and only the passenger. The other is to swerve to avoid the hazard directly ahead; this means skidding off the road and killing five pedestrians standing at the roadside, but it saves the passenger.

What should the car be programmed to do? Should it save the greatest number of lives, or save the passenger regardless of the numbers? If it is programmed to kill the fewest people, would you buy a car that, in this situation, is essentially programmed to kill you? And if it is programmed to kill as many as needed to save its passengers, could the manufacturer be exposed to a manslaughter lawsuit from the families of the people at the roadside?
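To make the contrast concrete, here is a minimal sketch (in Python, with entirely hypothetical names and numbers, not any real vendor's software) of the two policies the question contrasts: minimizing total expected deaths versus protecting the occupant regardless of the numbers.

```python
# Hypothetical sketch only: the Outcome type, choose_action function, and
# the fatality estimates below are illustrative assumptions, not a real API.
from dataclasses import dataclass
from typing import List

@dataclass
class Outcome:
    action: str                      # e.g. "stay_course" or "swerve"
    expected_occupant_deaths: float
    expected_bystander_deaths: float

def choose_action(outcomes: List[Outcome], protect_occupant_first: bool) -> Outcome:
    """Pick an outcome under one of two simplified policies.

    protect_occupant_first=True  -> minimize occupant deaths, then total deaths.
    protect_occupant_first=False -> minimize total expected deaths (utilitarian).
    """
    if protect_occupant_first:
        key = lambda o: (o.expected_occupant_deaths,
                         o.expected_occupant_deaths + o.expected_bystander_deaths)
    else:
        key = lambda o: o.expected_occupant_deaths + o.expected_bystander_deaths
    return min(outcomes, key=key)

# The scenario from the question: a head-on crash kills the owner,
# swerving kills five pedestrians.
scenario = [
    Outcome("stay_course", expected_occupant_deaths=1.0, expected_bystander_deaths=0.0),
    Outcome("swerve",      expected_occupant_deaths=0.0, expected_bystander_deaths=5.0),
]

print(choose_action(scenario, protect_occupant_first=False).action)  # stay_course
print(choose_action(scenario, protect_occupant_first=True).action)   # swerve
```

The point of the sketch is that a single design choice flips the decision, which is exactly where the liability and marketability questions above arise.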
