Tuesday, June 4, 2024

June 3: Self-Driving Cars

 

Self-Driving Car Dilemma 

Programming self-driving cars to make ethical decisions is a difficult task.

How will self-driving cars choose how to resolve conflicts on the road? Well, the decision is actually up to the programmers. Today we discussed what a self-driving car should do when large debris falls off a truck directly ahead and there is not enough time to brake.
In this scenario, the car has three options: A) hit the debris and sacrifice the driver, B) swerve left into an SUV, or C) swerve right into a motorcycle. If I were making the decision, I would swerve left into the SUV, since it is a vehicle with a high safety rating; this would likely protect the automated car's driver without causing too much harm to the SUV's occupants. (I sketch a toy version of this kind of decision rule below.)
This option strikes a balance between saving the automated car's driver and preventing casualties. However, solutions like this carry the complication of deliberately deciding to hit another vehicle on the road, which makes the decision unethical. Self-driving vehicles would only be safe if every vehicle on the road were self-driving, so I believe cars must first adopt driver-assistance features as a transition toward full automation. These assists include automatic braking, lane-departure warnings, cameras, and cruise control, and they can lead to further developments.
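To make concrete how much of this problem lands on the programmers, here is a minimal, hypothetical sketch (in Python) of a harm-minimizing decision rule. The option names, harm scores, and equal weighting of driver and bystander harm are all invented for illustration; a real system would depend on far richer sensing and modeling.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    harm_to_driver: float  # estimated harm to the automated car's driver, 0 to 1
    harm_to_others: float  # estimated harm to other road users, 0 to 1

def choose_action(options):
    # Pick the option with the lowest total estimated harm. Weighting
    # driver harm and bystander harm equally is itself an ethical choice
    # that a programmer has to make.
    return min(options, key=lambda o: o.harm_to_driver + o.harm_to_others)

scenario = [
    Option("A) hit the debris", harm_to_driver=0.9, harm_to_others=0.0),
    Option("B) swerve left into the SUV", harm_to_driver=0.2, harm_to_others=0.3),
    Option("C) swerve right into the motorcycle", harm_to_driver=0.1, harm_to_others=0.9),
]

print(choose_action(scenario).name)  # prints "B) swerve left into the SUV"

Notice that the sketch picks option B, matching my reasoning above, but only because of the numbers I plugged in; change the harm estimates or the weighting and the "right" answer changes with them, which is exactly the ethical burden the programmer carries.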
This scenario has led me to rethink other ethical aspects of technological advancements, such as how these cars might unintentionally endanger others.

