
Autonomous Vehicle Ethics

  • charlie0676
  • Jul 5
  • 2 min read


With the rapid advances in autonomous vehicle technology, future generations may well look back on driving as a primitive activity. By 2050, children may find it hard to believe that their parents had to steer heavy vehicles through traffic at high speeds. In this envisioned future, technology replaces the human driver entirely, allowing cities to integrate ultra-efficient autonomous transportation systems. Driving may become a hobby for enthusiasts, much like horse riding: a recreational pastime rather than a necessity.

 

Self-driving vehicles will unlock a new future for society. Elderly people who once struggled to drive because of declining physical and mental faculties could regain reliable transportation, giving them new opportunities and the freedom to live independently. Teenage drivers will have a safer experience as they learn. Commutes will also free up valuable time for work instead of demanding a driver's full attention.

 

Thinking ethically about self-driving cars quickly raises the infamous trolley problem. Consider a situation in which a self-driving car must choose whom to protect in an unavoidable crash: should it protect its passenger, or the pedestrian? Most humans tend to prioritize their own interests above others'. But for a third party such as the car, to which all human lives are presumably equal, whom should it strive to protect?

 

A utilitarian would argue that self-driving cars must be programmed to minimize suffering in any situation. The car should act to evade the pedestrians, even if doing so harms the passenger; because this minimizes total suffering, it is morally justified from a utilitarian perspective.
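To make that decision rule concrete, here is a minimal, purely illustrative sketch in Python. It is not real autonomous-vehicle software; the Maneuver structure, the harm scores, and the example numbers are all made up for illustration. The utilitarian rule simply totals the expected harm of each option and picks the smallest, no matter who bears it.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    harm_to_passenger: float    # hypothetical expected-harm score on a 0-to-1 scale
    harm_to_pedestrians: float  # hypothetical expected-harm score on a 0-to-1 scale

def choose_utilitarian(options: list[Maneuver]) -> Maneuver:
    # Pick the maneuver with the lowest total expected harm, weighing every life equally.
    return min(options, key=lambda m: m.harm_to_passenger + m.harm_to_pedestrians)

options = [
    Maneuver("stay in lane", harm_to_passenger=0.1, harm_to_pedestrians=0.9),
    Maneuver("swerve into barrier", harm_to_passenger=0.6, harm_to_pedestrians=0.0),
]
print(choose_utilitarian(options).name)  # -> "swerve into barrier" (lower total harm)
```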

 

Contrary to the utilitarian argument, a deontologist would assert that since self-driving cars are capable of rational thought and moral reasoning, they should be held to the same moral duties to society as humans. These duties include objective principles, such as "killing is wrong," that hold without exception. A car programmed to prioritize deontology should adhere to these principles: it should avoid killing the passenger at all costs, because killing is wrong even when it would produce a more desirable outcome.
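Again as a purely illustrative sketch, with made-up names and numbers rather than any real system, a deontological rule can be expressed as a hard constraint rather than a trade-off: any maneuver that deliberately kills is ruled out before outcomes are even compared, and the car chooses only among what remains.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    deliberately_kills: bool  # hypothetical flag: does this maneuver actively sacrifice someone?
    total_harm: float         # hypothetical total expected harm, lower is better

def choose_deontological(options: list[Maneuver]) -> Maneuver:
    # Treat "do not kill" as an inviolable duty: maneuvers that deliberately kill are
    # excluded outright, even when they would lead to less total harm overall.
    permissible = [m for m in options if not m.deliberately_kills]
    return min(permissible, key=lambda m: m.total_harm)

options = [
    Maneuver("swerve into barrier", deliberately_kills=True, total_harm=0.6),
    Maneuver("brake in lane", deliberately_kills=False, total_harm=0.9),
]
print(choose_deontological(options).name)  # -> "brake in lane": the duty overrides the better outcome
```

The two sketches differ only in where the moral weight sits: the utilitarian rule compares outcomes, while the deontological rule filters actions first.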

 

As of 2025, autonomous vehicles are not programmed to make moral decisions; they are programmed simply to minimize vehicular damage based on physics. But as these systems are entrusted with increasingly weighty ethical situations, we must confront the question of whose rules they should follow. Navigating the road with technology is about more than sensors and data processing; it is about establishing the moral principles that guide our technological vision.

 
 
 
