A legal expert, an engineer, and an ethicist walk into a classroom. The first thing they see is an Unavoidable Death Warning sign that reads, “Your driverless car is about to cause a death – please select a target.” The options are: 1) Someone on the left, 2) You, and 3) Someone on the right. This is how a compelling panel discussion on Ethics and Self-Driving Cars, hosted on April 10, 2019, began.

Ron Hedges, Ben Firner, and James J. Brown Jr are the legal expert, the engineer, and the ethicist, respectively. Ron Hedges is a leading expert on digital/cyber forensics in legal cases and is cited in almost every judicial training book on the subject. Ben Firner, who received his PhD from Rutgers, currently works at NVIDIA on a team developing self-driving cars. James J. Brown Jr is an associate professor of English at Rutgers with expertise in the ethics of software design. So why put these three together on a panel about Ethics and Self-Driving Cars? Simple: to gather their different perspectives on the subject at hand.

The first question to the panel was, “Should there be a standard software platform for all autonomous vehicle companies?” Ben, the engineer, spoke in terms of technology and how each company wants to differentiate itself from its competitors. He explained that standards in this industry would limit growth and innovation, making it harder for companies to find a competitive advantage. Ron offered a different perspective: as people, we like predictability, and so do companies. In a world of autonomous cars without standards, the biggest concern would be negligence; the law is most concerned with consumer safety and whether the software anticipates certain situations. James, from yet another point of view, spoke about the ethical decisions encoded in the software itself. He noted that machine learning can discriminate: when creating these autonomous systems, engineers should think about how the systems will affect someone’s life and understand that they have an ethical responsibility to society.

If there was one thing they all agreed on, it was the importance of due diligence. Ron reiterated that the law is most concerned with people’s safety and whether the company or the vehicle’s owner failed to act reasonably in the case of an accident. As an engineer working on autonomous cars, Ben described his due diligence as making sure the software he creates is safe for the “driver” and for those around the vehicle; his intent is to enhance life rather than cause harm. James argued that individuals need to consider the world from multiple perspectives, because the world you experience is not the same one everyone else lives in. He believes we rarely predict how a technology might cause harm, since people tend to think only about the good it will do. If scientists and engineers took the time to think through worst-case scenarios when designing products, that practice would itself be due diligence, and developers would be taking ethical responsibility for society.

Is there a way the Unavoidable Death Warning sign could have been avoided? The panel offered no clear-cut answer, and there may never be one. The discussion brought to light that everyone bears responsibility for what they create and use. The most important question to think about is: as a scientist or an engineer in the MBS program, what are you doing to make sure you put effort into your due diligence to the world?

Author(s): Diana M. Gonzalez
Published on: 05/16/2019