Ethical Questions in Code Writing

When considering self-driving cars and trucks, one question most people have concerns what are often referred to as “ethical algorithms”: the choices the car would be programmed to make in an ethical dilemma. If 5 pedestrians suddenly appeared in front of the vehicle and there was no time to brake, would the vehicle run into the pedestrians or veer into a neighboring concrete wall, putting its own driver/rider at risk? If an accident were imminent, would the vehicle’s computer system choose to run over 6 adults or 3 children? A recent survey showed that while most people believe a car should sacrifice its passengers for the greater good, those same people, given the option, would buy a car that protected its riders at any cost.

There is a strong argument that these grand ethical dilemmas are actually irrelevant to the development of code for autonomous cars. For one thing, such situations almost never arise in the real world. How often have you, as a driver, had to choose between running into 6 adults or 3 children? Further, the advanced camera systems on driverless cars will see things far earlier and with more precision than a human would. If there are pedestrians ahead, a self-driving car will recognize and respond to them so far in advance that, theoretically, it will have more than enough time to brake and avoid an incident.

At least one car company has publicly stated that it will program its cars to always protect the passengers in the vehicle. In 2016, Mercedes-Benz said that in a situation where either pedestrians or the driver would be put at risk, its Level 4 and Level 5 vehicles would be programmed to save the driver. The company also pointed out, though, that the goal is to improve the technology to the point that such scenarios almost never arise.
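As a rough illustration of what such a policy choice might look like once it is written into software, here is a minimal, purely hypothetical sketch of an occupant-priority rule. The maneuver names and risk estimates are invented for illustration; this is not any manufacturer’s actual code.

```python
# Hypothetical sketch only -- not any manufacturer's actual software.
# It shows how an occupant-priority policy could be expressed as an explicit rule.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_risk: float    # estimated probability of occupant injury (0 to 1)
    pedestrian_risk: float  # estimated probability of pedestrian injury (0 to 1)

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Occupant-priority policy: pick the maneuver with the lowest risk to the
    people inside the vehicle; break ties by lowest risk to pedestrians."""
    return min(options, key=lambda m: (m.occupant_risk, m.pedestrian_risk))

if __name__ == "__main__":
    options = [
        Maneuver("brake in lane", occupant_risk=0.05, pedestrian_risk=0.60),
        Maneuver("swerve into wall", occupant_risk=0.70, pedestrian_risk=0.01),
    ]
    print(choose_maneuver(options).name)  # -> "brake in lane"
```

The point of the sketch is simply that someone has to decide which risk comes first in that comparison, and that decision ends up as a single, very ordinary-looking line of code.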

For an article going into more detail about why “ethical algorithms” are unimportant for self-driving cars, see http://techcrunch.com/2015/11/23/the-myth-of-autonomous-vehicles-new-craze-ethical-algorithms/?ncid=rss.

Ethical questions, however, can also arise on a smaller scale. Will the driverless car swerve for deer? For dogs? For squirrels? Will the autonomous car attempt to avoid all animals, some animals, or no animals? Animals typically come into view very quickly, so the advanced camera system would likely give the car little advantage over a human driver in spotting one early. Would the car swerve if no other cars were around but hit the animal if another car were nearby? Who would make the decision about how such code should be written?

Or consider road debris. If an autonomous car is traveling behind a trailer and a piece of furniture falls onto the road, does the car swerve, thereby putting other vehicles at risk, or hit the furniture, putting its own driver/rider at risk?
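One way to make the swerve-or-brake questions in the last two paragraphs concrete is to imagine them reduced to an explicit rule. The sketch below is purely hypothetical; the obstacle categories and the inputs (whether the adjacent lane is clear, whether the car can stop in time) are invented for illustration and are not drawn from any real system.

```python
# Purely hypothetical sketch -- categories and logic are invented for illustration,
# not taken from any real autonomous-driving system.

SWERVE_FOR = {"deer", "dog", "furniture"}  # obstacles the car will try to steer around;
                                           # smaller animals are braked for instead

def respond_to_obstacle(obstacle: str, adjacent_lane_clear: bool,
                        can_stop_in_time: bool) -> str:
    """Decide how to react to an obstacle that suddenly appears in the lane."""
    if can_stop_in_time:
        return "brake"                    # safest outcome for everyone involved
    if obstacle in SWERVE_FOR and adjacent_lane_clear:
        return "swerve"                   # avoid the obstacle without endangering other traffic
    return "brake and accept impact"      # protect occupants and surrounding vehicles

print(respond_to_obstacle("deer", adjacent_lane_clear=True, can_stop_in_time=False))   # swerve
print(respond_to_obstacle("deer", adjacent_lane_clear=False, can_stop_in_time=False))  # brake and accept impact
```

Whether a real manufacturer would treat a deer and a piece of furniture the same way, and who gets to draw those category lines, is exactly the kind of question raised above.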

Ethical dilemmas can involve non-safety issues as well. For instance, if one car is attempting to pull out of a parking lot onto a street with heavy traffic, will other autonomous cars stop to let that vehicle in? Will the cars make such a decision automatically, or will driverless cars take their own driver/rider’s schedule and timeliness into consideration before forgoing the right of way?
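Even a courtesy decision like this would ultimately live in code somewhere. A purely hypothetical sketch, assuming the car had some estimate of its rider’s schedule slack and of how long the other vehicle has been waiting:

```python
# Hypothetical courtesy rule -- the inputs and thresholds are invented for illustration.

def should_yield(rider_minutes_of_slack: float, cars_already_waiting: int) -> bool:
    """Yield to a car pulling out if the rider can spare the time,
    or if the waiting queue has grown long."""
    return rider_minutes_of_slack >= 2.0 or cars_already_waiting >= 3

print(should_yield(rider_minutes_of_slack=5.0, cars_already_waiting=1))  # True
print(should_yield(rider_minutes_of_slack=0.5, cars_already_waiting=1))  # False
```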

The potential hypotheticals here are endless, and it’s impossible to plan for them all. As autonomous cars become more prevalent, though, code writers and car manufacturers will be forced to make tough decisions about how a driverless car would react in a variety of situations. The impact on travel will be significant, particularly in the transitional period when the road is occupied by Level 1, Level 2, Level 3, and Level 4 vehicles.

Car manufacturers will also have to consider how these ethical programming choices will be communicated to consumers. At least some in the field argue that consumers should be told how their car is programmed to respond to an imminent crash, but it is unclear how such a disclosure could be made. Would the information be contained in an owner's manual? Would there be a user agreement that a consumer had to accept before operating an autonomous car? Would there be any way to ensure the consumer actually read the agreement, rather than scrolling through and clicking "agree"?
