Self-Driving Cars Navigate Complex Ethical Obstacles, Spark Personal Injury Concerns

The concept of self-driving vehicles is nothing new. In fact, they’ve existed commercially in some form or another since 2003, when automatic parallel parking was first unveiled. Today, Cadillac’s recently released ‘Super Cruise’ allows drivers to go hands-free on select highway routes, and drivers of certain BMWs can let their car take over in the monotony of traffic.

Then of course there’s the much-discussed Tesla Autopilot. None of these technologies, however, actually enables fully autonomous driving yet. For now, full autonomy remains elusive, plagued by a host of moral complications and widespread public fear.

In a May 2018 survey performed by the American Automobile Association (AAA), 73% of U.S. drivers reported that “they would be too afraid to ride in a fully self-driving vehicle,” a 10-percentage-point increase over end-of-2017 findings. Additionally, two-thirds of adults indicated that sharing the road with autonomous vehicles as a pedestrian or cyclist would make them feel less safe. Generationally, the increase in fear is even more stark: “the percentage of Millennial drivers too afraid to ride in a fully self-driving vehicle has jumped from 49 percent to 64 percent since late 2017, representing the largest increase of any generation surveyed.”

Considering that the youngest generations have long been the quickest to adopt new tech (a truth borne out again when automated vehicle technologies were first unveiled), this decline in trust says a lot. The erosion of confidence among the American public, and especially Millennials, comes on the heels of several high-profile accidents involving Teslas, each of which occurred while Autopilot was engaged.

Adding to the public fear are ethical and legal concerns, made all the more complicated by industry-wide difficulty in establishing what constitutes autonomy in vehicles. In 2013, the National Highway Traffic Safety Administration (NHTSA) released a policy that initially defined automation on a scale of 0-4, but those guidelines were superseded in 2016 by a new set of levels developed by SAE International (formerly the Society of Automotive Engineers). The SAE levels, now standard both in the US and internationally, are as follows:


• At SAE Level 0, the human driver does everything;

• At SAE Level 1, an automated system on the vehicle can sometimes assist the human driver in conducting some parts of the driving task;

• At SAE Level 2, an automated system on the vehicle can actually conduct some parts of the driving task, while the human continues to monitor the driving environment and performs the rest of the driving task;

• At SAE Level 3, an automated system can both actually conduct some parts of the driving task and monitor the driving environment in some instances, but the human driver must be ready to take back control when the automated system requests;

• At SAE Level 4, an automated system can conduct the driving task and monitor the driving environment, and the human need not take back control, but the automated system can operate only in certain environments and under certain conditions; and

• At SAE Level 5, the automated system can perform all driving tasks, under all conditions that a human driver could perform them.


Current technology stands at SAE Level 3: semi-autonomy. It’s on the path to SAE Level 5 that legal and ethical issues become truly (and perhaps impassably) thorny. Everyone from engineers to philosophers to lawmakers worldwide is being forced to grapple with a thought experiment once purely hypothetical: the Trolley Problem.

Consider the following:

You see a runaway trolley moving toward five tied-up (or otherwise incapacitated) people lying on the tracks. You are standing next to a lever that controls a switch. If you pull the lever, the trolley will be redirected onto a side track, and the five people on the main track will be saved. However, there is a single person lying on the side track. You have two options:

A. Do nothing and allow the trolley to kill the five people on the main track, or

B. Pull the lever, diverting the trolley onto the side track where it will kill one person.

Which is the more ethical option?

The real-life implications for autonomous vehicles are abundant: in a crash scenario where death is unavoidable, and a computer must make a decision about who lives and who dies (including, potentially, you, the vehicle’s occupant), how do we as a society make that value judgment?

How do you quantify the value of two different lives without prejudice, and then have an engineer program this choice free of their own unconscious bias?

What is the morality of an auto company knowingly programming the future deaths of their customers, even if it’s in the name of public safety?

Should a driver’s life be given precedence simply because they happen to be the one inside the car?

Moreover, in an ensuing lawsuit, who exactly is at fault? The driver? The automaker? The engineer?

We do not yet have laws on the books to guide us through these ethical and legal quandaries, though Congress is trying, and moral guidelines are still being developed worldwide. Even though autonomous vehicles are expected to make the roads safer for everyone overall, the concept of a machine deciding one’s death remains too large a stumbling block for most people. For some, it’s the lack of control. For others, the sheer uncertainty of it all.

Either way, personal injury litigation surrounding autonomous vehicle incidents is all but certain to rise, with the road to legal precedent as yet unpaved.

Have a serious injury and need legal advice?
Contact Howard Blau.

Ventura County’s Favorite Law Office

Check Out These References for Further Reading:

“Self-Driving Cars Will Likely Increase Product Liability Litigation.” Lexology. Retrieved 25 January 2019.

“American Trust in Autonomous Vehicles Slips.” AAA. Retrieved 25 January 2019.

“A Study on Driverless-Car Ethics Offers a Troubling Look into Our Values.” The New Yorker. Retrieved 25 January 2019.

“The Definitive Guide to the Levels of Automation for Driverless Cars.” WonderHowTo. Retrieved 25 January 2019.
