
The Safety, Moral and Legal Implications of Self-Driving Vehicles


Self-driving vehicles are here, driving alongside us. They are no longer a futuristic idea. What this means from the safety, moral, and legal perspectives is an ever-growing topic of discussion, both locally and nationally. I am often asked, “You’re an attorney; what do you think?” Well, here you go.

Safety and Morality Issues

A self-driving car recently malfunctioned in Kaufman, Texas, driving into a guardrail at 80 miles per hour. That incident shed local light on self-driving car issues. The current and future applications of artificial intelligence (including self-driving vehicles) were also the topic of a recent WIRED Magazine interview with Joi Ito (entrepreneur and MIT Media Lab director) and President Obama.

The WIRED article focuses heavily on the issue of morality. As licensed drivers, we make moral decisions each time we get behind the wheel. Sometimes those decisions are literally a matter of life or death. The decision to use the fully automated function in a self-driving vehicle is also a decision to turn over moral decision-making to a machine. Should self-driving vehicles be programmed to minimize the potential death toll? What if that means steering the vehicle into a wall (likely killing the owner and occupant of the vehicle) to save multiple pedestrians who are crossing the street? Moral questions such as this one are the topic of some noteworthy current research:

The results are interesting, if predictable.  In general, people are comfortable with the idea that self-driving vehicles should be programmed to minimize the death toll.

This utilitarian approach is certainly laudable but the participants were willing to go only so far. “[Participants] were not as confident that autonomous vehicles would be programmed that way in reality—and for a good reason: they actually wished others to cruise in utilitarian autonomous vehicles, more than they wanted to buy utilitarian autonomous vehicles themselves,” conclude Bonnefon and co.

And therein lies the paradox. People are in favor of cars that sacrifice the occupant to save other lives—as long as they don’t have to drive one themselves.


See Why Self-Driving Cars Must Be Programmed to Kill

On October 19, 2016, Tesla announced that “all Tesla cars being produced now have full self-driving hardware.” Huge announcement, or is it? Tesla’s announcement goes on to provide key details:

Before activating the features enabled by the new hardware, we will further calibrate the system using millions of miles of real-world driving to ensure significant improvements to safety and convenience. While this is occurring, Teslas with new hardware will temporarily lack certain features currently available on Teslas with first-generation Autopilot hardware, including some standard safety features such as automatic emergency braking, collision warning, lane holding and active cruise control. As these features are robustly validated we will enable them over the air, together with a rapidly expanding set of entirely new features….

See All Tesla Cars Being Produced Now Have Full Self-Driving Hardware

In my opinion, Tesla appears to be taking a step back. At a minimum, the company is acknowledging (1) potential safety issues with its first-generation Autopilot hardware, and (2) the need for extensive safety testing before rolling out the next generation. Call me crazy, but I do not think the announcement will do anything to change the general public’s apparent reluctance to trust self-driving technology.

How Will Self-Driving Cars Affect Safety?

There are many ways self-driving cars could potentially impact the safety of other motorists as well as pedestrians and cyclists. While these vehicles offer an interesting glimpse into the future, they also come with serious risks and vulnerabilities that the technology has not yet addressed.

Some of these risks are external, like the reactions of other motorists. Others are internal, like defective vehicle parts. Understanding these risks is essential for anyone sharing the road with a self-driving car.

  • Loss of driver experience. Drivers learn specific skills through years of experience that self-driving cars cannot currently replicate. For example, a self-driving vehicle might lack the practical knowledge of when traffic is likely to come to a stop due to construction.
  • Other drivers. Driving often requires sudden adjustments based on the actions of other motorists and pedestrians. For example, a self-driving car might be prepared to avoid a rear-end collision with a parked car, but it may struggle with erratic drivers, defective brakes, or jaywalking pedestrians.
  • Changing road conditions. Changing road conditions can also cause problems with self-driving cars. These vehicles cannot yet adapt to sudden changes in weather, like heavy snow or rain. This lack of adjustment increases the chances of an accident.
  • AI training issues. Self-driving cars have to be trained by humans who are capable of making mistakes. That means errors and missteps in the process could impact the safety of a vehicle. For example, there is evidence that some self-driving cars struggle to identify the presence of pedestrians of different ethnicities.
  • Hacking. Self-driving cars may have vulnerabilities that hackers could exploit. This can happen in multiple ways: for example, a driver’s personal information could be exposed when using a self-driving car. More seriously, someone could take control of a self-driving vehicle and intentionally cause an accident.

We have all heard and pondered the question of “If a tree falls in the forest and no one is around to hear it, does it make a sound?”  What about the question of “If I am hit by a self-driving vehicle that is operating on auto-pilot, who is at fault?”  Is it the non-driver owner and/or occupant of the vehicle?  Is it the vehicle manufacturer?  Am I just out of luck?

I feel confident that the vehicle manufacturer will be legally responsible for auto-pilot malfunctions. But I also believe that the owner and/or occupant of the self-driving vehicle remains legally responsible, even though he or she was not in control of the vehicle. For your sake and mine, let’s hope I am right.

Why is holding the owner/occupant of a self-driving car legally responsible important? Because absolving the owner/occupant of a self-driving vehicle of liability and limiting your legal rights to claims against the vehicle manufacturer would make justice cost prohibitive. Claims against manufacturers are typically products liability actions, as opposed to general negligence actions. Products liability actions are expensive, time-consuming, and complicated; hence, they are cost prohibitive for many injured people. The last thing our society needs is to further chip away at our already lacking access to justice.

So, how do I arrive at my opinion that the owner/occupant of the self-driving car is also legally responsible, even though he or she was not actually in control at the time of the collision? I ask the following question: “Is using the auto-pilot function in a self-driving vehicle a negligent act?” I say yes.

Don’t get me wrong; we trust our lives to technology in numerous instances: stoplights are automated, computers analyze and report on blood work, commercial planes have auto-pilot functions, and so on. So what makes self-driving vehicles different? I think it comes down to the morality issue. With self-driving vehicles, it is not merely a question of whether they will be dependable and work properly; it is a question of whether they will make proper moral decisions.

Is the general public ready to rely on machines to make moral decisions? I say no. Set aside your personal thoughts and beliefs about the technology and ask yourself, “Would a Texan of ordinary prudence (i.e., a man or woman of average intelligence) entrust his or her safety, and the safety of fellow Texans on the road, to a fully automated driving process?” Again, I say no, and the above-cited research backs me up. If I am right (and I think that I am), then the use of auto-pilot is a negligent act. Doing that which a person of ordinary prudence would not have done under the same or similar circumstances is negligence under Texas law. More importantly, if I am right, our society will avoid a further decrease in access to justice.

To fully explore your legal rights, you should contact a Fort Worth personal injury attorney.

– Tennessee W. Walker

Contact Patterson Law Group today for a free case evaluation! Our car accident attorneys are here to help!
