“Look Mom No Hands!” But Lots of Rules for AI Cars in California


Andrew L. Shapiro | Shareholder

April 5, 2017


You may have heard the news: An Uber Technologies Inc. autonomous vehicle was involved in an accident in Arizona. A human driver in a Honda CR-V turning left at a yellow light hit the self-driving Volvo as it was crossing the intersection. Though the Volvo flipped onto its side after hitting a pole, no serious injuries were reported.

Accident investigators found the human driver to be at fault. The artificially intelligent (AI) vehicle was traveling just under the speed limit, and the employee “behind the wheel” stated he saw the Honda driver but did not have time to react.

Self-Driving Cars in California

An accident eyewitness thought whoever was driving the Volvo was “trying to beat the light” and hit the gas hard, which at first makes it sound as though the accident could have been avoided.

But an Uber spokesperson said their self-driving vehicles are programmed to maintain current speeds when approaching yellow lights, and to pass through if there is enough time to cross the intersection. If true, this only goes to show how eyewitnesses can sometimes perceive situations incorrectly. It also raises the question:

As more AI vehicles take over the road, will human drivers have to adjust habits accordingly?

Here in Los Angeles, we’ve all seen two, three or even four cars turning left through an intersection after a light has changed to red – simply because those drivers had no opportunity to do so when their traffic lights were green or yellow. But if self-driving cars are programmed to proceed through the intersection without allowing human drivers that belated, technically illegal opportunity to turn, we could see an increase in traffic accidents, injuries and fatalities for a while.

Who will be held liable? The drivers who force their way through the intersection when they don’t have the legal right of way? Or must other drivers yield the right of way to vehicles already in the intersection? We are all obligated to follow the letter of the law, even if it seems impractical.

The Silicon Valley State vs. the Motor City State

It’s interesting to note that Arizona is one of the states that impose few restrictions on companies wanting to road test their AI vehicles. Michigan’s robocar rules are also minimal – apparently, anyone (presumably with a valid driver’s license) who wants to drive an AI vehicle in that state may do so. Other states have followed suit to encourage driverless cars, on the theory they’re safer than human-operated vehicles.

California, home to Silicon Valley and many tech giants with skin in the AI game, is surprisingly a bit stricter.

This state requires an Autonomous Vehicle Testing Permit from the Occupational Licensing Branch (form OL 311), per Vehicle Code §38750. Autonomous testers must also submit either a Manufacturer Surety Bond (OL 317) or a Certificate of Self Insurance (OL 319), and certain company structures require submission of Articles of Incorporation, Corporate Minutes, and the identities of key executives. Going to Michigan for testing certainly sounds a lot easier, if you can stand the snow.

California Road Trippin’


About 30 autonomous vehicle tech companies have applied to the CA DMV to test their autonomous vehicles on our streets and highways since 2014. Uber is one of the companies that initially refused to comply with the licensing requirements and shipped its AI vehicles to more robot-friendly states. The Financial Times reports the ride-hail company is now licensed to drive driverless in the Golden State too.

According to the DMV, about 25 autonomous vehicle accident reports have been filed since the state first began permitting AI vehicle testing. Most of these accidents seem to have been caused by human error – usually by humans driving non-autonomous vehicles.

Look at the individual reports and you’ll see that many of them involve vehicles using Google technology – but read them more closely and you’ll notice that most describe rear-end collisions in which the AI vehicle was the one hit from behind.

Self-Driving Accident Liability

As we saw above, most of the AI accidents in California were caused by humans, not technology. And at this point it’s unclear who would be to blame should anyone suffer injuries or a fatality in one of these accidents. Clients engaging injury attorneys may find themselves going after the vehicle manufacturer, the software programmer and the vehicle’s owner.

And of course, passengers in autonomous vehicles who suffer injuries due to the error of human drivers will have access to traditional remedies – going after the other driver and his/her insurance company.

Andrew L. Shapiro is the Chair of our Personal Injury Practice Group.

Disclaimer:
This Blog/Web Site is made available by the lawyer or law firm publisher for educational purposes only, to provide general information and a general understanding of the law, not to provide specific legal advice. By using this blog site you understand there is no attorney client relationship between you and the Blog/Web Site publisher. The Blog/Web Site should not be used as a substitute for obtaining legal advice from a licensed professional attorney in your state.
