San Fernando Valley Los Angeles Attorneys

Entries in autonomous cars (4)

Friday, October 27, 2017

Beep: Your Hamburger is Ready

CalBar Certified Franchise & Distribution Law Specialist

by Barry Kurtz

818-907-3006

 

A study released by PricewaterhouseCoopers predicts that robots may take nearly 40 percent of U.S. jobs – particularly those requiring lower education levels – within the next 15 to 20 years. The projection is based on anticipated technological progress, which leaves a lot of room for error.

Not all jobs can be staffed with technology, though, as Microsoft cofounder Bill Gates illustrates when talking about the fast-rising trend.

He supports a robot tax on businesses that use Artificial Intelligence (AI) to replace human employees. The revenues could be used to train people to fulfill other jobs, like those in teaching or elder-care, where a human component is much more important. It would also help slow the rush to acquire AI.

Gates told Quartz news outlet:

So if you can take the labor that used to do the thing automation replaces, and financially and training-wise and fulfillment-wise have that person go off and do these other things, then you’re net ahead. But you can’t just give up that income tax, because that’s part of how you’ve been funding that level of human workers...

Taxing questions aside, many hospitality franchisors find themselves facing a dilemma: Invest in people, or invest in tech?

Some Franchises Bet Heavily on Tech

Replacing humans with AI is still relatively expensive, and not exactly a guarantee of larger profits at this time. But some franchises are throwing down a lot of chips on AI development. Why?

For one thing, restaurant jobs, particularly those around a hot grill or a spitting fryer, can be dangerous. And it’s hard work that not everyone wants.

Burger bot “Flippy,” developed by Pasadena-based Miso Robotics with franchisor CaliBurger, can cost upwards of $60,000, according to TechCrunch. But it will never report to work tired, hungover, or in any state that may result in workers’ comp claims. It also won’t demand higher wages.

Domino’s delivers, as we all know. But food delivery can also be dangerous, which is one reason the chain invests heavily in tech.

The pizza franchisor developed Domino’s Robotic Unit (DRU), a four-wheeled autonomous delivery vehicle. Though DRU will be able to navigate the best route to a customer’s door, it has not yet been activated to serve, as the robot vehicle is still being tested. The DRU Drone is also in development, and may lift off for first deliveries in New Zealand in February.

Shake Shack, a restaurant chain (not a franchise), opened a high-traffic location in New York’s East Village. The restaurant features kiosk-only ordering and cashless transactions. When the food is ready, the diner can opt to receive a text message to pick up a tray, or go with the old-school shout-out.

Shake Shack still employs humans to provide customer assistance, cook, and expedite the food at this restaurant location.

Hoteliers are also embracing a new wave of AI services. AURA is the first robot in Asia to provide room service – delivering towels, water and more. And a Marriott in Belgium deployed a bot that addresses vacationers and business travelers in 19 different languages.

What Should Franchisors Know About AI?


As always, when considering upgraded tech for a chain, consider the needs of individual locations. Right now there are more questions than answers:

  • Will a loyal, profitable franchisee balk at having to deploy AI?

  • Will the leased or owned property of the franchisee accommodate robot workers?

  • Will the cost of AI deliver substantial savings compared to employee-related expenses, and more profit for the operators?

  • Will the public accept AI as a substitute for smiling wait staff and cashiers?

  • Will ordering kiosks and robotic workers be a turn-off (in locales where unemployment rates might be very high), or a draw (like the hotel mentioned above attracting millennial clientele)?

But also remember that some AI can be used behind the scenes: in certain food prep tasks, to analyze marketing trends, to fold hotel linens, and so on.

Whatever the industry, don’t be afraid to embrace the tech. Just be sure to look both ways before crossing fully into the digital realm.

Barry Kurtz is the Chair of Lewitt Hackman's Franchise & Distribution Practice Group.

Disclaimer:
This Blog/Web Site is made available by the lawyer or law firm publisher for educational purposes only, to provide general information and a general understanding of the law, not to provide specific legal advice. By using this blog site you understand there is no attorney client relationship between you and the Blog/Web Site publisher. The Blog/Web Site should not be used as a substitute for obtaining legal advice from a licensed professional attorney in your state.

Wednesday, April 5, 2017

"Look Mom No Hands!" But Lots of Rules for AI Cars in California

Personal Injury Attorney

by Andrew L. Shapiro

(818) 907-3230

 

You may have heard the news: An Uber Technologies Inc. autonomous vehicle was involved in an accident in Arizona. A human driver in a Honda CR-V turning left at a yellow light hit the self-driving Volvo as it was crossing the intersection. Though the Volvo flipped onto its side after hitting a pole, no serious injuries were reported.

Accident investigators found the human driver to be at fault. The artificially intelligent (AI) vehicle was traveling just under the speed limit, and the employee “behind the wheel” stated he saw the Honda driver but did not have time to react.

Self-Driving Cars in California

An accident eyewitness thought whoever was driving the Volvo was “trying to beat the light” and hit the gas hard, which initially sounds as though the accident could have been avoided.

But an Uber spokesperson said the company’s self-driving vehicles are programmed to maintain current speeds when approaching yellow lights, and to pass through if there is enough time to cross the intersection. If true, this only goes to show how eyewitnesses can sometimes perceive situations incorrectly. It also raises the question:

As more AI vehicles take over the road, will human drivers have to adjust habits accordingly?
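Purely as an illustration of the kind of yellow-light rule Uber describes – maintain current speed and proceed only if the intersection can be cleared before the light turns red – here is a minimal sketch in Python. Every name and number below is a hypothetical example, not Uber’s actual software:

    # Illustrative sketch only: a simplified yellow-light decision rule.
    # All parameter names and thresholds are hypothetical examples.
    def can_clear_intersection(distance_to_stop_line_m: float,
                               intersection_length_m: float,
                               speed_mps: float,
                               yellow_remaining_s: float) -> bool:
        """Return True if the vehicle can clear the intersection before the light turns red."""
        if speed_mps <= 0:
            return False  # a stopped or reversing car should simply wait
        time_to_clear_s = (distance_to_stop_line_m + intersection_length_m) / speed_mps
        return time_to_clear_s <= yellow_remaining_s

    # Example: 20 m from a 15 m-long intersection at 15 m/s with 3 s of yellow left.
    # (20 + 15) / 15 is about 2.3 s, so this sketch says maintain speed and pass through.
    print(can_clear_intersection(20.0, 15.0, 15.0, 3.0))  # True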

Here in Los Angeles, we’ve all seen two, three or even four cars turning left through an intersection after a light has changed to red – simply because those drivers had no opportunity to do so when their traffic lights were green or yellow. But if self-driving cars are programmed to go through without allowing the humans the belated though illegal opportunity to turn, we could see an increase in traffic accidents, injuries and fatalities for a while.

Who will be held liable? The drivers that force their way through the intersection when they don’t have the legal right of way? Or must other drivers yield the right of way to vehicles already in the intersection? We are all obligated to follow the letter of the law, even if it seems impractical.

The Silicon Valley State vs. the Motor City State

It’s interesting to note that Arizona is one of the states that impose few restrictions on companies wanting to road test their AI vehicles. Michigan’s robocar rules are also minimal – apparently, anyone (presumably with a valid driver’s license) who wants to drive an AI vehicle in that state may do so. Other states have followed suit to encourage driverless cars, on the theory they’re safer than human-operated vehicles.

California, home to Silicon Valley and many tech giants with skin in the AI game, is surprisingly a bit stricter. 

This state requires Autonomous Vehicle Testing Permits from the Occupational Licensing Branch (form OL 311), per Vehicle Code §38750. Autonomous testers must also submit either a Manufacturer Surety Bond (OL 317), or a Certificate of Self Insurance (OL 319), and certain company structures require submission of Articles of Incorporation, Corporate Minutes, and identities of key executives. Going to Michigan for testing certainly sounds a lot easier, if you can stand the snow.

California Road Trippin’

About 30 autonomous vehicle tech companies have applied to the CA DMV to test their autonomous vehicles on our streets and highways since 2014. Uber is one of the companies that initially refused to comply with the licensing requirements, and shipped their AI vehicles to more robot-friendly states. The Financial Times reports the ride-hail company is now licensed to drive driverless in the Golden State too.

According to the DMV, about 25 autonomous vehicle accident reports have been filed since the state first began permitting AI vehicle testing. Most of these accidents seem to be caused by human error – mostly humans driving non-autonomous vehicles.

Many of the individual reports involve vehicles using Google technology – but read them closely and you’ll notice that most describe rear-end collisions in which the AI vehicle was the one hit from behind.

Self-Driving Accident Liability

As we saw above, most of the AI accidents in California were caused by humans, not technology. And it is unclear at this point who would be to blame should anyone suffer injuries or a fatality in these accidents. Clients engaging injury attorneys may find themselves pursuing the vehicle manufacturer, the software programmer, as well as the vehicle’s owner.

And of course, passengers in autonomous vehicles who suffer injuries due to the error of human drivers will have access to traditional remedies – going after the other driver and his or her insurance company.

 

Andrew L. Shapiro is the Chair of our Personal Injury Practice Group. 


Wednesday, January 25, 2017

“Way Mo” Autonomous Cars Coming Fast

Personal Injury Attorney

by Andrew L. Shapiro

(818) 907-3230

 

It’s a race to beat all races: several car makers, including Ford, General Motors, Volvo, BMW and Tesla, are promising fully autonomous vehicles within the next five years. Not far behind the pack, Google recently renamed its own autonomous contender “Waymo,” according to Bloomberg Tech. And Nissan plans to have commercial driverless vehicles up and running on its home turf of Japan by 2020.

Recently, the U.S. Department of Transportation (DOT) announced that 10 sites across the nation were chosen for the testing of artificially intelligent (AI) vehicles. Two of these are right here in California – at the Contra Costa Transportation Authority in Walnut Creek, and the San Diego Association of Governments.

So what does all of this mean for driver safety?

It’s still too early to tell. For now, the general public can rest assured that the DOT’s designated test sites are meant to be just that – test sites. Automakers running cars at these locations are expected to share test results and tech knowledge per a Federal Automated Vehicles Policy released in September.

Transportation Secretary Anthony Foxx explained:

This group will openly share best practices for the safe conduct of testing and operations as they are developed, enabling the participants and the general public to learn at a faster rate and accelerating the pace of safe deployment. 

Autonomous Vehicle Safety

Last May a driver was killed in Florida when his autonomously driven Tesla crashed into a truck. The National Highway Traffic Safety Administration (NHTSA), though, recently concluded that Tesla was not at fault. NHTSA said the vehicle’s driver-assist software performed “as designed,” and that drivers should still pay attention when behind the wheel of AI vehicles.

The feds investigated other AI crashes and found that many of these were because of “driver behavior factors”.

Overall, even the insurance industry is gearing up for safer highways and streets. Once autonomous vehicles really get rolling, the industry expects a decline in driver insurance premiums, though it also expects an increase in product liability revenue. The reason?

Drivers involved in crashes will sue each other less and less, and will instead turn to car makers to satisfy injury claims.

 

Andrew L. Shapiro is the Chair of our Personal Injury Practice Group.


Thursday, February 11, 2016

Mile Marker: Google Beginning to Clear Legal Hurdles for Self-Driving Cars (but many more ahead)

Personal Injury Attorney

by Andrew L. Shapiro

(818) 907-3230

 


 

Can self-driving vehicles (SDVs) use the carpool lane? That may be a legal question for another day, as SDVs still have barricades to overcome before moving to the fast lane and becoming available commercially for consumers. 

But the Federal Government has opened the door by giving serious consideration to, and taking the first steps toward, expanding the previously unambiguous term “driver” to include the Artificial Intelligence (AI) operating Google’s Self-Driving System, alongside human motor vehicle operators.

The National Highway Traffic Safety Administration (NHTSA, or Administration) just responded to a November letter from Chris Urmson, Director of Google’s Self-Driving Car Project, requesting an interpretation of federal driving laws as they pertain to SDVs. Google hopes to make SDVs commercially available to the public by 2020, and that means making them compliant with the Federal Motor Vehicle Safety Standards (FMVSS).

The NHTSA did grant some interpretations, but remains hesitant on others, citing a need for further Google SDV testing and further legislation in the future.

One reason for the Administration’s caution is that most FMVSS were written for vehicles of the past century, when all cars had human drivers sitting in the front left of a vehicle, with access to, and control of, steering and braking systems. The laws weren’t written to accommodate AI drivers or cars.

But the NHTSA did manage to favorably interpret some of Google’s questions re SDVs. They include:

1. Self-Driving Systems are drivers, in terms of certain operations like using turn signals and hazard signals; making transmission shifts; idling; parking and accelerating. 

2. Driver seatbelts may not be necessary, since the NHTSA interprets “driver” as the SDS in the case of Google’s proposed vehicle design: 

“It is possible that the provision as specifically written is not necessary for safety as applied to Google’s vehicle design, but Google has not demonstrated that in its present interpretation request.  FMVSS No. 208 would need amendment to clarify how a vehicle design like Google’s might comply with it.  One safety concern is that a human occupant could sit in any DSP [designated seating position], and that therefore the non-wearing of a seat belt by any occupant could create a safety risk.”  

3. Questions re Electronic Stability Control Systems (ESC) need further review, because the FMVSS mandate specific performance requirements for ESC systems. Though the NHTSA agrees that a Google-designed SDS is in fact the actual driver, the Administration feels a need to determine in future: 

“…how to evaluate the SDS control of the steering inputs, and whether and how to modify test conditions and procedures to address more clearly the situation of a vehicle with steering controlled entirely by an AI driver, with no mechanism for the vehicle occupants to affect the steering.”

A Futuristic Legislative Highway for Driverless Cars


Self-driving test vehicles already cruise the streets. However, California – where Google operates most of its prototypes – has DMV rules requiring licensed human drivers to be inside, with access to the steering wheel and the brake and gas pedals. They must monitor the SDV’s operations at all times, and be ready to take control should the technology fail or other emergencies arise.

Google’s November letter to NHTSA expresses concern that human error will make its completely autonomous SDVs unsafe should humans try to override the artificial intelligence.

The Administration acknowledged this concern, paving the way for years of further extensive testing and monitoring of vehicles completely controlled by AI with no human override capabilities.

But the proverbial genie is out of the bottle – Federal Rules will have to be modified to keep up with and include SDVs. There is much work to be done on both sides of this issue before we have SDVs in our own garages – a prospect that is both scary and exciting for some of us.

 

Andrew L. Shapiro is the Chair of our Personal Injury Practice Group.

 


LEWITT HACKMAN | 16633 Ventura Boulevard, Eleventh Floor, Encino, California 91436-1865 | 818.990.2120