
Google Admits Fault in Self-Driving Car Accident

If you’ve been keeping up with the latest news regarding self-driving vehicles, you may have already heard that one of Google’s self-driving cars was recently involved in a minor accident. Various news reports said that Google’s self-driving Lexus SUV was traveling at approximately 2 miles per hour when it hit the side of a bus traveling at 15 miles per hour. Fortunately, no one was hurt in the collision, which occurred in Mountain View, California; however, it marked the first time that Google has admitted fault, at least in part, for an accident involving one of its vehicles.

What Happened?

According to details that were included in an article in The Verge, Google explained the accident in its monthly self-driving report. Google noted that the incident occurred at the intersection of El Camino Real and Castro Street, and stated that El Camino has “quite a few right-hand lanes wide enough to allow two lanes of traffic.”

The accident happened when the Google car was preparing to make a right-hand turn from the right curb lane. The vehicle sensed the presence of sandbags that were located close to a storm drain, so it stopped and waited while a number of automobiles passed before moving closer to the center of the lane. It was at this point that the car hit the bus that was passing by.

The Google car had, in fact, detected the presence of the bus, but it assumed that the bus would yield to the car. The individual who was in the vehicle at the time also assumed the bus would stop or slow down to let the car merge.

However, such assumptions are what led to the accident, which further demonstrates how humans (and computers) can misunderstand a situation on the road. Google stated that “in this case, we clearly bear some responsibility because if our car hadn’t moved, there wouldn’t have been a collision.”

Keeping Drivers Safe

In a recent Reuters article, the National Highway Traffic Safety Administration (NHTSA) advised Google that the artificial intelligence system operating the self-driving vehicle could be deemed “a driver” for purposes of federal law — which is a huge step towards getting approval for self-driving automobiles. However, Google told the NHTSA that the true danger will be when a human attempts to take control, despite the vehicle’s auto safety features.

Specifically, Google told the NHTSA that “providing human occupants of the vehicle with mechanisms to control things like steering, acceleration, braking…could be detrimental to safety because the human occupants could attempt to override the self-driving system’s decisions.”

We’ve discussed the regulation of self-driving vehicles in a prior post, and the NHTSA notes that rewriting the federal regulations governing the placement, design and operation of vehicle controls could take many months, or even years. In the meantime, counsel for the NHTSA said that Google could seek exemptions from certain regulations. Ultimately, the transportation secretary has said that once the vehicles are deemed safe, the agency may seek legal authority to permit the deployment of self-driving automobiles on a larger scale.

If you or someone you love has been injured in an automobile collision involving any type of vehicle, do not hesitate to contact a Denver accident attorney at Levine Law to discuss your legal options and rights.


