UPDATED 20:32 EST / FEBRUARY 29 2016

NEWS

It had to happen eventually: Google self-driving car partially blamed for bus accident

In a case of “it had to happen eventually,” a Google self-driving car has been partially blamed for an accident with a bus on Valentine’s Day, February 14.

In a statutory filing (pdf) submitted to the California Department of Motor Vehicles, Google Automotive, a branch of Google’s parent company Alphabet Inc., said the accident occurred while the self-driving Lexus was traveling eastbound in autonomous mode on El Camino Real in Mountain View, in the far right-hand lane, approaching the Castro St. intersection.

“As the Google AV approached the intersection, it signaled its intent to make a right turn on red onto Castro St,” the report reads. “The Google AV then moved to the right-hand side of the lane to pass traffic in the same lane that was stopped at the intersection and proceeding straight. However, the Google AV had to come to a stop and go around sandbags positioned around a storm drain that were [sic] blocking its path. When the light turned green, traffic in the lane continued past the Google AV. After a few cars had passed, the Google AV began to proceed back into the center of the lane to pass the sandbags. A public transit bus was approaching from behind.”

“The Google AV test driver saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google AV to continue,” the report goes on. “Approximately three seconds later, as the Google AV was reentering the center of the lane it made contact with the side of the bus. The Google AV was operating in autonomous mode and traveling at less than 2 mph, and the bus was traveling at about 15 mph at the time of contact.

“The Google AV sustained body damage to the left front fender, the left front wheel and one of its driver’s-side sensors. There were no injuries reported at the scene.”

Human test

This isn’t the first time a Google self-driving car has been involved in an accident (the cars are actually so conservative they’ve been pulled over for driving too slowly), but it is the first case in which the vehicle was at least partially at fault. And while it’s clear that it had to happen eventually, it demonstrates that when push comes to shove, the car failed the human test.

Google can build as many artificial intelligence algorithms into its vehicles as it likes, but humans don’t always behave logically. In this case the vehicle presumed that the much larger bus would yield, which is logical given that the Google car was in front of it. Many human drivers, however, know that larger vehicles such as trucks and buses often don’t yield: they have the advantage of size and dominance, and they are far harder to maneuver in quick or emergency situations. A minimal sketch of how a planner might account for that follows.
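In planning terms, the lesson is about calibrating how much to trust another road user to yield before committing to a maneuver. The sketch below is purely illustrative: the class, function names, thresholds and penalties are hypothetical assumptions for this article, not anything drawn from Google’s actual software.

```python
# Hypothetical sketch: deciding whether to merge back into a lane in
# front of another vehicle. All names, thresholds and the yield model
# are illustrative assumptions, not Google's actual system.

from dataclasses import dataclass


@dataclass
class Vehicle:
    kind: str         # e.g. "car", "bus", "truck"
    speed_mph: float  # current speed of the other vehicle


def yield_probability(other: Vehicle) -> float:
    """Estimate how likely the other driver is to slow and let us in.

    A purely "logical" model might assume anyone behind us will yield.
    Penalizing large, hard-to-maneuver vehicles encodes the lesson of
    the bus incident.
    """
    base = 0.9  # most drivers give way to a vehicle ahead of them
    if other.kind in ("bus", "truck"):
        base -= 0.4  # large vehicles often hold their line
    if other.speed_mph > 10:
        base -= 0.2  # a moving vehicle is less likely to stop in time
    return max(base, 0.0)


def safe_to_merge(other: Vehicle, threshold: float = 0.7) -> bool:
    # Merge only if we are sufficiently confident the other vehicle yields.
    return yield_probability(other) >= threshold


# The bus from the Feb. 14 report: roughly 15 mph, approaching from behind.
bus = Vehicle(kind="bus", speed_mph=15.0)
print(safe_to_merge(bus))  # False: don't pull out in front of the bus
```

Under this toy model, an ordinary stopped car behind the AV would clear the threshold, but a moving bus would not, which is exactly the distinction the real vehicle failed to make.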

The one good thing to come out of the accident is that Google now has a concrete example of the fallacy of presuming logical behavior from human drivers, and it may be able to program its self-driving vehicles to be even safer than they currently are.

Image credit: markdoliner/Flickr/CC by 2.0
