UPDATED 16:40 EDT / OCTOBER 11 2016

Tesla denies German claims that autopilot is a ‘traffic hazard’

The German Federal Highway Research Institute (BASt) has released a new report that is highly critical of the autopilot feature in the Tesla Motors Model S, calling semi-autonomous mode a “considerable traffic hazard” and suggesting that the feature may not be safe enough for use on public roads.

Tesla has now denied BASt’s claims, and co-founder and CEO Elon Musk said in a tweet that the report is “not actually based on science. Objective data shows Autopilot is safer than manually driven cars.”

One of the biggest hurdles facing the budding self-driving car movement is proving that the technology is safe and reliable. Although Tesla’s autopilot is not fully autonomous, any accident involving the feature is sure to draw close scrutiny from the industry.

That was the case earlier this year when a Tesla vehicle was involved in a fatal accident while autopilot was engaged. At the time, Tesla noted that despite the accident, autopilot is still safer than manual driving.

“This is the first known fatality in just over 130 million miles where Autopilot was activated,” Tesla said in response to an investigation by the National Highway Traffic Safety Administration. “Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”

The company added that drivers are made fully aware that autopilot is still in beta testing, and that they must acknowledge a confirmation prompt every time they activate the feature. Tesla also tells drivers that they should keep their hands on the wheel at all times so they are ready to take over manually if necessary.

The German government conducted the study in response to a recent accident in which a Tesla vehicle collided with a bus on the Autobahn. Tesla subsequently released a statement denying that the car’s autopilot had any connection to the accident. Whether or not that is the case, BASt appears to believe the feature is unsafe, and its report cites several examples in which the autopilot on the Tesla Model S failed to perform adequately.

For example, the report found that the vehicle sometimes ignored yellow lane markings on the road, and that the autopilot sometimes followed the movements of the car ahead of it even when another car was alongside.

In a statement, Tesla reiterated that autopilot is still a new feature and cautioned drivers to use it with care. “We have always been clear with our customers that Autopilot is a driver’s assistance system that requires the driver to pay attention at all times,” Tesla said, according to Reuters.

German news publication Der Spiegel noted that BASt’s report was meant for internal use only, and the agency has not made any official recommendation regarding Tesla’s autopilot.

Image courtesy of Tesla Motors
