Can You Trust Self-Driving Cars?

While autonomous cars were the topic of the day at the just-concluded 2016 Consumer Electronics Show, you may be forgiven for thinking they will be launched for public use next year. That is highly unlikely. Even as production vehicles promise increasingly reliable semi-autonomous capabilities, a fully autonomous car is still a long way from being a reality. The California “disengagement” reports are evidence enough that fully autonomous cars remain years away. Given these damning numbers, we have to ask: can you trust self-driving cars?

Under California regulations, every manufacturer that tests autonomous vehicles must report each time a human takes control of the vehicle back from the computer, an event referred to as a “disengagement.” All disengagement reports filed by vehicle manufacturers are made public. The original report from Autoblog noted that a total of 2,894 disengagements had been reported by seven manufacturers: Bosch, Volkswagen, Delphi Automotive Systems, Tesla, Google, Mercedes-Benz, and Nissan. Google, with many more test vehicles on the road, led in disengagements at 341, while Tesla reported none. The discrepancy in disengagements makes you ask, can you trust self-driving cars?

While this may seem damning and a cause for concern, it is not quite as bad as it seems. A quick analysis of the DMV chart of Google’s tests shows marked improvement over the past year. In November 2014, Google reported 21 disengagements over 15,386.6 miles driven on public roads. In November 2015, after driving 43,275.9 miles, nearly three times the distance covered in November 2014, Google reported only 16 disengagements. The trend shows steady improvement for Google over time.
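To put those figures on a common footing, it helps to normalize disengagements by miles driven. The short Python sketch below works only from the two monthly figures quoted above; the per-1,000-mile rate is our own illustrative normalization, not a metric the DMV or Google publishes in this form.

```python
# Illustrative calculation using the two monthly figures quoted above.
# The per-1,000-mile rate is our own normalization, not an official DMV metric.

reports = {
    "Nov 2014": {"miles": 15386.6, "disengagements": 21},
    "Nov 2015": {"miles": 43275.9, "disengagements": 16},
}

for month, data in reports.items():
    # Disengagements per 1,000 miles driven
    rate = data["disengagements"] / data["miles"] * 1000
    print(f"{month}: {rate:.2f} disengagements per 1,000 miles")

# Approximate output:
# Nov 2014: 1.36 disengagements per 1,000 miles
# Nov 2015: 0.37 disengagements per 1,000 miles
# That is roughly a 3-4x improvement in a single year.
```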

Furthermore, a disengagement does not necessarily mean that the car was in a dangerous situation or that it crashed. A human intervening unnecessarily or a faulty sensor can also trigger one. According to Google’s report on the subject, the disengagement thresholds are deliberately conservative in order to promote safety, and all of this information and testing is part of learning how best to build self-driving cars. Which raises the question: if the manufacturers are still learning, can you trust self-driving cars?

Google said: “Disengagements are critical in testing as they are the only way that engineers can identify areas of weakness and improve the capabilities of the vehicles’ software. The objective of the company is not to keep disengagements at a minimum but rather to collect as much information as possible while operating safely, which will facilitate the improvement of the autonomous driving system. As such, the thresholds of disengagement are conservative and are individually recorded for continued research.”

It is evident that the future of transportation will be influenced to no small extent by semi-autonomous vehicles. Nevertheless, it has to be acknowledged that the variables an autonomous car must handle before it can adequately replace a human driver are considerable. As vehicle manufacturers are coming to realize, the closer we get to actualizing the dream, the harder it seems to get. Taken in this context, maybe the best news in the reports is the significant progress Google has made in a single year. But it still does not answer the question: can you trust self-driving cars?


Sourced from: popularmechanics.com

Featured Image Source: Photo by Christine und Hagen Graf / CC BY

Posted on February 15, 2023