Thursday, 25 February 2016 01:32

Consumer Watchdog: “Why Self-Driving Cars Fail” — Google Counters: “Danger is Drivers Taking Control”

Consumer Watchdog has strongly endorsed the California Department of Motor Vehicles’ draft regulations for autonomous vehicles, which require that self-driving robot cars have a licensed driver behind the wheel capable of taking control, as well as a working steering wheel, gas pedal and brake. Federal regulators, however, appear to be much less conservative on the policy.


“The need to require a licensed driver behind the wheel is obvious after a review of the results from seven companies that have been testing since September 2014: Robot cars are still not capable of dealing reliably with real-life situations,” said John M. Simpson, Consumer Watchdog’s Privacy Project director.

Under the autonomous car testing regulations, the companies were required to file “disengagement reports” explaining when a human test driver had to take control. The reports show that the cars are not always capable of “seeing” pedestrians and cyclists, traffic lights, low-hanging branches, or the proximity of parked cars, suggesting too great a risk of serious accidents involving pedestrians and other cars. The cars also are not capable of reacting to reckless behavior quickly enough to avoid the consequences, the reports showed.

“The companies’ own evidence makes clear that a human driver able to take control of the vehicle is necessary to ensure the safety of both robot vehicles and other Californians on the road,” Simpson said at a DMV workshop on autonomous vehicle regulations.

“Google, which logged 424,331 ‘self-driving’ miles over the 15-month reporting period, said a human driver had to take over 341 times, an average of 22.7 times a month. The robot car technology failed 272 times and ceded control to the human driver; the driver felt compelled to intervene and take control 69 times,” Simpson said.

Other testing companies, driving far fewer autonomous miles than Google, also reported substantial numbers of disengagements. Bosch reported 625 disengagements over 934.4 miles driven; Mercedes-Benz, 1,031 over 1,738 miles; Volkswagen, 260 over 10,416 miles; Nissan, 106 over 1,485 miles; and Delphi, 405 over 16,662 miles. Tesla claimed it had none, but did not say how many miles it drove.
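
As a rough illustration, the short Python sketch below (not part of the reports themselves; the per-1,000-mile rate is a normalization added here for comparison) reproduces the arithmetic behind the figures quoted above, using only the numbers cited in this article:

```python
# Figures quoted in this article, from the companies' DMV disengagement
# reports: (disengagements, autonomous miles driven).
reports = {
    "Google": (341, 424_331),
    "Bosch": (625, 934.4),
    "Mercedes-Benz": (1_031, 1_738),
    "Volkswagen": (260, 10_416),
    "Nissan": (106, 1_485),
    "Delphi": (405, 16_662),
}

REPORTING_MONTHS = 15  # the 15-month reporting period cited above

# Google's monthly average: 341 / 15 is roughly 22.7, matching the quote above.
print(f"Google monthly average: {reports['Google'][0] / REPORTING_MONTHS:.1f}")

# Disengagements per 1,000 autonomous miles, so the companies' very different
# mileages can be compared on a common footing.
for company, (disengagements, miles) in reports.items():
    print(f"{company:13s} {disengagements / miles * 1_000:10.2f} per 1,000 miles")
```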

“It’s important to understand that these ‘disengagements’ were prompted by real situations that drivers routinely encounter on the road. Among the reasons cited by Bosch were failures to detect traffic lights and heavy pedestrian traffic.

“Google’s robot technology quit 13 times because it couldn’t handle the weather conditions. Twenty-three times the driver took control because of reckless behavior by another driver, cyclist or pedestrian. The report said the robot car technology disengaged for a ‘perception discrepancy’ 119 times. Google defines such a discrepancy as occurring when the car’s sensors don’t correctly perceive an object, for instance overhanging branches. The robot technology was disengaged 55 times for ‘an unwanted maneuver of the vehicle.’ An example would be coming too close to a parked car. The human took over from Google’s robot car three times because of road construction.”

Google Says Danger is Letting Drivers Take Control

Google told the National Highway Traffic Safety Administration (NHTSA) that the real danger is having auto safety features that could tempt humans to try to take control.

Google “expresses concern that providing human occupants of the vehicle with mechanisms to control things like steering, acceleration, braking... could be detrimental to safety because the human occupants could attempt to override the (self-driving system’s) decisions,” the NHTSA letter stated.
