SAN FRANCISCO, USA – Google on Monday, February 29, said that its self-driving car bore some of the blame in a recent fender-bender after making the kind of assumption a human might have made.
A Lexus car converted into an autonomous vehicle by the Internet company had a low-speed collision with a transit bus on February 14 in what marked the first time that Google laid some of the responsibility for a crash on the software brains.
“This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements,” Google said in a February monthly report about the performance of its self-driving cars.
“In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision.”
A report filed with the California Department of Motor Vehicles contained details of the incident.
It said the Lexus was in Mountain View, where Google and its parent company Alphabet are based, with a test driver in position and capable of taking control, when the autonomous vehicle pulled toward a right-hand curb in anticipation of making a right turn.
The vehicle stopped after detecting sandbags near a storm drain in its path, then waited for a break in traffic to get around the obstruction, the report indicated.
After several vehicles passed, the self-driving car eased back into the center lane, believing an approaching transit bus would stop, Google said. The bus did not stop.
“Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop,” Google said in its monthly report.
“And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time.”
The self-driving car was moving at about two miles per hour when it collided with the side of the bus, which was traveling at about 15 miles per hour, according to the accident report filed by Google.
Google said it reviewed the accident and modified its software to “more deeply understand” that buses and other large vehicles are less likely than other vehicles to yield to the self-driving cars.
But critics of the autonomous cars were not so forgiving.
“This accident is more proof that robot car technology is not ready for auto pilot and a human driver needs to be able to take over when something goes wrong,” Consumer Watchdog privacy project director John Simpson said in a release.
“The police should be called to the site of every robot car crash and all technical data and video associated with the accident must be made public.”
Google has previously disclosed accidents involving its self-driving cars, but maintained that they resulted from the actions of humans and not its technology. – Glenn Chapman, AFP/Rappler.com