Google Called Upon To Make Self-Driving Car Accident Reports Public

Written by Chris Crum
    Google’s self-driving cars are coming under fire again after the Associated Press reported that four of the forty-eight self-driving cars on California’s roads have gotten into accidents since September.

    As has been the case with previously reported accidents, however, none of these was the fault of Google’s cars, at least according to the company. The report says two of the accidents occurred while the cars were driving themselves and the others while humans were behind the wheel, but Google maintains its cars were not at fault in any of them. Three of the vehicles were Lexus SUVs, and the fourth was a test vehicle belonging to parts supplier Delphi Automotive, the report says, adding that a source claims all were minor accidents that took place at speeds of less than 10 mph.

    Consumer Watchdog, a regular critic of many of Google’s endeavors, released a statement on Monday calling on the company to release reports of accidents involving its cars, and to commit to making any such reports public in the future.

    The organization’s privacy director John Simpson wrote a letter to Google CEO Larry Page and executive chairman Eric Schmidt, in which he said, “It is important that the public know what happened. You are testing driverless vehicles on public highways, quite possibly putting other drivers at risk.”

    You can read the full letter here (PDF).

    “Unbelievably Google is planning to offer its robot cars without a steering wheel, brake pedal or accelerator so there would be no way for a person to take control in an emergency,” said Simpson in the statement. “That plan underscores the need for the public to know the full details of all accidents.”

    Google unveiled its “first real build” of its self-driving vehicle prototype in December.

    Simpson’s letter to Page and Schmidt concluded: “Google has engaged in a highly visible public relations campaign extolling the supposed virtues of driverless cars. It is incumbent upon you to be candid about the cars’ failings and shortcomings as well. Your stated mission is ‘to organize the world’s information and make it universally accessible.’ Sadly, in practice, you’ve modified this to be ‘to organize the world’s information and make it universally accessible – except when it is about Google.’ Please treat yourselves as you would treat everyone else. Release DMV driverless car accident reports and details of your driverless car accidents. Make the autonomous technology disengagement reports public as well.”

    I have to say, I haven’t always agreed with Consumer Watchdog’s criticisms of Google, but it makes some pretty fair points on this one. This isn’t the only area where Google is criticized for a lack of transparency, but it’s quite possibly one of the most important areas for Google to be transparent in.

    Chris Urmson, the director of Google’s self-driving car program, wrote a post on Medium about how, after a million miles, Google’s cars haven’t caused a single accident.

    The post is interesting and continues Google’s history of talking about how much safer its cars are than human drivers. It gives various examples of people being stupid drivers, as if anyone needs proof of that. I don’t think anyone is arguing that people aren’t bad at driving.

    The post also talks about the accidents Google’s cars have had. It says:

    Over the 6 years since we started the project, we’ve been involved in 11 minor accidents (light damage, no injuries) during those 1.7 million miles of autonomous and manual driving with our safety drivers behind the wheel, and not once was the self-driving car the cause of the accident.

    Rear-end crashes are the most frequent accidents in America, and often there’s little the driver in front can do to avoid getting hit; we’ve been hit from behind seven times, mainly at traffic lights but also on the freeway. We’ve also been side-swiped a couple of times and hit by a car rolling through a stop sign. And as you might expect, we see more accidents per mile driven on city streets than on freeways; we were hit 8 times in many fewer miles of city driving. All the crazy experiences we’ve had on the road have been really valuable for our project. We have a detailed review process and try to learn something from each incident, even if it hasn’t been our fault.

    Not only are we developing a good understanding of minor accident rates on suburban streets, we’ve also identified patterns of driver behavior (lane-drifting, red-light running) that are leading indicators of significant collisions. Those behaviors don’t ever show up in official statistics, but they create dangerous situations for everyone around them.

    Self-driving cars may very well be much safer than human-driven vehicles. Frankly, I have no doubt about that. I work in a city that made a BBC News top-ten list of cities with the world’s worst traffic. I see terrible driving every day of my life.

    Elon Musk even thinks human driving could eventually be outlawed.

    Still, I don’t think it’s asking too much for Google to be more transparent about these accidents. Why shouldn’t we know more about what happens when these vehicles are involved?

    Image via Google
