Consumer Reports (CR) has tested Tesla’s Autopilot software and it’s not good news for the electric vehicle company.
In the wake of the fatal crash in Spring, Texas, in which it was reported that no one was behind the wheel, CR wanted to see if that scenario was possible. Tesla’s Autopilot software is only supposed to work under certain conditions, one of which is a driver in the driver’s seat.
Unfortunately, Jake Fisher, CR’s senior director of auto testing, was able to easily and repeatedly bypass Tesla’s safeguards. In multiple tests, Fisher engaged Autopilot, put a weighted chain on the steering wheel to simulate the weight of a hand, slid over into the passenger seat, and then accelerated the stopped Tesla using the steering wheel dial.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” Fisher says. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
Fisher’s overall evaluation of Tesla’s Autopilot was equally damning, especially compared to what’s available from its competitors.
“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” Fisher says. “Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.”