Every time a Tesla collides with anything, the first question asked is, "Was the driver using Autopilot?" The controversial driving assist feature has played the role of both scapegoat and cause in crashes involving Tesla vehicles. As it turns out, Tesla sees Autopilot use fall in the wake of each media fiasco, according to a statement by Tesla CEO Elon Musk during the company's first-quarter conference call, as transcribed by Electrek.
“Of course, when there’s negative news in the press, it dips,” stated Musk in the conference call. “This is not good because people are reading things in the press that causes them to use Autopilot less, and then that makes it dangerous for our customers, and that’s not cool.
“One of the common misimpressions is that when there is, say, a serious accident on Autopilot, people—or some of the articles—for some reason think that it’s because the driver thought the car was fully autonomous and it wasn’t, and we somehow misled them into thinking it was fully autonomous. It is the opposite.
“When there is a serious accident, it’s in fact almost always, maybe always, the case that it is an experienced user and the issue is more one of complacency. They get too used to it. That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about Autopilot than they do, like quite a significant understanding of it.”
Musk was also quoted as saying that use of the driving assist is climbing, despite the Tesla Autopilot crashes in the news. Two major Tesla crashes attributed to Autopilot have already occurred this year: one into the back of a fire truck, the other a fatal crash into a highway divider that was confirmed to be Autopilot-related.
Though no crash was involved, one U.K. man lost his license last month for misusing Autopilot after he was spotted lounging in the passenger seat with the assist active. Is humanity mature enough yet to handle semi-autonomous systems like Autopilot? The evidence above suggests not.