Last month, Tesla recalled more than two million cars to implement stricter driver monitoring following an investigation by the National Highway Traffic Safety Administration (NHTSA). The agency determined that Tesla was too lax about keeping drivers engaged, and that this laxity had led to several collisions with stationary emergency vehicles. But Tesla drivers can continue to misuse Autopilot, because cheap aftermarket gadgets can still fool the car into thinking a driver is engaged. Worse still, the way those gadgets work suggests Tesla ships its North American cars with software held to lower safety standards.
Driver monitoring is a crucial safety measure in cars with automated driving tech. It ensures that the responsible party, the driver, doesn’t check out and leave the car to face scenarios it can’t handle. In Teslas, it’s particularly important because the company’s advertising has fostered overconfidence in its autonomous driving tech, overconfidence that has allegedly led to multiple deaths. Tesla has even warned that its flagship driving software, Full Self-Driving Beta, “may do the wrong thing at the worst time.”
Yet Tesla owners have been led to believe that driver monitoring is superfluous, in large part by the company’s 2016 self-driving demo. In a video that employees say was staged, Tesla claimed that “the person in the driver’s seat is only there for legal reasons” as the car navigated to a destination. Many owners took the ad at face value, however, describing Autopilot’s reminders to keep their hands on the wheel as “nagging” and trying to circumvent them. In the past, that was done with everything from water bottles wedged in the steering wheel to weighted rings hung from it. Now, Tesla owners have resorted to fooling the software with aftermarket gadgets.
For at least a year now, various online retailers have sold plug-and-play hardware that partially deactivates the driver monitoring system. One such seller, Teslaunch, which offers a “Nag Elimination Module” for $139, explains how the device works.
Apparently, some Teslas treat volume adjustments made with the steering-wheel controls as proof that the driver’s hands are on the wheel. At random five- to ten-second intervals, the module sends a signal to adjust the volume, tricking the monitoring system into thinking the driver’s hands are in place. The system isn’t totally overridden, since many Teslas also use cameras to monitor where the driver is looking, but the module still strips away a layer of safety redundancy.
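Going by the seller’s description alone, the module’s behavior amounts to a simple timing loop. Here’s a minimal Python sketch of that logic; send_volume_adjust is a hypothetical stand-in, since the actual signal the device injects into the car isn’t public:

    import random
    import time

    def send_volume_adjust():
        # Hypothetical placeholder: the real module presumably injects a
        # steering-wheel volume command on the car's internal network.
        # Nothing about that interface is documented here.
        print("volume nudge sent")

    # Per the seller's description: fire at a random five- to ten-second
    # interval, so the inputs resemble a human occasionally touching the
    # volume controls rather than a scripted, fixed-period signal.
    while True:
        time.sleep(random.uniform(5.0, 10.0))
        send_volume_adjust()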
Curiously, this trick doesn’t work on all Teslas. Reddit posters outside the United States and Canada say their cars don’t register volume adjustments as evidence that the driver’s hands are on the wheel. That would mean Tesla has shipped (and may still be selling) cars with more easily circumvented safety software to customers in North America. Put another way, it’s selling North American customers cars held to a lower safety standard.
Tesla benefits from this because customers who believe Autopilot and Full Self-Driving Beta’s shortcomings are due to legal limits, rather than the technologies’ actual capabilities, will remain loyal. CEO Elon Musk says Tesla’s value is entirely dependent on it being a technological leader in autonomous driving (or at least being seen as one), so it’s in Tesla’s interest to allow this continued misuse. And that’s why the NHTSA had to step in.
But now that the NHTSA is waking up to the problems with Tesla’s so-called self-driving tech, enforcement against this loophole is possible. And if the bubble ever bursts on Tesla’s autonomy charade, one imagines that would be only the first of many dominoes to fall.
Got a tip or question for the author? You can reach them here: james@thedrive.com