Remember back in 2016 when Elon Musk suggested that critics of self-driving cars were killing people? He may yet be right, but not until self-driving cars are both commercially available and demonstrably safer than humans. Between now and then we have a deeper problem, which is that the majority of self-driving media coverage is absolute nonsense. The stench of clickbait around self-driving is so thick, it’s almost impossible to find a story burdened with accurate information. Between Business Insider, the fools, and the shills grasping for the 40% of global automotive media traffic with “Tesla” in the headline, I thought the toilet bowl of self-driving coverage was full.
I was wrong. The latest offender? The New York Times’ absurd Tesla Model 3 review.
What is happening at the New York Times? One would expect the paper of record to have people with basic knowledge of their topic areas, if not expertise. Sadly, when it comes to automotive coverage, there’s no there there. They fired everyone with sector knowledge last year, including The Drive’s very own expert Lawrence Ulrich. That error became glaringly clear with the Times’ October 2017 outrage, “Driverless Cars Made Me Nervous. Then I Tried One”, in which the esteemed David Leonhardt contradicted the headline in the first sentence, calling the Volvo S90 “semi-driverless”, a term that exists nowhere except in the pages of publications too cheap to keep experts on the masthead.
A car is self-driving, or it isn’t.
A car is driverless, or it isn’t.
A car has semi-autonomous features, or it doesn’t.
That the widely recognized Society of Automotive Engineers (SAE) automation levels are vague and need revamping is no excuse; Leonhardt doesn’t even cite them. The headline and article are so misleading as to make the New York Times complicit in the very storm of self-driving disinformation it would seek to clear.
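For reference, here’s a minimal sketch of those SAE J3016 levels as data, paraphrased in my own words rather than SAE’s. Note what’s missing: “semi-driverless” appears nowhere, and Autopilot is a Level 2 system.

```python
# SAE J3016 driving automation levels, paraphrased. Below Level 3, a
# human is driving, full stop; "semi-driverless" is not on the list.
SAE_LEVELS = {
    0: "No automation: warnings and momentary assistance only",
    1: "Driver assistance: steering OR speed support; the human drives",
    2: "Partial automation: steering AND speed support; the human supervises",
    3: "Conditional automation: the system drives in its domain; the human must take over on request",
    4: "High automation: the system drives in its domain; no takeover needed",
    5: "Full automation: the system drives everywhere, in all conditions",
}
```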
Musk might yet be half-right, but he was also half-wrong. It’s not just the critics of self-driving who are killing people; it’s also the supporters. When the Times is foolish enough to conflate “driverless”, “self-driving”, “semi-autonomous” and “semi-driverless”, people who don’t know better might start believing Tesla Autopilot actually is an autonomous system.
Despite a flurry of Twitter criticism, the New York Times didn’t resolve its total lack of automotive expertise; it doubled down with the most irresponsible and inaccurate Tesla Autopilot article yet from a mainstream publication, “With Tesla in a Danger Zone, Can Model 3 Carry It to Safety?”
Let’s deconstruct the offending paragraphs:
What is author James B. Stewart trying to suggest? That “Autopilot” is a feature set that can be transferred from aviation to cars? Tesla Autopilot is a brand, not a technology, and as the 793,000 news stories surfaced by a 0.54-second Google search point out, it isn’t the same thing as an aviation “autopilot” for the ground.
Huh? I searched the Tesla website and couldn’t find any reference to them claiming Autopilot is “an autonomous driving system.” It isn’t. It’s semi-autonomous, at best. That second sentence about full self-driving capabilities adds context, but not the good kind.
At least the author uses “semi-autonomous” this time, but then he makes a factual error. Gripping the wheel doesn’t disable Autopilot and transfer control back to the driver. Neither does laying your hand on it. Torque does. There is no capacitive sensor in the wheel to detect human touch; you have to apply approximately 1 newton-meter of torque to disengage Autopilot.
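To make that concrete, here’s a minimal sketch of torque-based override detection, assuming a hypothetical torque signal in newton-meters; the function name, signature, and threshold constant are mine, not Tesla’s.

```python
OVERRIDE_TORQUE_NM = 1.0  # approximate threshold cited above; assumed value

def driver_override_detected(steering_torque_nm: float,
                             hand_resting_on_wheel: bool) -> bool:
    """Autopilot hands back control on measured torque, not on touch.

    A hand resting on the wheel applies roughly zero torque, so it does
    nothing; there is no capacitive sensor to notice it in the first place.
    """
    del hand_resting_on_wheel  # touch alone is invisible to the system
    return abs(steering_torque_nm) >= OVERRIDE_TORQUE_NM

# Resting a hand on the wheel: no override detected.
assert not driver_override_detected(0.05, hand_resting_on_wheel=True)
# Deliberately turning the wheel: control goes back to the driver.
assert driver_override_detected(1.2, hand_resting_on_wheel=False)
```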
Bad New York Times. Bad.
And then things go from uninformed to inaccurate.
If I knew nothing, I would assume that the author told the Tesla Model 3 where to go, it drove there, and then it parked, all without his intervention. He actually uses the phrase “without my intervention.” That sounds like a voice-activated self-driving car. A car that can do everything door-to-door. There is nothing in that paragraph that suggests the Tesla Model 3 is anything but a self-driving car.
Newsflash: there are no self-driving cars for sale today. Not one.
What the author describes is not what actually happened.
Here’s what actually happened:
- The author pressed a button on the steering wheel to activate Voice Command.
- The author said “Navigate to Garden State Plaza”.
- The navigation system listed the Garden State Plaza on the center screen.
- The author tapped on the correct destination on the center screen.
- The author began driving the car. Himself.
- Using the stalk to the left of the steering wheel, the author set the radar cruise control following distance to 3 car lengths (unless it was already set at 3).
- When conditions permitted, the author tapped the right stalk down twice to engage Autopilot, which then remained engaged for some stretch of time, possibly more than once over the course of the drive.
- When the author approached a light, exit, or intersection, he disengaged Autopilot by tapping the brakes, applying torque to the steering wheel, and/or using the same stalk.
- The author then manually drove into the parking lot.
- Upon finding a spot, the author stopped the car, then engaged the Automatic Parking functionality, which parked the car.
Take a good look at that list. Think about how many interventions were necessary to get from A to Z. That is not a fully self-driving car. That is a car in which a person has a lot of choices to make, and must make them, or the car isn’t going anywhere. Once in motion, if the driver doesn’t make some real decisions, it’s going into a ditch, or worse.
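If it helps, here’s that drive as an event log, with made-up event names and modes; this is a sketch of the interaction, not anyone’s actual software. The point is that every mode change is driver-initiated.

```python
from enum import Enum, auto

class Mode(Enum):
    MANUAL = auto()
    AUTOPILOT = auto()  # lane centering + cruise, driver supervising
    AUTOPARK = auto()

# Hypothetical reconstruction of the drive above. Every entry is a
# driver action; the car never changes mode on its own.
driver_actions = [
    ("press voice command button", Mode.MANUAL),
    ("speak destination", Mode.MANUAL),
    ("tap destination on center screen", Mode.MANUAL),
    ("drive away manually", Mode.MANUAL),
    ("set following distance via stalk", Mode.MANUAL),
    ("double-tap right stalk", Mode.AUTOPILOT),
    ("brake / torque wheel / tap stalk", Mode.MANUAL),
    ("drive into parking lot", Mode.MANUAL),
    ("stop at spot, engage Autopark", Mode.AUTOPARK),
]

print(f"Driver-initiated actions: {len(driver_actions)}")  # 9, not 0
```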
Does the author understand what he’s saying? The Tesla does not eliminate the danger of a blind spot. That would require sensors the Model 3 lacks, like a rear radar. Strangely, he points out that he’s had problems in other cars with blind spots, then says this:
So what’s his conclusion?
Tesla’s wireless over-the-air (OTA) updates are brilliant, but they can’t magically conjure up the necessary hardware to resolve the problem he describes.
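A toy sketch of why, with an assumed sensor list (the exact Model 3 suite aside, the point holds for any missing sensor): a software update can flip flags, but it can’t satisfy a hardware requirement that was never installed.

```python
# Assumed, simplified sensor suite; note the absence of a rear radar.
INSTALLED_HARDWARE = {"front_radar", "cameras", "ultrasonic_sensors"}

def ota_can_enable(feature_requirements: set) -> bool:
    # An over-the-air update only changes software. If a feature needs
    # hardware the car doesn't carry, no update will ever enable it.
    return feature_requirements <= INSTALLED_HARDWARE

print(ota_can_enable({"cameras"}))     # True: software can use what exists
print(ota_can_enable({"rear_radar"}))  # False, on every future update
```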
The author isn’t familiar with the most basic concepts of automation or autonomous systems, or with the difference between series and parallel semi-autonomy. #WouldYouLikeToKnowMore? Here’s an article about it from last year that explains the difference, and why Stewart is misguided when he claims Autopilot “enhances (rather than supplants) human performance.”
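For the impatient, here’s the distinction as a rough sketch, with invented function names; it’s conceptual, not anyone’s actual control code.

```python
def parallel_semi_autonomy(human, machine, world):
    """Parallel: the human drives full-time; the machine guards in the
    background (think automatic emergency braking) and intervenes only
    in an emergency. That is what enhancing human performance looks like."""
    command = human.drive(world)
    if machine.detects_imminent_collision(world):
        command = machine.emergency_override(world)
    return command

def series_semi_autonomy(human, machine, world):
    """Series: the machine drives full-time; the human is demoted to a
    monitor who must catch the machine's failures. Autopilot works this
    way, which is why it supplants rather than enhances the driver."""
    command = machine.drive(world)
    if human.notices_machine_failure(world):
        command = human.take_over(world)
    return command
```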
Virtually everyone, from car companies to Silicon Valley to the Department of Transportation and NHTSA, has failed to educate consumers about the realities of automation and autonomy, which are not the same thing. If safety is a desired goal of self-driving cars, it’s not being served by misinformation.
This is the legacy of cost cutting in newsrooms.
When one of the most respected papers in the country is publishing drivel of this level, what does that suggest about their other coverage?
Signed,
A lifelong New York Times fan and reader.
P.S. If you want to read a serious Model 3 review, read “Tesla Model 3, The First Serious Review.”
(CORRECTION: This article was corrected to reflect the Model 3’s Autopilot engagement method, as opposed to the Model S’s.)
Alex Roy — Founder of the Human Driving Association, Editor at The Drive, Host of The Autonocast, co-host of /DRIVE on NBC Sports and author of The Driver — has set numerous endurance driving records, including the infamous Cannonball Run record. You can follow him on Facebook, Twitter and Instagram.