Police in California now say that Autopilot was engaged during the fatal crash of a Tesla Model 3 last week, according to ABC News. The driver of the Tesla died as a result of injuries sustained in the crash, with two other motorists seriously injured and taken to hospital.
The incident occurred on the 210 Freeway near Fontana, California, at roughly 2:30 a.m. Thirty-five-year-old Steven Hendrickson was killed when his Tesla Model 3 struck a semi-truck that had overturned on the roadway. The driver of the semi and a bystander who had stopped to assist him were both injured in the crash and taken to the hospital with “major” injuries, according to reports from CBS LA.
“While the CHP does not normally comment on ongoing investigations, the Department recognizes the high level of interest centered around crashes involving Tesla vehicles. We felt this information provides an opportunity to remind the public that driving is a complex task that requires a driver’s full attention,” the California Highway Patrol said. The crash is also now the subject of an investigation by the National Highway Traffic Safety Administration, the 29th time the agency has investigated an incident involving a Tesla.
The automaker has long had a combative relationship with the NHTSA. Tesla has made repeated requests for confidentiality in dealings with the federal body, adding long delays to investigation proceedings on multiple occasions.
Concerningly, posts from a TikTok account under the name “shendrickson21,” discovered by journalist E.W. Niedermeyer, may indicate that the Tesla driver had misconceptions about the true capabilities of the Autopilot system. One video shows the Model 3 on Autopilot at speed on a freeway and refers to it as the “Best car ever!”, while a second shows the car using Autopilot in heavy traffic, posted with the hashtags “#bored” and “#tired.”
As a Level 2 driver-assist system, Autopilot requires drivers to remain fully alert and ready to take over at any time. The videos posted on TikTok, filmed with no hands on the steering wheel, suggest less than full attention was being paid to the road while Autopilot was engaged.
Tesla’s Autopilot has repeatedly shown itself to have issues with stationary objects, leading to multiple collisions and fatalities over the past few years. A crash in Taiwan on June 1 last year closely mirrors the recent events in California: a Model 3 failed to detect an overturned truck and careened into the stricken vehicle at high speed. Thankfully, nobody was seriously injured in that incident. However, the parallels between the two crashes suggest little has been done to improve the software’s poor record in such situations.
The crash also comes in the wake of the arrest of Param Sharma, a Tesla driver taken into custody this week for repeatedly riding in the back seat of his Tesla with Autopilot activated and posting videos about it on social media. The 25-year-old, who claims to be “rich as f**k,” gained further notoriety when he purchased another Tesla immediately after being released from jail, riding in the back seat of a Model 3 and posting yet another enraging video.
The unwavering line from Tesla has been that it expects drivers to remain fully focused on the driving task, even with Autopilot engaged. Years of evidence show that, whether through human fallibility or a chilling disregard for the safety of others, this is an unrealistic expectation. With crashes and fatalities racking up around the world, involving both Tesla drivers and innocent bystanders, the question becomes just how many people have to die before authorities step in.
Got a tip? Let us know: tips@thedrive.com