Controversy Erupts Over Video of FSD Tesla Striking Child Mannequin

Tesla fans and foes are leaping to replicate the test with their own experiments—and the results are mixed.

Tesla fans and foes are clashing over whether a test of the automaker’s Full Self-Driving Beta software, which appeared to show a car running over a child-sized mannequin, was carried out legitimately.

Video of the test was released earlier this week and shows a Tesla Model 3 repeatedly striking a small, stationary dummy placed directly in front of the car while supposedly operating on Tesla’s controversially named Full Self-Driving Beta software. Clips from the video initially led some online publications to call the test a “smear campaign,” arguing that FSD was not actually engaged. After further evidence emerged that FSD was engaged during the test, Tesla fans and FSD users began filming their own experiments to see what would happen, with mixed results. One noted Tesla devotee even staged a public call for people to volunteer their own children to stand in front of his Tesla to prove it’ll stop in time.

So will a self-driving Tesla run over a child? Amid the noise, the answer seems to be a resounding “maybe,” which is just as bad as “yes” in this case. Here’s where things stand.

To understand why this test is so controversial, it’s important to start with the funding behind it.

The test, video of which is embedded above, was paid for and performed by The Dawn Project, an organization campaigning to promote “unhackable” software and systems. The Dawn Project is backed by Dan O’Dowd, the CEO, president, and founder of Green Hills Software, a competitor in the automotive software space. In January, O’Dowd took out a full-page ad in the New York Times campaigning to have Tesla’s FSD Beta banned from U.S. roads. O’Dowd also ran as a Democrat in California’s U.S. Senate primary earlier this year, though he garnered only 1.1% of the vote. Tesla CEO Elon Musk publicly slammed Green Hills following the full-page ad, calling the company’s software “a pile of trash.”

O’Dowd’s test clearly shows the Model 3 slamming into the dummy three times, but the stitched-together footage of the infotainment screen doesn’t match how the display normally looks when FSD Beta is engaged: there is no Autopilot icon, the path-prediction line remains gray, and the speed doesn’t line up with the reproducibility steps published by The Dawn Project. Electrek took this as evidence that FSD Beta was not engaged and published an article condemning the test.


Tesla fans began referencing the article as proof that the test was flawed. Even Elon Musk joined in, tweeting the article at The Guardian and calling the test a “scam video.”

Later, raw footage showing the view from inside the cabin was published, and the UI visible on the central screen appeared to indicate that FSD was in fact engaged. Furthermore, Art Haynie, the driver who conducted the test on behalf of The Dawn Project, signed an affidavit stating that FSD Beta was active at the time of the test.


Regardless, the lingering doubt sent some people into full-on defense mode, re-enacting the test themselves in an attempt to disprove The Dawn Project’s findings.

Some set up their own stationary mannequins on residential streets, attempted to recreate the test, and published videos showing their vehicles avoiding the dummy entirely. Others replicated The Dawn Project’s findings as their own cars slammed into homemade dummies. And yes, there’s that guy on Twitter who asked for volunteers to have their children run in front of his own Tesla to prove it’ll stop in time.

While Tesla’s autonomy stack relies on vision alone, other tech companies are touting the importance of additional sensing modalities, such as LIDAR, used in conjunction with vision-based systems. The Dawn Project’s test resembles a public demonstration organized by Luminar earlier this year, in which a vehicle equipped with Luminar’s own LIDAR spotted and avoided a child-sized mannequin crossing the road while a Tesla Model Y collided with it. That test also sparked debate in the AV community.

Whether the most recent test was carried out ethically is still being debated by both sides. But even a single failed test is a problem in the march of nines, the long grind of pushing a system’s reliability from 99.9% toward 99.9999%, and in an industry that is catching the eyes of regulators, every nine counts.
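To make that concrete, here is a minimal back-of-the-envelope sketch in Python of why each added nine matters. Every number in it is a hypothetical assumption for illustration, not a statistic about Tesla’s fleet or any real system:

```python
# Illustrative arithmetic only: all figures here are hypothetical.
# Shows how expected failures shrink as reliability gains each "nine."

ENCOUNTERS_PER_DAY = 1_000_000  # assumed fleet-wide pedestrian encounters

for nines in range(3, 7):
    reliability = 1 - 10 ** -nines  # 3 nines -> 99.9%, and so on
    failures = ENCOUNTERS_PER_DAY * (1 - reliability)
    print(f"{reliability:.4%} reliable -> ~{failures:,.0f} failures/day")
```

Under those assumed numbers, three nines still means roughly a thousand failures a day, while six nines means one, which is why a single demonstrable failure carries so much weight.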

Got a tip or question for the author? Contact them directly: rob@thedrive.com