
Tesla's Safety Software Let a Stuffed Animal Use Full Self-Driving

Not confidence-inspiring for software designed to help prevent potentially fatal accidents.

Teddy bear driving a Tesla EV. Credit: AI Addict via YouTube

Tesla’s Full Self-Driving beta has taken its fair share of heat. You’ve got federal probes, collisions with inanimate objects, loads of shoddy left turns, and now, arguably one of the most damning chapters in the FSD story... stuffed animals.

In a recent video, YouTuber AI Addict shows that important safety software — the kind meant to prevent drivers from using FSD when they’re not paying attention — can be tricked by a host of inanimate objects, including balloons with faces and obviously inhuman-looking stuffed animals.

Superficially, it’s a comical demonstration, but in principle, it’s a safety nightmare.

Tesla’s FSD Woes

For software billed as “Full Self-Driving,” Tesla’s semi-autonomous driving systems are anything but. In fact, the entire reason eye-monitoring cameras exist inside Tesla’s EVs is that drivers are required to stay alert and be ready to intervene at a moment’s notice while using the FSD beta.

Tesla’s failsafe, however, might not be such a failsafe for anyone willing to engage in a little subterfuge.

To dupe Tesla’s camera, all it takes is putting a teddy bear in the driver’s seat. In the video, AI Addict claims the car drove for a full 10 minutes on a closed road without the ruse being detected before he eventually ended the test.

To hammer the point home, AI Addict takes the test one step further and introduces a theoretical pedestrian — a child mannequin, to be exact — and the results are pretty hard to watch. Time after time, Tesla’s software mows the mannequin down with a stuffed animal at the wheel.

And believe it or not, things actually get worse. AI Addict goes as far as putting a very inanimate-looking balloon in the driver’s seat, and still, the YouTuber’s poor mannequin gets flattened. Oof.

Dawn Project

AI Addict’s results are concerning, but it’s important to note that the tests were conducted in partnership with the Dawn Project, an advocacy organization run by vehement Tesla critic Dan O’Dowd — a tech billionaire who’s made a crusade out of toppling Tesla’s credibility.

Even with that caveat, however, it’s not hard to imagine Tesla’s software being that easily duped. Elon Musk has long exaggerated Tesla’s ability to create fully autonomous systems, and even on less dire metrics like range, the automaker has been less than forthcoming.

Ultimately, the decision to trust Tesla’s FSD beta is up to the driver, but if results like these are to be believed, that may not be the case for long.
