A recent YouTube video dedicated to testing the “full self-driving” capabilities of Tesla cars suggests that they may be even less safe than we feared – at least when you ignore the rules and take your hands off the wheel.
The test setup is extremely simple: the car drives in a straight line, in good weather with good visibility, toward a realistic-looking mannequin of a child standing motionless in the middle of the test track. No sudden appearances, no hard accelerations, nothing like that.
The trouble is, the FSD software no longer relies on expensive radar tech to detect obstacles ahead. Instead, a camera mounted behind the windshield generates a video feed that the on-board computer compares against its neural network’s catalogue of objects to decide whether the brakes need to be applied. To be clear, the mannequin is lifelike enough that the system should have braked hard – yet in all three test runs, the Tesla ran it over as if the road were empty.
Dan O’Dowd, the author of the video, is a long-standing and vocal critic of Elon Musk’s electric car company, arguing that these cars are too dangerous to be sold on misleading premises. According to him, around 100,000 drivers trust the FSD software every day simply because it is called ‘Autopilot’, putting the lives of people around them at risk.