We reported on a similar issue not long ago, but a recent update to Tesla's Autopilot software has apparently only made the problem worse.
The issue occurs when the car's Autopilot (partial self-driving) feature detects a non-existent obstruction on the road or receives a false-positive signal about an impending crash. The car then applies the emergency brake without warning, risking a rear-end collision from the vehicles behind it.
Such cases, often called 'phantom braking,' had been reported before, but they were isolated and were not known to have caused any serious consequences. The latest update, however, triggered a surge of driver complaints and led Tesla to officially acknowledge the problem, which it attributed to unspecified 'software issues.'
An earlier surge of complaints came in May this year, when Tesla dropped radar sensors in favor of a camera-based vision system, but the scale of the current disruption appears unprecedented.
Elon Musk has also admitted that the automatic emergency braking system in multiple Tesla models is flawed and promised to investigate the issue.