Tesla is facing multiple investigations into its Autopilot and Full Self-Driving Beta software. The Wall Street Journal (WSJ) has obtained footage and computer logs from crashes involving emergency vehicles, and one incident in particular should give the entire self-driving industry pause.
The case centers on a reportedly impaired man who engaged Autopilot in his 2019 Model X on a freeway in Montgomery County, Texas, on February 27, 2021. The Model X struck a police car that was stopped in the right lane with its emergency lights flashing. The crash injured six people, including five police officers.
The five officers are now suing Tesla; the company argues that the impaired driver bears responsibility for the crash. Even granting that point, the Model X's behavior in this case is troubling.
According to the WSJ, the driver received 150 reminders over a span of 34 minutes to place his hands on the wheel, one of them just seconds before the crash. He complied each time, yet neither he nor the system did anything to avoid the clearly blocked lane.
Giving a driver 150 chances in just over half an hour before escalating seems excessive on its own. But there appears to be a second, more serious flaw in how Autopilot handled the situation.
The 2019 Model X is equipped with both radar and cameras. Radar is good at tracking moving vehicles but struggles with stationary objects, because automotive radar typically filters out returns that aren't moving relative to the road (otherwise overpasses and roadside signs would trigger constant false alarms). The cameras have to pick up that slack, and experts cited by the WSJ explained that the flashing lights of emergency vehicles can confuse them.
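To make that failure mode concrete, here is a deliberately simplified, hypothetical sketch of how such a perception gap can open up. This is not Tesla's actual code or architecture; the thresholds, class names, and the braking rule are all invented for illustration. It only shows why a stack that discards stationary radar returns and then leans on camera confidence can fail to brake for a parked vehicle with flashing lights.

```python
# Hypothetical illustration only -- not Tesla's actual code. It sketches why a
# perception stack that filters stationary radar returns and falls back on
# camera confidence can miss a parked emergency vehicle.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float         # range to the object
    closing_speed_mps: float  # relative speed; ~own speed for a stopped object

@dataclass
class CameraDetection:
    distance_m: float
    confidence: float  # 0.0-1.0; strobing lights can destabilize this score

def should_brake(radar: RadarReturn, camera: CameraDetection,
                 own_speed_mps: float, min_confidence: float = 0.7) -> bool:
    # Radars commonly drop returns that look stationary in the world frame
    # (overpasses, signs). A parked car closes at roughly the ego vehicle's
    # own speed, so it gets filtered out along with the clutter.
    radar_sees_object = abs(radar.closing_speed_mps - own_speed_mps) > 2.0

    # With radar filtered out, the decision rests on the camera alone -- and
    # a flashing light bar can push its confidence below the threshold.
    camera_sees_object = camera.confidence >= min_confidence

    return radar_sees_object or camera_sees_object

# A stopped police car ahead of a vehicle doing 55 mph (~24.6 m/s):
parked_car = RadarReturn(distance_m=60.0, closing_speed_mps=24.6)
flicker = CameraDetection(distance_m=60.0, confidence=0.5)  # degraded by strobes
print(should_brake(parked_car, flicker, own_speed_mps=24.6))  # False: no braking
```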
In this case, Autopilot registered an obstruction in the lane 2.5 seconds before impact, while traveling at 55 miles per hour. The system briefly slowed the car, then disengaged entirely moments before the collision.
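For scale, a quick back-of-the-envelope check puts those numbers in context. The 0.7 g deceleration figure below is an assumption for firm braking on dry pavement, not a value from the WSJ report:

```python
# Back-of-the-envelope physics; the 0.7 g braking figure is an assumption.
MPH_TO_MPS = 0.44704
G = 9.81  # m/s^2

speed_mps = 55 * MPH_TO_MPS          # ~24.6 m/s
warning_distance = speed_mps * 2.5   # distance covered in 2.5 s: ~61 m (~202 ft)

decel = 0.7 * G                      # assumed firm braking on dry pavement
stopping_distance = speed_mps**2 / (2 * decel)  # ~44 m (~145 ft)

print(f"covered in 2.5 s: {warning_distance:.0f} m")
print(f"full stop needs:  {stopping_distance:.0f} m")
```

In other words, 2.5 seconds at 55 mph is roughly 200 feet of travel, which under that braking assumption is on the order of a hard-stop distance. Immediate, sustained braking at detection was at least plausibly within reach; a brief slowdown followed by disengagement was not.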
Other companies developing self-driving technology, such as Waymo and Cruise, have also struggled when their vehicles encounter emergency vehicles. But neither has been involved in a crash of this severity in such a situation.
In the Montgomery County case, it seems highly likely that Tesla Autopilot failed, not just the driver.
Image Source: Rick Tasker, https://shorturl.at/bEV19