Tesla has received a special request from federal automotive safety regulators, who are asking the company for extensive information about its driver assistance and driver monitoring systems, including details on a previously undisclosed configuration known as “Elon mode.”

Normally, when a Tesla driver engages one of the company’s driver assistance systems, such as Autopilot, Full Self-Driving (FSD), or FSD Beta, a visual symbol appears on the car’s touchscreen reminding the driver to hold the steering wheel. If the driver keeps their hands off the wheel for too long, the reminder escalates to a beeping alert.

With the “Elon mode” configuration enabled, however, drivers can use Autopilot, FSD, or FSD Beta without receiving those reminders.

The “Elon Mode” Investigation

On July 26, the National Highway Traffic Safety Administration (NHTSA) sent Tesla a letter and special order requiring the company to disclose details about the configuration, including how many cars and drivers have used “Elon mode.” The agency published the document on its website on Tuesday.

In the letter and special order, John Donaldson, NHTSA’s acting chief counsel, expressed concern about the safety implications of recent changes to Tesla’s driver monitoring system, citing available information suggesting that vehicle owners can alter Autopilot’s driver monitoring settings so that the car no longer prompts them to apply torque to the steering wheel.

Tesla was given until August 25 to provide all the requested information and responded on time, but NHTSA has granted the company’s response confidential treatment.

Philip Koopman, an automotive safety researcher and associate professor of computer engineering at Carnegie Mellon University, said NHTSA plainly takes a dim view of “cheat codes” that can disable safety features such as driver monitoring, and he agrees with the agency. “Hidden features that degrade safety have no place in production software,” Koopman told CNBC in a recent interview.

Ongoing Investigations

The NHTSA is still investigating several crashes in which Tesla’s Autopilot systems may have played a role, including “fatal truck under-run crashes” and collisions in which Tesla vehicles struck stationary first responder vehicles. NHTSA acting administrator Ann Carlson has indicated in recent interviews that the agency is close to concluding those investigations.

For years, Tesla has told regulators that its driver assistance systems, including FSD Beta, are only “level 2” and do not make its cars fully autonomous.

However, Tesla’s branding of its driver assistance systems has confused customers. Tesla CEO Elon Musk, who also owns and operates the social network X (formerly Twitter), often implies that Tesla vehicles are self-driving.

Elon Musk’s Problematic FSD Test Drive

Over the weekend, Musk livestreamed on X a test drive in a Tesla equipped with a still-in-development version of the company’s FSD software (version 12). Throughout the demo, Musk filmed with a handheld mobile device while driving and conversing with his passenger, Ashok Elluswamy, Tesla’s head of Autopilot software engineering.

The video stream was somewhat blurry, and Musk never showed the full touchscreen or demonstrated that he had his hands on the steering yoke, ready to take over the driving task at any moment. At several points he clearly had no hands on the yoke at all.

According to Greg Lindsay, an Urban Tech fellow at Cornell, Musk’s use of Tesla’s systems likely violated the company’s own terms of use for Autopilot, FSD, and FSD Beta, and the incident is likely to draw even more NHTSA scrutiny to Tesla.

Tesla’s website advises drivers to remain alert, keep their hands on the steering wheel, and maintain control of their vehicle when using Autopilot, Enhanced Autopilot, and Full Self-Driving.

Bruno Bowden, managing partner at Grep VC, a machine learning expert, and an investor in autonomous vehicle startup Wayve, said the demonstration showcased some improvements in Tesla’s technology, but that the company still has a long way to go before it can provide a safe self-driving system.

Bowden also noted that during the drive, the Tesla system nearly ran a red light; Musk intervened and braked in time to avoid danger.

Image Source: Pablo999, https://shorturl.at/bEV19