Houston, We Have a Problem: Tesla’s Autopilot Software Head Does Not Know About the Operational Design Domain (ODD) Or Perception-Reaction Time!

Rohail Saleem
Tesla Autopilot FSD. Image Source: Dirty Tesla

This is not investment advice. The author has no position in any of the stocks mentioned. Wccftech.com has a disclosure and ethics policy.

There is a growing perception that the Full Self-Driving (FSD) capability of Tesla's Autopilot system leaves much to be desired, to put it euphemistically. Between a string of troubling road accidents – the precursors to several ongoing FSD-related investigations in the US – and the fact that the "miles per disengagement rate on FSD beta is actually getting worse," per a tabulation by Taylor Ogan, the CEO of Snow Bull Capital, something is clearly not working properly at Tesla. Meanwhile, the deposition of an Autopilot executive back in the summer of 2022 – only now made public – has opened a veritable can of worms for the EV giant.

To wit, Ashok Elluswamy, Tesla's Head of Autopilot Software, participated in a July 2022 deposition in relation to a fatal accident back in 2018. While most of the media has concentrated on the juicy tidbit that Tesla's 2016 FSD promotional video was staged à la Nikola, the deposition transcript also highlights the Tesla executive's very troubling knowledge gaps.

The Twitter account @MoodyHikmet has effectively summarized this aspect of the deposition. For one, when asked about Operational Design Domain (ODD) – the conditions under which an automated system operates – Mr. Elluswamy said:

"I've heard those words before, but I do not recall much more than that."

Tesla's Autopilot head also said he did not recall ever seeing a document on ODD during his time at the EV giant. That Mr. Elluswamy appears entirely unaware of such a fundamental aspect of any automated driving system is quite troubling. It is, of course, entirely possible that Tesla uses different terminology to lay out the Autopilot's ODD. Even so, his unfamiliarity with fundamental industry parlance is striking. And it gets worse.
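For readers unfamiliar with the concept, an ODD is essentially a declared envelope of conditions – road type, weather, speed, lighting, and so on – inside which an automated driving system is designed to function. A minimal sketch of the idea (purely illustrative; the field names and conditions below are this article's assumptions, not anything from Tesla's actual software):

```python
from dataclasses import dataclass

@dataclass
class OperationalDesignDomain:
    """Illustrative ODD: the envelope of conditions under which
    an automated driving system is designed to operate."""
    max_speed_kph: float
    allowed_road_types: set
    allowed_weather: set

    def permits(self, speed_kph: float, road_type: str, weather: str) -> bool:
        """Return True only if the current conditions fall inside the ODD."""
        return (
            speed_kph <= self.max_speed_kph
            and road_type in self.allowed_road_types
            and weather in self.allowed_weather
        )

# Hypothetical highway-only ODD
odd = OperationalDesignDomain(
    max_speed_kph=130,
    allowed_road_types={"divided_highway"},
    allowed_weather={"clear", "light_rain"},
)

print(odd.permits(100, "divided_highway", "clear"))  # inside the ODD
print(odd.permits(50, "urban_street", "snow"))       # outside the ODD
```

The point of an explicit ODD is precisely this kind of check: the system (or its documentation) can state unambiguously when it is, and is not, supposed to be driving.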

Perception-reaction time is the period it takes a human to perceive and then react to a specific stimulus, whether an auditory or visual cue. Yet Mr. Elluswamy said he did not recall ever receiving any training on perception-reaction time, and admitted to guessing "what those words mean."
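To see why the concept matters for any driver-assistance system, consider the distance a car covers before the driver even begins to brake. A quick back-of-the-envelope calculation (the 1.5-second figure below is an illustrative assumption; road-design standards often assume longer times):

```python
def perception_reaction_distance(speed_kph: float, prt_s: float) -> float:
    """Meters traveled during the perception-reaction interval:
    speed converted to m/s, multiplied by the reaction time."""
    return (speed_kph / 3.6) * prt_s

# At highway speed, assuming a 1.5 s perception-reaction time
d = perception_reaction_distance(speed_kph=105, prt_s=1.5)
print(round(d, 1))  # ~43.8 meters covered before any braking begins
```

Those dozens of meters are exactly why a system that may hand control back to a human must be engineered around how long that human takes to respond.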

Given this state of affairs at Tesla, it is hardly a surprise that Autopilot consistently underperforms the hype that Elon Musk so diligently generates. In a comical development, the New York Times recently reported on a gathering where its reporter tested the Autopilot's FSD capability in the presence of die-hard Tesla fans, who were stumped every time the system faltered.

Meanwhile, Tesla's ongoing "funding secured" trial is also opening a can of worms for the FSD capability of the Autopilot. While Gary Black of the Future Fund correctly notes that this lawsuit carries minimal ramifications for the company's financials, it does leave the proverbial avenue wide open for follow-up FSD-related lawsuits.

What do you think of Tesla's Autopilot strategy? Let us know your thoughts in the comments section below.
