Ashok Elluswamy’s July 2022 deposition, given in a civil lawsuit related to a fatal crash, was obtained by Reuters.
A 2016 video that Tesla used to promote its self-driving technology was staged to show capabilities like stopping at a red light and accelerating at a green light that the system did not have, according to testimony by a senior engineer.
The video, which remains archived on Tesla’s website, was released in October 2016 and promoted on Twitter by chief executive Elon Musk as evidence that Tesla drives itself.
But the Model X was not driving itself with technology Tesla had deployed, Ashok Elluswamy, director of autopilot software at Tesla, said in the transcript of a July deposition taken as evidence in a lawsuit against Tesla over a 2018 fatal crash involving a former Apple engineer.
The previously unreported testimony by Elluswamy represents the first time a Tesla employee has confirmed and detailed how the video was produced.
The video carries a tagline saying: “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”
Elluswamy said Tesla’s autopilot team set out to engineer and record a “demonstration of the system’s capabilities” at the request of Musk.
Elluswamy, Musk and Tesla did not respond to a request for comment. However, the company has warned drivers that they must keep their hands on the wheel and maintain control of their vehicles while using autopilot.
The Tesla technology is designed to assist with steering, braking, speed and lane changes but its features “do not make the vehicle autonomous,” the company says on its website.
To create the video, Tesla used 3D mapping on a predetermined route from a house in Menlo Park, Calif., to Tesla’s then-headquarters in Palo Alto, he said.
Drivers intervened to take control in test runs, he said. When trying to show the Model X could park itself with no driver, a test car crashed into a fence in Tesla’s parking lot, he said.
“The intent of the video was not to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system,” Elluswamy said, according to a transcript of his testimony seen by Reuters.
Justice Department probing after series of crashes
When Tesla released the video, Musk tweeted: “Tesla drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot.”
Tesla faces lawsuits and regulatory scrutiny over its driver assistance systems.
In 2021, after a number of crashes involving autopilot, some of them fatal, the U.S. Department of Justice began a criminal investigation into Tesla’s claims that its electric vehicles can drive themselves, Reuters has reported.
The New York Times reported in 2021 that Tesla engineers had created the 2016 video to promote autopilot without disclosing that the route had been mapped in advance or that a car had crashed in trying to complete the shoot, citing anonymous sources.
When asked if the 2016 video showed the performance of the Tesla autopilot system available in a production car at the time, Elluswamy said, “It does not.”
Elluswamy was deposed in a lawsuit against Tesla over a 2018 crash in Mountain View, Calif., that killed Apple engineer Walter Huang, 38.
Andrew McDevitt, the lawyer who represents Huang’s wife and who questioned Elluswamy in July, told Reuters it was “obviously misleading to feature that video without any disclaimer or asterisk.”
The National Transportation Safety Board concluded in 2020 that Huang’s fatal crash was likely caused by his distraction and the limitations of autopilot. It said Tesla’s “ineffective monitoring of driver engagement” had contributed to the crash.
Elluswamy said drivers could “fool the system,” making a Tesla system believe that they were paying attention based on feedback from the steering wheel when they were not. But he said he saw no safety issue with autopilot if drivers were paying attention.