Elon Musk and Tesla are facing scrutiny once again as the family of a 31-year-old man killed in a crash involving the company’s self-driving “Autopilot” technology has filed a lawsuit.
Genesis Giovanni Mendoza Martinez tragically lost his life on February 18, 2023, after his Tesla – which was operating on Autopilot – collided with a firetruck at high speed.
Mendoza, who had reportedly trusted the car’s self-driving feature after being influenced by Tesla’s marketing, was fatally crushed in the accident.
His parents, Eduardo and Maria, and his brother Caleb, who was also injured in the crash, are now seeking to hold Tesla and its CEO, Elon Musk, accountable.
Family and Attorney Speak Out
Attorney Brett Schreiber, representing the Mendoza family, described the incident as “entirely preventable.” He blasted Tesla’s Autopilot feature as “ill-equipped to perform” and accused the company of treating public roads as a testing ground for its autonomous driving technology.
“This is yet another example of Tesla using our public roadways to perform research and development of its autonomous driving technology,” Schreiber told The Independent. “What’s worse is that Tesla knows that many of its earlier model vehicles continue to drive our roadways today with this same defect, putting first responders and the public at risk.”
The lawsuit alleges that Mendoza – like “many members of the public” – was persuaded to buy the car due to Tesla’s extensive marketing of the self-driving feature, which was positioned as safer than a human driver. Schreiber said Mendoza believed Musk’s claims and trusted the vehicle to navigate highways autonomously.
“Not only was he aware that the technology itself was called ‘Autopilot,’ he saw, heard, and/or read many of Tesla or Musk’s deceptive claims on Twitter [now X], Tesla’s official blog, or in the news media,” the complaint states.
“Giovanni believed those claims were true, and thus believed the ‘Autopilot’ feature with the ‘full self driving’ upgrade was safer than a human driver, and could be trusted to safely navigate public highways autonomously.”
Tesla Defends Its Technology
Tesla, however, has denied liability, arguing that its vehicles have a “reasonably safe design” under applicable state laws.
The company suggested that the accident may have been partially caused by Mendoza’s “own negligent acts and/or omissions.” In a court filing, Tesla stated that “no additional warnings would have, or could have, prevented the alleged incident.”
Details of the Crash
According to the lawsuit, the Tesla had been operating in Autopilot mode for 12 minutes before the crash, traveling at an average speed of 71 mph. The collision with the firetruck not only resulted in Mendoza’s death but also left four firefighters with minor injuries.
This isn’t an isolated case. Between 2015 and 2022, Tesla customers reported 1,000 crashes and over 1,500 complaints of sudden, unintended braking while using the Autopilot feature, per the LA Times.
What Is Tesla’s ‘Autopilot’ Function?
On its official website, Tesla describes Autopilot as “an advanced driver assistance system that enhances safety and convenience behind the wheel. When used properly, Autopilot reduces your overall workload as a driver.”
It adds: “Autopilot, Enhanced Autopilot, and Full Self-Driving capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.”
Federal Pushback Against Tesla
The incident has reignited criticism from government officials, including Transportation Secretary Pete Buttigieg, who has repeatedly taken issue with Tesla’s Autopilot technology and, in particular, its name.
Buttigieg told the Associated Press last year: “I don’t think that something should be called, for example, an Autopilot, when the fine print says you need to have your hands on the wheel and eyes on the road at all times.”
In fact, in February 2023 – less than two weeks before Mendoza’s crash – Buttigieg tweeted: “Reminder—ALL advanced driver assistance systems available today require the human driver to be in control and fully engaged in the driving task at all times.”
The Mendoza family and their attorney are calling for greater accountability, stating that this tragedy could have been avoided if Tesla had taken more precautions.
“This loss was entirely preventable,” Schreiber said. “Tesla needs to answer for its recklessness.”