Tesla sued by deceased driver’s family over ‘fraudulent misrepresentation’ of Autopilot safety

Tesla Supercharger stations are seen in a parking lot in Austin, Texas, on Sept. 16, 2024.

Brandon Bell | Getty Images

Tesla is being sued by the family of a driver who died in a 2023 collision; the family claims the company’s “fraudulent misrepresentation” of its Autopilot technology was to blame.

The Tesla driver, Genesis Giovanni Mendoza-Martinez, died in the crash involving a Model S sedan in Walnut Creek, California. His brother, Caleb, who had been a passenger at the time, was seriously injured.

The Mendoza family sued Tesla in October in Contra Costa County, but in recent days Tesla had the case moved from state court to federal court in California’s Northern District. The Independent first reported on the venue change. Plaintiffs generally face a higher burden of proof in federal court for fraud claims.

The incident involved a 2021 Model S, which smashed into a parked fire truck while the driver was using Tesla’s Autopilot, a partially automated driving system.

Mendoza’s attorneys alleged that Tesla and CEO Elon Musk have exaggerated or made false claims about the Autopilot system for years in order to “generate excitement about the company’s vehicles and thereby improve its financial condition.” They pointed to tweets, company blog posts, and remarks on earnings calls and in press interviews.

In their response, Tesla attorneys said the driver’s “own negligent acts and/or omissions” were to blame for the collision, and that “reliance on any representation made by Tesla, if any, was not a substantial factor” in causing harm to the driver or passenger. They claim Tesla’s cars and systems have a “reasonably safe design,” in compliance with state and federal laws.

Tesla didn’t respond to requests for comment about the case. Brett Schreiber, an attorney representing the Mendoza family, declined to make his clients available for an interview.

There are at least 15 other active cases centered on similar claims, involving Tesla crashes in which Autopilot or FSD, the Full Self-Driving (Supervised) system, had been in use just before a fatal or injurious collision. Three of those cases have been moved to federal courts. FSD is the premium version of Tesla’s partially automated driving system: while Autopilot comes standard in all new Tesla vehicles, owners pay an up-front fee or a monthly subscription to use FSD.

The crash at the center of the Mendoza-Martinez lawsuit has also been part of a broader Tesla Autopilot investigation by the National Highway Traffic Safety Administration, initiated in August 2021. During the course of that investigation, Tesla made changes to its systems, including through numerous over-the-air software updates.

The agency has opened a second, ongoing probe evaluating whether Tesla’s “recall remedy” to resolve issues with Autopilot’s behavior around stationary first-responder vehicles has been effective.

NHTSA has warned Tesla that its social media posts may mislead drivers into thinking its cars are robotaxis. Additionally, the California Department of Motor Vehicles has sued Tesla, alleging its Autopilot and FSD claims amounted to false advertising.

Tesla is currently rolling out a new version of FSD to customers. Over the weekend, Musk instructed his 206.5 million-plus followers on X to “Demonstrate Tesla self-driving to a friend tomorrow,” adding that, “It feels like magic.”

Since about 2014, Musk has been promising investors that Tesla’s cars would soon be able to drive autonomously, without a human at the wheel. While the company has shown off a design concept for an autonomous two-seater called the Cybercab, Tesla has yet to produce a robotaxi.

Meanwhile, competitors including WeRide and Pony.ai in China, and Alphabet’s Waymo in the U.S. are already operating commercial robotaxi fleets and services.

