Tesla Autopilot Faces Jury Scrutiny in Fatal 2019 Florida Crash Trial

A federal jury trial begins today in Miami, examining whether Tesla’s Autopilot system contributed to a deadly 2019 crash near Key Largo, Florida, that killed a 22-year-old woman and injured her companion. The lawsuit, filed by survivor Dillon Angulo and the family of Naibel Benavides Leon, accuses Tesla of exaggerating the capabilities of its driver-assistance technology and encouraging drivers to over-rely on it, NPR reports. The case highlights ongoing concerns about the safety of semi-autonomous features in electric vehicles.

Crash Details and Immediate Aftermath

George McGee drove his Tesla Model S through a T-intersection at over 50 miles per hour (about 80 kilometers per hour), ignoring a stop sign and colliding with a parked Chevrolet Tahoe. McGee had engaged Autopilot, which can steer, brake, and accelerate on its own, but the system did not prevent the impact.

“I was driving, I dropped my phone, and looked down,” McGee told police at the scene. “I ran the stop sign and hit the guy’s car.”

The collision propelled Benavides Leon’s body approximately 75 feet (23 meters) away, resulting in her death, while Angulo sustained severe injuries. McGee survived, and dashcam footage from the Model S captured the moment. Plaintiffs have settled with McGee, shifting focus to Tesla’s role.

Plaintiffs’ Accusations of Overpromising Technology

Lawyers for Angulo and the Benavides family argue that Tesla’s marketing misled drivers about Autopilot’s reliability.

“Tesla advertised Autopilot in a way that greatly exaggerated its capabilities and hid its deficiencies,” they state in court filings, “encouraging Tesla drivers to over-rely on its Autopilot system.”

They point to a 2016 Tesla video showing a car appearing to drive itself, with a caption reading:

“The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”

A senior Tesla engineer later admitted in a separate lawsuit that the video was staged and did not reflect actual capabilities.

This development raises questions about transparency in EV driver-assistance systems. Plaintiffs also claim Tesla withheld crash data from the Model S, telling the judge last year:

“Tesla’s had this data all along, and they have engaged in a scheme to hide it from us.”

U.S. District Judge Beth Bloom allowed the case to proceed, noting that “a reasonable jury could find that Tesla acted in reckless disregard of human life for the sake of developing their product and maximizing profit.”

Tesla’s Defense and Safety Claims

Tesla maintains that its systems enhance safety when used properly, requiring constant driver attention. The company’s website states:

“Autopilot and Full Self-Driving (Supervised) are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.”

In a 2023 post on X, Tesla emphasized McGee’s acknowledgment of responsibility and defended Autopilot:

“The data strongly indicates our customers are far safer by having the choice to decide when it is appropriate to engage Autopilot features. When used properly, it provides safety benefits on all road classes.”

CEO Elon Musk echoed this at a June shareholder meeting, saying “Human driving is not perfect” and pointing to roughly 40,000 annual U.S. roadway deaths.

“What matters is, like, are we making that number smaller? And as long as we’re making that number smaller, we’re doing the right thing,” Musk said.

Tesla denies withholding data or misleading consumers, with a lawyer telling the court that the company was not deliberately concealing information.

Implications for EV Driver-Assistance Regulations

Federal regulators have launched multiple investigations into Tesla’s Autopilot and Full Self-Driving (Supervised) systems amid a series of lawsuits questioning their safety. Few cases reach trial, as Tesla often settles privately. This Miami proceeding, expected to last three weeks, could set precedents for how courts view manufacturer liability in semi-autonomous EV crashes.

Beyond the courtroom, the trial carries real-world implications for EV owners, underscoring the need for vigilance despite advanced driver-assistance features. It also spotlights industry trends toward more sophisticated autonomy, where overpromising could erode trust and invite stricter oversight. As electric vehicles proliferate, such cases may influence policy, pushing for clearer guidelines on marketing and data sharing that prioritize user safety.

Photo courtesy of Florida Highway Patrol.





Haye Kesteloo

Haye Kesteloo is Editor-in-Chief and founder of EVXL.co, where he covers all electric vehicle news, spanning brands such as Tesla, Ford, GM, BMW, Nissan, and others. He fills a similar role at the drone news site DroneXL.co. Haye can be reached at haye @ evxl.co or @hayekesteloo.

