The National Highway Traffic Safety Administration is seeking information from Tesla about its newly released “Mad Max” Full Self-Driving mode after drivers reported on social media that vehicles using the aggressive setting routinely exceed posted speed limits. The inquiry comes as NHTSA investigates 2.9 million Tesla vehicles equipped with FSD following 58 reports of traffic safety violations, including 14 crashes that caused 23 injuries.
The timing couldn’t be worse for Tesla. Just one week after federal regulators opened a sweeping investigation into FSD’s tendency to run red lights and violate traffic laws, the company released a driving mode explicitly designed for higher speeds and more aggressive behavior—then promoted it on social media as perfect for drivers running late.
Mad Max Mode Exceeds Speed Limits By Design
Tesla reintroduced its “Mad Max” driving profile on October 16, 2025, with the FSD v14.1.2 software update. The mode promises “higher speeds and more frequent lane changes than HURRY,” according to Tesla’s release notes, which also state that “driver profile now has a stronger impact on behavior. The more assertive the profile, the higher the max speed.”
Within 24 hours of release, videos posted to social media showed Mad Max-equipped Teslas driving more than 15 mph over posted speed limits and rolling through stop signs. In one X Spaces event captured by Rolling Stone, a Tesla owner testing the new mode announced:
“It’s going 65 in a 35 right now. It sees the speed limit as 35, it’s highlighting it on the screen, because I’m going over the speed limit—because it’s going over the speed limit. It’s warning itself.” The driver added: “I’m not quite sure why it completely ignored the sign.”
Mad Max operates at speeds up to 85 mph on highways and makes aggressive lane changes to navigate through traffic. Tesla enthusiast Sawyer Merritt called it “unlike any FSD mode Tesla has ever released before” in a post on X. The company itself reposted social media content describing Mad Max as accelerating and weaving “through traffic at an incredible pace, all while still being super smooth. It drives your car like a sports car. If you are running late, this is the mode for you.”
Federal Investigation Already Underway
NHTSA’s inquiry into Mad Max comes amid a broader investigation that launched October 7, 2025, covering approximately 2.88 million Tesla vehicles equipped with FSD (Supervised) or FSD (Beta). The agency’s Office of Defects Investigation identified numerous incidents where FSD “induced vehicle behavior that violated traffic safety laws.”
The investigation focuses heavily on red light violations. NHTSA documented six Standing General Order reports in which a Tesla vehicle operating with FSD engaged "approached an intersection with a red traffic signal, continued to travel into the intersection against the red light and was subsequently involved in a crash with other motor vehicles in the intersection." Four of these crashes resulted in one or more reported injuries.
The agency also received 18 complaints and one media report alleging that Tesla vehicles with FSD failed to remain stopped at red lights, failed to stop fully, or failed to accurately detect traffic signal states. Multiple incidents occurred at the same intersection in Joppa, Maryland, prompting coordination between NHTSA, the Maryland Transportation Authority, and State Police. Tesla has since taken action to address that specific intersection’s programming.
Beyond red light violations, NHTSA identified 18 complaints and two media reports of Tesla vehicles entering opposing lanes during turns, crossing double-yellow lines when proceeding straight, and attempting wrong-way turns despite clear signage.
NHTSA Demands Answers on Mad Max
“NHTSA is in contact with the manufacturer to gather additional information,” the agency said in a statement to The Guardian. “The human behind the wheel is fully responsible for driving the vehicle and complying with all traffic safety laws.”
The investigation remains in the preliminary evaluation stage—the first formal step before NHTSA could potentially seek a recall if regulators determine the system poses an unreasonable risk to safety. The agency will assess whether FSD provides adequate warning before executing unexpected maneuvers, whether drivers have sufficient time to intervene, and how well the system detects traffic signals, lane markings, and wrong-way signs.
Software entrepreneur Dan O’Dowd, founder of the Dawn Project and longtime Tesla FSD critic, told Rolling Stone that Mad Max poses danger to all motorists: “Rather than improving road safety, Tesla’s latest software update is putting more people at risk.”
Tesla’s Pattern of FSD Controversies
Tesla faces mounting legal and regulatory pressure over its driver-assistance systems. The Department of Justice is investigating whether Tesla misled customers about self-driving capabilities—despite the name “Full Self-Driving,” the system remains SAE Level 2, requiring constant driver attention and readiness to intervene.
A California state court certified a class action lawsuit in August 2025 against Tesla over FSD marketing claims spanning 2016 to 2024. The case allows thousands of Tesla owners to collectively pursue allegations that CEO Elon Musk and the company misled consumers about autonomous capabilities.
NHTSA also opened a separate investigation in October 2024 into 2.4 million Tesla vehicles equipped with FSD after four collisions in reduced visibility conditions, including a fatal 2023 crash where a pedestrian was killed. Another probe launched in January 2025 examines Tesla’s “Smart Summon” feature after reports of vehicles crashing into parked cars and posts in parking lots.
Tesla warns in its owner’s manual that FSD does not make vehicles autonomous. The system requires drivers to maintain attention and be prepared to take control at any moment. Yet the company’s marketing—including the “Full Self-Driving” name itself and promotional videos showing hands-free operation—has drawn criticism from regulators and safety advocates who say it encourages over-reliance on technology that cannot yet safely drive itself.
EVXL’s Take
This might be Tesla's most tone-deaf move yet—and that's saying something for a company we've watched face multiple NHTSA investigations under new chief Jonathan Morrison.
Let’s connect the dots here. On October 7, federal regulators opened an investigation into nearly 3 million Teslas after documenting dozens of cases where FSD violated traffic laws, including running red lights and causing crashes with injuries. One week later, Tesla’s response was to release a mode called “Mad Max”—named after post-apocalyptic movies featuring deadly high-speed chases—that deliberately exceeds speed limits by 30 mph while the car warns itself about breaking the law.
The irony is spectacular. Just this past May, we covered how Tesla drivers were frustrated by FSD’s overly cautious safety nags, complaining the system acted like an “overprotective parent” for glancing at Spotify. Now Tesla swung to the opposite extreme, creating a mode that actively breaks traffic laws while federal investigators examine whether the company misled customers through a years-long pattern of overpromising autonomous capabilities.
This comes as Tesla battles regulatory heat on multiple fronts. The Smart Summon investigation we covered just yesterday involves vehicles that can’t detect parking lot posts. The July trial over a fatal 2019 Florida crash revealed Tesla staged marketing videos showing capabilities the system didn’t actually have. And let’s not forget the company’s long history of blaming drivers for known defects rather than acknowledging problems.
Here’s what Tesla seems to misunderstand: NHTSA’s investigation isn’t about whether FSD can drive aggressively—it’s about whether the system makes dangerous decisions that drivers can’t anticipate or override in time. Mad Max mode doesn’t solve that problem. It amplifies it by adding excessive speed to the mix, giving drivers even less reaction time when FSD inevitably makes a mistake at 85 mph instead of 35 mph.
The bigger question is whether regulators will finally force Tesla to reconcile its marketing with reality. You can’t call something “Full Self-Driving,” promote it with videos of hands-free operation, and then claim drivers are “fully responsible” when it crashes. That’s having your cake and eating it too—and NHTSA seems increasingly unwilling to let Tesla get away with it.
What do you think? Share your thoughts in the comments below.