
Tesla's Autopilot is linked to more serious crashes than previously thought

NHTSA has initiated multiple investigations into Tesla crashes and other issues with the automaker's driver-assistance software.

Newly released data from the National Highway Traffic Safety Administration has shed light on a concerning trend involving Tesla vehicles operating in Autopilot mode. A recent crash in North Carolina's Halifax County is just one of 736 such incidents reported across the United States since 2019.

In March, a school bus was displaying its stop sign and flashing its red warning lights when Tillman Mitchell, 17, stepped off it on North Carolina Highway 561 as a Tesla Model Y approached. The car, allegedly in Autopilot mode, never slowed down. It struck Mitchell, throwing the teenager into the windshield; he flew into the air and landed face down in the road. Mitchell's father heard the crash and rushed from his porch to find his son lying in the middle of the road. Dorothy Lynch, Mitchell's great-aunt, said, "If it had been a smaller child, it would be dead."

The NHTSA findings reveal a significantly higher number of crashes involving Tesla's driver-assistance technology than previously known, raising urgent concerns about the safety and regulation of such systems.

The number of such crashes has surged over the past four years, reflecting the hazards associated with the increasingly widespread use of Tesla’s futuristic driver-assistance technology.

The number of deaths and severe injuries associated with Autopilot has grown significantly. When authorities initially released a partial accounting of accidents in June 2022, they counted only three deaths definitively linked to the technology. The most recent data includes at least 17 fatal incidents, 11 since last May, and five serious injuries.

Mitchell survived the March crash, but he suffered a fractured neck and a broken leg and had to be placed on a ventilator. He still has trouble walking and has memory issues. His great-aunt says the incident should serve as a reminder of the risks associated with the technology.

Elon Musk has claimed that vehicles using Tesla's Autopilot mode are safer than those driven solely by humans, citing comparative crash rates between the two. He claims the technology will usher in a safer, almost accident-free future. The data, however, reveals glaring inadequacies in equipment being tested in real time on America's highways, though it is impossible to know how many crashes the technology may have prevented.

In recent years, both Autopilot and Full Self-Driving have drawn criticism. Autopilot is an inappropriate name, according to Transportation Secretary Pete Buttigieg, who stated, “The fine print says you need to have your hands on the wheel and eyes on the road at all times.”

NHTSA has opened multiple investigations into Tesla crashes and other issues with its driver-assistance software. One has concentrated on the phenomenon known as "phantom braking," in which cars suddenly slow or stop in response to hazards that are not actually there.


Jaelyn Campbell
Jaelyn Campbell is a staff writer/reporter for CBT News. She is a recent honors cum laude graduate with a BFA in Mass Media from Valdosta State University. Jaelyn is an enthusiastic creator with more than four years of experience in corporate communications, editing, broadcasting, and writing. Her articles in The Spectator, her hometown newspaper, changed how people perceive virtual reality. She connects her readers to the facts while providing them a voice to understand the challenges of being an entrepreneur in the digital world.
