Injured? Dial #WIN or #946 from your cellphone for your free consultation or call 1-800-LAW-NEED.
Call Us 24/7
( 1-800-529-6333 )

Tesla Model S Autopilot Accident Wrongful Death Lawsuit Lawyers

September 07, 2021

Safety advocates and experts are very concerned about the "unprecedented pileup of deadly crashes" involving Tesla's Autopilot-enabled vehicles. Montlick & Associates is offering a free case review to injury victims and families who have tragically lost loved ones in fatal self-driving car accidents. Call us nationwide 24 hours a day, 7 days a week at 1-800-LAW-NEED (1-800-529-6333), or use our website's Live Chat feature.

According to an online news report, Tesla is demonstrating something that other auto manufacturers would never attempt: Reckless Drivers + Autonomous Driving Technology = Fatalities.

Tesla and its CEO, Elon Musk, have transformed the electric vehicle and radically changed the standard of the automobile industry. However, safety advocates believe Mr. Musk has raced ahead with self-driving vehicles in a reckless and fatal way by rushing the marketing of the company's self-driving feature, called Autopilot.

In the most recent fatal Tesla accident, a Model S "flew" off the roadway near Houston last April. The Tesla Model S struck a tree, exploded, and burned for several hours. Emergency fire rescue crews discovered two bodies inside the vehicle. One of the occupants was found in the passenger seat, and the other victim was discovered in the back seat. According to police officials, no one was sitting in the driver's seat. It is believed that the Tesla's owner could have been "showing off" the car's Autopilot when the accident happened.

Two days after the fatal accident, Elon Musk reported that the car's data had been recovered and that the information "so far" indicated the car's Autopilot was not engaged. However, law enforcement crash investigators stated that a driver could not have moved from the vehicle's front seat into the back after the accident occurred. "Barring a suicide mission," Autopilot appears to be the only logical explanation.

There have been approximately 11 deaths in 9 Autopilot accidents in the United States. The news report states that there have been another 9 deaths in 7 additional accidents internationally. Almost all auto manufacturers are producing similar self-driving technology, but there are no known fatalities connected to self-driving technology in other autonomous driving vehicles. The only fatal self-driving accident linked to another automobile brand was the tragic death in a 2018 crash involving a Volvo in an Uber test.

The National Transportation Safety Board (NTSB) has implicated Tesla's Autopilot in numerous fatal accidents, as well as in collisions where no one was killed. The NTSB accident reports have a recurring theme: "Drivers over-relied on a self-driving system that in some cases was flawed and in others was simply not as capable as the driver thought."

For example, the NTSB provided the following statements in their Tesla accident reports:

In a 2016 Tesla Model S accident in Williston, Florida, that tragically killed the operator, the driver's Tesla Model S crashed under an 18-wheeler tractor-trailer truck. The NTSB stated, "The Tesla's automated vehicle control system was not designed to, and did not, identify the truck crossing the car's path or recognize the impending crash."

In a 2018 accident in Mountain View, California, a Tesla Model X struck a highway divider and killed the driver. The NTSB stated, "The probable cause of the crash was the Tesla Autopilot system steering the sport utility vehicle into a highway gore area due to system limitations, and the driver's lack of response due to distraction likely from a cell phone game application and overreliance on the Autopilot partial driving automation system."

In a fatal 2019 accident in Delray Beach, Florida, a Model 3 crashed underneath a tractor-trailer truck and tragically killed the driver. The NTSB said, "The Autopilot system did not send a visual or audible warning to the driver to put his hands back on the steering wheel. The collision avoidance systems did not warn or initiate [auto-emergency braking] due to the system's design limitations. The environment was outside the [operational design domain] of the Autopilot system, and Tesla does not limit Autopilot operation to the conditions for which it is designed."

On February 01, 2021, NTSB chairman Robert Sumwalt wrote to the Department of Transportation criticizing lax safety standards for self-driving vehicles, and he specifically cited Tesla. Regarding the 2019 Florida fatality, Mr. Sumwalt stated, "Tesla is testing on public roads a highly automated [autonomous vehicle] technology but with limited oversight or reporting requirements." He also warned that this "poses a potential risk to motorists and other road users."

According to the news report, Tesla wrongly promotes its Autopilot technology as the "world's most advanced self-driving system." However, the news report states that this is not true. Currently, there are six levels of self-driving capability, from 0 to 5. Autopilot is a Level 2 autonomous driving system, which the Department of Transportation considers "partial automation," since Level 2 requires the driver to "remain engaged with the driving task and monitor the environment at all times." According to Tesla, the auto manufacturer is rolling out what the company calls a "full self-driving subscription." The $10,000 software upgrade can be applied to most Tesla models. However, the self-driving technology falls well below Level 5 autonomy, or "full automation."

Jason Levine, executive director of the Center for Auto Safety, says "Autopilot is an intentionally deceptive name being used for a set of features that are essentially an advanced cruise control system." Mr. Levine explained to Yahoo Finance that "Tesla's marketing is leading consumers to foreseeably misuse the technology in a dangerous way. Users are led to believe the vehicle can navigate any roadway. The rising body count suggests Autopilot is not, in fact, a replacement for a driver."

Shortly after the fatal April 17 accident, Elon Musk tweeted a link to Tesla's safety data showing that motor vehicles operated with Autopilot are safer than vehicles without it. However, Mr. Musk cannot show how much safer Autopilot would be if the safety protocols were stricter and Autopilot were promoted less aggressively. For example, General Motors' Super Cruise system uses a video camera to ensure the driver's eyes remain focused on the road ahead while the self-driving system is turned on. If the driver looks down, appears distracted, or falls asleep, an alarm goes off; the system then gradually disables itself and pulls the vehicle over if the driver does not respond. One important difference between Autopilot and Super Cruise is that Autopilot operates on any road, while Super Cruise functions only on roads that General Motors has deemed to match the safety parameters of its self-driving system. Most other automakers are developing and implementing similar safeguards in their autonomous driving systems.

According to the report, Mr. Musk is a defiant libertarian who took risks by violating coronavirus shutdown orders to keep the company's California factory open. The news report accuses Musk of baiting Tesla owners to abuse Autopilot technology and risk a fatal accident. The report also states that Tesla has probably written some "fine-print legalese providing Tesla a measure of liability protection" while drivers are tragically killed in Autopilot-related accidents.

Other carmakers have already learned the expensive lesson that a careless attitude toward consumer safety can destroy a company's image and result in legal, civil, and regulatory problems. Many companies have been forced into bankruptcy over deadly, defective auto equipment such as exploding airbags, unintended acceleration, faulty ignition switches, or separating automotive tires. The article poses the question: what will it take to expose the unnecessary Tesla Autopilot deaths? Will family members file wrongful death lawsuits or class-action lawsuits against Tesla? Right now, Tesla does not appear to be solving the issue of "Tesla drivers misusing the technology they don't understand" and tragically dying as a result.

We Know What It Takes To Win!™ 

If you have been injured or lost a family member because of any self-driving vehicle, contact Montlick & Associates, Injury Attorneys, for your free consultation today. Our law firm has been representing those who have suffered serious injuries or lost a loved one in an accident for over 38 years. Our trial attorneys have recovered billions of dollars for our clients through negotiated settlements, litigation, jury verdicts, mediation, and arbitration awards.

Please read our Montlick & Associates reviews to see what our clients have to say about our commitment to exceptional service.

No matter where you are located, our Product Liability Attorneys are just a phone call away, and we will even come to you. Call us nationwide 24 hours a day, 7 days a week for your Free Consultation at 1-800-LAW-NEED (1-800-529-6333), or simply dial #WIN (#946) from your mobile phone. You can also visit our website and use our Free Case Evaluation Form or Free 24-hour live chat.


Please Note:
Many of our blog articles discuss the law. All information provided about the law is very general in nature and should not be relied upon as legal advice. Every situation is different, and should be analyzed by a lawyer who can provide individualized advice based on the facts involved in your unique situation, and a consideration of all of the nuances of the statutes and case law that apply at the time.