Blink and you miss it. That’s how we used to describe lightning-fast events, never before seen and never to be seen again. But at the current pace of technological innovation, every day seems to bring a new blink-and-you-miss-it moment. Blink and you miss the latest software update, already made obsolete by the newest product. Almost no aspect of our lives is immune from the rapid pace of inventions, updates, and new products.

When major changes impact a few apps or our phones, the learning curve isn’t so bad—a few days or weeks before we get used to the changes. But the wave of technical innovation sweeping the country has in some cases exposed us to new hazards. In no area is this clearer than our cars. As manufacturers compete to produce autonomous vehicles, our city streets and highways have become their lab. 

Unfortunately, Cybertrucks and other vehicles that use Advanced Driver Assistance Systems still cause accidents. In some cases, the car’s purported autonomous driving features may be the very reason for a crash. Victims of a Cybertruck or other advanced driver assistance vehicle accident are left wondering what responsibility the car manufacturer bears, and whether a self-driving car accident lawyer can help them uncover the truth behind the crash and win the justice they deserve.

What Is a Self-Driving Car?

A simple but important question to start with: what exactly counts as a self-driving car? According to the National Highway Traffic Safety Administration (NHTSA), no consumer today can purchase a vehicle with full self-driving automation. In other words, a truly self-driving car is not for sale.

Yet in states like Georgia, Waymo offers driverless taxis that are, in certain conditions and on certain routes, truly automated. People inside these cars are only passengers and aren’t expected to take over any driving responsibilities. Further compounding the confusion, recognized brands like Tesla sell vehicles with optional upgrades titled “Full Self-Driving.” So what is a self-driving car? Thankfully, the NHTSA, engineers, and other regulators have defined six levels of automation, ranging from no automation to full automation.

Levels 0 to 2: A Human Drives, Controls, and Monitors

At the lower levels of automation, people are driving the car and monitoring the road. The car’s system provides additional monitoring, warning, and at higher levels, some control. 

  • Level 0: At this level, there’s no automation at all, though the vehicle may still provide assistance in the form of alerts or emergency braking. These technologies are becoming standard on almost all new cars sold and include lane departure warnings and forward collision warnings.
  • Level 1: At this level, the system exercises some control over the vehicle through features like adaptive cruise control, which uses radar and lidar to maintain speed and keep the car a safe distance from the car in front.
  • Level 2: Also known as Advanced Driver Assistance Systems (ADAS). The “FSD” that the Cybertruck and other Teslas come equipped with is a Level 2 ADAS. ADAS assists with acceleration, braking, and steering, but the human driver must constantly monitor the road.

The most important consideration for these levels of automation is that people are considered the drivers, not the car itself. A person drives the car, maintaining their hands on the steering wheel and focusing on the road. Cars with ADAS should also come equipped with technology to monitor whether the driver has their hands on the wheel. 

Level 3: Conditional Automation

Cars equipped with Level 3 Conditional Automation are not widely available for purchase, with only high-end brands like Mercedes-Benz offering cars with this feature. At level 3, the car can be said to be self-driving but only under the right conditions. Should the car encounter a problem or the conditions change, the system prompts the human to take over. Construction, cloudy skies, or unclear lane markings all require the driver to step in.  

Levels 4 and 5: Self-Driving Cars

Levels 4 and 5 are more aspiration than reality. Even self-driving Waymos run into limitations in extreme weather that might make them Level 3 rather than Level 4. At Levels 4 and 5, however, the automated system drives the car, and the human occupant is merely a passenger who never needs to step in to drive. The main difference between the two is that at Level 5, the car can drive itself anywhere, rather than within a limited service area.

Self-Driving vs. True Self-Driving

While safety regulators may only consider cars self-driving at Levels 4 and 5, brands’ practice of labeling software packages “Full Self-Driving” has created confusion. As a result, many consumers consider vehicles with Level 2 ADAS capabilities to be self-driving cars. We will use the terms self-driving (or ADAS, advanced driver assistance systems) and true self-driving (ADS, automated driving systems) to distinguish between the consumer and industry understandings of the technology. The distinction has real consequences, especially as judges and juries consider the implications of a self-driving car accident.

Do Self-Driving Features Cause Accidents?

Cars with Level 2 automated features and higher are still fairly new and research is ongoing into their effects on driving. Manufacturers tout the ability of self-driving cars to increase safety, reduce accidents, and ultimately decrease the annual number of accident-related fatalities. 

Meanwhile, ADAS crashes number in the thousands, with Teslas alone accounting for over 2,000. Tesla has 40 times more reported crashes involving its ADAS than GM, the brand with the second-highest total. These figures may also underestimate how many crashes involve these self-driving systems but simply aren’t reported as such.

The DOT requires manufacturers to collect and report crashes involving Level 2 ADAS only when the ADAS was in use within 30 seconds of the crash and the crash involved a vulnerable road user (like a pedestrian) being struck, a fatality, an airbag deployment, or a trip to the hospital. Further complicating the picture, Tesla doesn’t always report this information, or reports that the information was destroyed in the collision.

True Self-Driving Car Accidents Are Rarer

Studies have begun to show that true self-driving cars with Level 4 ADS are at lower risk of accidents than human-driven vehicles in most conditions. On clear days and straight roads, these cars have performed more safely than human drivers. At dawn and dusk, however, they are five times more likely to get into an accident. Since 2021, there have been 953 accidents involving Level 4 true self-driving cars in California and 14 in Georgia.

Unfortunately, fewer studies have compared accidents involving ADAS and human drivers. But compared to ADS, ADAS is more likely to be involved in an accident in rain or traffic. In one study, ADAS experienced an issue once every 8 miles: the system struggled to stay in its lane or would disengage with little notice to the human driver. These frequent issues could lead to more accidents as more drivers turn to ADAS during their commutes.

At this point, it’s unclear whether these features are causing more accidents than would otherwise occur if these systems did not exist. While the issues described above may make an accident more likely, and there are instances where a crash might not have happened had the ADAS not been engaged, there may be other instances in which the ADAS performed better than a human driver would. In those situations, there’s no crash and therefore researchers have no data to study. 

Do Recalls Impact Self-Driving Car Accidents?

Cars, like any other commercial product, can be subject to recalls when manufacturers or regulators discover problems that make them unsafe. In the past, these recalls have been shown to reduce accidents by 20% for the recalled make and model.

One notable self-driving car recall involved Tesla, after it rolled out its “Full Self-Driving (beta)” and “Full Self-Driving (supervised)” features. After NHTSA raised concerns, Tesla “recalled” millions of vehicles by sending out a software update intended to make it more difficult for drivers to become distracted while using the ADAS. 

However, investigations into the Tesla Autopilot and Full Self-Driving features found that the recall did not address the underlying safety concerns. These investigations also found that Tesla was unique among auto manufacturers in pairing a “weak driver engagement system” with “Autopilot’s permissive operating capabilities.” In other words, Tesla’s ADAS made it relatively easy for the system to remain engaged even after drivers lost focus on the road.

What Makes a Self-Driving Car Accident Case Different?

In a Cybertruck accident or an accident involving another self-driving vehicle, a self-driving car accident lawyer will ask many of the same questions as any other experienced auto accident lawyer. While there’s no typical auto accident case, an experienced lawyer will look for the elements of negligence or product liability that caused or contributed to the accident and their client’s injuries. They’ll consider what the other drivers involved did, how that compares to what was reasonable, and whether a vehicle or other product was responsible for the injuries.

Was the Driver Negligent?

Many personal injury cases, including automobile accident lawsuits, center on a question of negligence. What did the other driver or drivers in an accident do, and what was reasonable for them to do under the circumstances? In answering this question, lawyers, judges, and juries look at all the relevant circumstances surrounding the accident.

In self-driving car accidents, an extra consideration is added to this investigation. Was the ADAS engaged, and did the driver react reasonably regarding that technology? Drivers are supposed to maintain hands on the steering wheel and attention on the road when using these technologies. If they failed to do that, was that driver negligent? If they were paying attention, could they have prevented the accident?

First-Party Self-Driving Car Accident Claims

When the human driver of the “self-driving car” is an injured victim, the questions become more complex. If the ADAS features were engaged at the time of the accident but the driver wasn’t paying attention, will that prevent them from receiving compensation for their injuries? In most cases, the answer is yes, unless it can be shown that the system malfunctioned and did not operate as intended or advertised.

Pursuant to O.C.G.A. § 51-12-33 (2024), a driver who is 50% or more at fault for the accident cannot recover. If their fault is less than 50%, their damages are reduced in proportion to their share of fault. If, hypothetically, the damages would be $10,000 but the driver of an ADAS-equipped car was 40% at fault, they would receive only $6,000. But if they’re 50% or more at fault, with the ADAS responsible for the rest, they recover nothing.
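For readers who want to see the arithmetic, the rule above can be sketched as a short calculation. This is only an illustration of modified comparative negligence as described here, not legal advice, and the function name is our own invention:

```python
def recoverable_damages(total_damages: float, plaintiff_fault_pct: float) -> float:
    """Illustrative sketch of Georgia's modified comparative negligence rule
    (O.C.G.A. § 51-12-33): a plaintiff who is 50% or more at fault recovers
    nothing; otherwise damages are reduced in proportion to their fault."""
    if plaintiff_fault_pct >= 50:
        return 0.0  # fault at or above the 50% bar: no recovery
    # below the bar: damages reduced by the plaintiff's share of fault
    return total_damages * (1 - plaintiff_fault_pct / 100)

# A driver 40% at fault with $10,000 in damages recovers $6,000;
# at 50% or more, they recover nothing.
print(recoverable_damages(10_000, 40))
print(recoverable_damages(10_000, 51))
```

Real cases are rarely this clean: the fault percentages themselves are what the jury must decide.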

Is the Car to Blame in Self-Driving Car Accidents?

Whether the driver was negligent, and to what extent, will depend in part on how much blame should be directed toward the car and its manufacturer. Claims against a manufacturer for injuries caused by its products fall under product liability law. Each state can choose how product liability operates within its borders, regulating who can sue and how and when a manufacturer is liable for injuries.

What is Product Liability?

Georgia’s product liability law is found under O.C.G.A. § 51-1-11 (2024). It creates liability for manufacturers when their products cause injuries because the product wasn’t fit for sale or not “reasonably suited to its intended use.” Importantly, product liability isn’t limited to customers, as the law allows those “reasonably affected by the product” to bring a claim. 

Additionally, manufacturers must warn consumers of hazards when they learn of a product’s defect. This includes warning customers if a foreseeable use of their product is unsafe. The failure to warn could open a manufacturer up to liability. So how does this impact a self-driving car accident case? 

Product Liability and Self-Driving Car Accidents

Because few Level 4 or 5 self-driving vehicles exist, self-driving car accident lawyers are far more likely to encounter a claim involving an ADAS self-driving car. Recall that when ADAS features are engaged, the driver is required to keep their hands on the steering wheel and eyes on the road. Should they fail to do so, one scholar warns, manufacturers will argue that this ends the discussion, claiming the driver’s failure was the accident’s sole cause.

However, judges and juries don’t need to accept this narrative full stop. The manufacturer may have failed to warn consumers regarding a foreseeable use of their car’s ADAS. Because Tesla calls its ADAS “Full Self-Driving,” a Tesla customer could believe they don’t need to do anything while the car is in motion. A jury might consider this belief reasonable given the name of the feature, the common understanding of those terms, promises made by company executives, and possibly poorly written or unclear instructions on how to operate the system.

That failure to warn may even be compounded by the investigations into Tesla’s Level 2 ADAS. Tesla was deemed unique among manufacturers in how much control the ADAS had while providing comparatively little monitoring of the driver. When the system is engaged, it’s supposed to warn drivers of obstructions and hazards. But it isn’t always clear if the warning gives a driver enough time to react, even when they are paying attention to the road. 

Beyond a failure to warn, there may be defects in the very design of the product. For example, is Tesla’s “weak driver engagement system” paired with “Autopilot’s permissive operating capabilities” evidence of a flawed design? Providing answers to questions like these is the work of an experienced self-driving car accident lawyer.

How a Self-Driving Car Accident Lawyer Helps

One of the most important aspects of a personal injury claim is evidence. However, finding, cataloging, and analyzing this evidence requires time and effort that injured victims could better spend focusing on their healing. That’s why calling a trusted attorney can help victims find the evidence they need to secure justice. In a self-driving car accident case, this involves the 3 C’s: the car, the collision, and the company. 

The Car

An experienced attorney can use information from the cars involved to determine what happened and why. In a self-driving car accident claim, the self-driving car can provide essential information. 

  • Telematics system: The car’s telematics system should record information including speed, maneuvers, and whether the ADAS was engaged at the time of the accident. This data can help determine what the driver knew at the time of the crash and what they did.
  • Airbags: If the airbags deployed and mandatory reporting was not done, this could indicate potential issues within the car or company. And if airbags failed to deploy when they should have, that might indicate product liability.
  • Damage patterns: The damage to all cars involved helps determine what happened. Where a car gets hit and how much damage follows can indicate the speed the car was traveling, and what maneuvers the driver or ADAS may have attempted to avoid the crash. 
  • Warnings: The telematics should indicate whether the driver was warned of the obstruction or hazard. If there was no warning, or it was ignored, this might speak to product liability or a negligent driver.

The Collision

Similar to the car, the entire collision scene is a focal point of a self-driving car accident lawyer’s investigation. 

  • Accident scene: The road signs, conditions, and hazards can all indicate what happened, where, and how. They may also indicate what role the ADAS played. 
  • Witnesses: The attorney will also attempt to speak with witnesses present at the accident scene to get a more detailed picture of what occurred and how. 
  • Damaged property or people: Signs of additional damage can show what preceded the crash or what happened immediately after. Both provide clues to what caused the collision in the first place. 
  • Police report: The police report is one of the first records of an accident and juries and judges trust it when assessing negligence. Consulting this record gives attorneys a starting point. 

The Company 

While analyzing the other elements, the attorney will also dedicate their time towards uncovering what the self-driving car manufacturer knew and what steps they took to prevent or reduce accidents.

  • Coding: A self-driving car accident lawyer will investigate the ADAS code for evidence of defects, whether in design, malfunctions, or warnings.
  • Alternatives explored: An experienced attorney will investigate what cost-effective alternatives existed that could have prevented the accident. 
  • Recalls: An attorney will look to see whether recalled parts or software were responsible for the accident and injuries.
  • Warnings: The attorney will look for evidence that warnings were not provided or inadequate given the foreseeable harms.  

Injured in a Cybertruck Accident? Call Montlick.

If you’ve been injured in a self-driving car accident, call Montlick. Our experienced car accident attorneys are no strangers to the investigative work required to uncover negligent drivers and dangerous manufacturers. They dedicate themselves fully to their clients’ needs and fight for the justice they deserve.

Contact us today to see how Montlick can make a difference. Your initial consultation is free.