In the latest of a series of actions brought against Tesla for alleged failures of its “Tesla Autopilot” driver-assistance system, Shawn Hudson filed suit in Florida state court on October 30 after he plowed into the back of a disabled vehicle at 80 miles per hour while using the Autopilot system in his 2017 Tesla Model S. Hudson, who claims that he had his hands on the wheel at the time of the crash but was not actively “driving,” alleges that Tesla Autopilot malfunctioned by failing to slow or stop the Model S prior to impact.

Tesla Autopilot is an optional driver-assistance system that utilizes eight surround cameras, twelve ultrasonic sensors, and forward-facing radar. As of 2017, it featured lane centering, adaptive cruise control, self-parking, the ability to automatically change lanes without driver confirmation, and the ability for the car to be summoned to and from a garage or parking spot. The current iteration also enables Teslas to transition from one freeway to another and to exit the freeway.
The complaint alleges that Tesla advertises Autopilot as a system that enables the vehicle to drive itself from one point to another with minimal driver oversight: the driver need only place a hand on the wheel sporadically, and the vehicle will “do everything else.” Hudson’s suit also claims that when he visited a Tesla dealership prior to purchasing his Model S, the salesperson echoed this sales pitch, telling Hudson that if the vehicle detects a hazard, including a stationary vehicle, Autopilot alerts the driver so that he or she can take control if necessary. Accordingly, Hudson opted to purchase Tesla Autopilot to ease the stresses of his commute.
Tesla’s website contains the following disclaimer: “Every driver is responsible for remaining alert and active when using Autopilot, and must be prepared to take action at any time.” Additionally, and in contrast to basic Autopilot, consumers may purchase Full Self-Driving Capability for their Tesla, which is “designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.”
Hudson claims that Autopilot does not, and could not, function as advertised. His complaint alleges that Tesla “duped” its customers through a “pervasive national marketing campaign” and a “purposefully manipulative sales pitch.” The causes of action against Tesla include (1) strict liability arising out of the Autopilot system’s allegedly defective design, manufacture, and warnings; (2) negligence; (3) breach of implied warranty; (4) misrepresentation; (5) misleading advertising; and (6) violation of Florida’s Deceptive and Unfair Trade Practices Act.
Alleged failures of Tesla’s Autopilot system have dogged the auto manufacturer in recent years. The spate of recent lawsuits includes one arising from a deadly 2017 crash involving a Tesla Model X (the SUV model) on a California highway, another arising from a fatal 2016 crash on Highway 27 in Florida, and a 2018 suit brought by a Utah woman after a rear-end collision that occurred while she was operating a 2016 Model S in Autopilot mode. Additionally, in May, Tesla settled claims brought by Model S and Model X drivers in California alleging that the company delayed safety features and an upgrade to its Autopilot system, instead rolling out vehicles with defective traffic-awareness features.
These lawsuits suggest that manufacturers who integrate autopilot or driver-assist technologies into the vehicles they sell should consider providing explicit warnings about the capabilities and limits of those technologies, as well as about any future updates or maintenance required to keep them performing as intended, in order to guard against future liability. Manufacturers should also consider refraining from advertising such systems as autonomous, so as to emphasize the continued need for driver attention and, when necessary, driver control of the vehicle.