Are drivers too reliant on Tesla’s Autopilot feature?

Jan 14, 2020 | Car Accidents

The National Highway Traffic Safety Administration (NHTSA) is investigating two recent crashes involving Tesla vehicles that may have been on Autopilot when the collisions occurred. If they were, the Autopilot system failed to prevent the crashes. That's bad news considering that Tesla plans to put fully automated vehicles on the roads within months.

The first crash occurred on Dec. 7 in Connecticut, where a Model 3 struck a police car; no one was injured. The driver said he was using Autopilot.

The second crash occurred on Dec. 29. According to the Associated Press, a Model S exited a freeway in California at high speed, ran a red light and rammed into a Honda Civic. Two people in the Civic were killed. It is unclear whether Autopilot was engaged.

Tesla’s Autopilot isn’t meant to replace drivers

Tesla says its Autopilot is merely meant to assist drivers who are actively focused on the road. Tesla has repeatedly said that drivers must pay attention and be ready to spring into action if the system fails. It has cautioned that the system cannot prevent all crashes.

Perhaps because of the name, however, many drivers have assumed it is safe to let the car take over their driving responsibilities. This has led to a number of accidents. NHTSA is currently investigating 13 such crashes.

“At some point, the question becomes: How much evidence is needed to determine that the way this technology is being used is unsafe?” asked the head of the nonprofit Center for Auto Safety. The Center and other advocacy groups are calling on NHTSA to issue regulations requiring more effective action by Tesla.

One proposal is for Tesla to limit the use of Autopilot to four-lane divided highways with no cross traffic. Another is for Tesla to install a more effective system to monitor the driver, such as an eye-tracking system that would ensure the driver's eyes stay on the road. Currently, Tesla only requires drivers to keep their hands on the steering wheel, which federal investigators say allows people to zone out.

A defective product?

A former NHTSA administrator told the AP that the system is defective. Legally, products can sometimes be considered defective even if consumers use them incorrectly, as long as the incorrect use is foreseeable to the manufacturer or others in the supply chain. Here, Tesla has had actual notice, in the form of several other accidents, that drivers are over-relying on the Autopilot system, yet the company has taken no action.

Tesla says the computer in its fully automated vehicles will be more powerful, but the vehicles will rely on the same cameras and sensors that Autopilot uses.