A vehicle that can safely drive itself has long been a dream. You could check your email or read about Kentucky basketball on the way to work. Long drives to see relatives or go on vacation could happen overnight while you sleep behind the wheel. That dream is not a reality yet, and Tesla’s approach to self-driving shows just how far the technology still has to go.
How Dangerous is Tesla’s Autopilot?

Fatal Injuries Involving Tesla Autopilot
The Washington Post reports 17 fatalities since 2019 involving Tesla’s driver-assistance technology, Autopilot, 11 of them in the past year. Over the same period, there were 736 crashes involving Teslas that were using, or were believed to be using, the feature, many of which caused injuries.
These numbers come from the Post’s analysis of National Highway Traffic Safety Administration data. Crashes have increased significantly as more Teslas are sold and more drivers use Autopilot, underscoring the dangers of relying on Tesla’s self-driving system.
Why is Autopilot Causing Accidents?
Tesla CEO Elon Musk claims that cars using Autopilot are safer than Teslas driven by people and that the technology will eventually make roads safer. Whether total injuries and deaths involving Teslas would actually fall if every driver used the system is unknown.
What’s clear is that Autopilot in cars navigating US roads, encountering all kinds of vehicles and pedestrians in more situations, has some serious flaws. One issue is that new Teslas are no longer equipped with radar, and the company turned off the radar on older models.
Radar, the same technology used to track thunderstorms and airplanes, is used in vehicles to detect whether something is in the way and can trigger the brakes to avoid a collision. Another technology found on some cars, but not Teslas, is lidar, which uses laser pulses to measure the distance between the vehicle and other objects (and people).
Why Don’t Teslas Have Better Safety Technology?
Musk says Tesla needs Autopilot to stand out from other electric vehicles, so it is moving quickly with the technology. According to the Washington Post, he decided two years ago to abandon radar: the company stopped installing radar units in new vehicles and disabled the systems already in existing ones.
At the time, radar units were in short supply, and Musk wanted to cut costs. The cars already had eight cameras, and Musk wanted to rely on software to identify objects in the camera images and react accordingly. Without radar, a Tesla can run into trouble, and potentially crash, when those images are obscured by rain or bright sunlight.
Customer complaints are rising, and a lawsuit against Tesla accuses Musk of overstating Autopilot’s capabilities. Vehicle regulators and government officials are increasingly scrutinizing Autopilot, examining whether Musk over-promised (and under-delivered) on its capabilities to boost sales, as well as its past safety problems.
In interviews with the Post’s reporters, former Tesla employees who worked on Autopilot blamed the problems on the rush to develop the technology, cost-cutting moves like eliminating radar (which other carmakers have kept), and other issues.
Did Tesla Know About Their Safety Issues?
More than 23,000 internal Tesla computer files and documents were leaked to the German publication Handelsblatt. The disclosure, apparently by a Tesla employee, was reported in May. Ars Technica reports that the files include more than three thousand complaints about Autopilot from drivers in the US, Europe, and Asia, dating back to 2015.
Handelsblatt estimates the files contain information on more than a thousand accidents linked to brake problems and thousands of other Autopilot safety complaints. They include:
- A California doctor reports that in 2021 she was starting to turn into a parking lot when her car suddenly accelerated and crashed into a concrete barrier
- A Swiss Tesla owner complained about a dozen incorrect braking incidents. After the car was repaired, his Tesla made another emergency stop for no reason
- One Michigan driver in 2019 experienced his Tesla slamming on its brakes without cause, throwing him forward into his seatbelt, and causing his car to be rear-ended
- Two German reports include a vehicle driving into a freeway median barrier due to unwarranted emergency braking and a driver stating that Autopilot steered his Tesla into oncoming traffic
Is Tesla’s response to these problems to be more open and transparent? Just the opposite. The company responds to owner complaints verbally in telephone calls, not in writing, perhaps to limit potential evidence in future lawsuits and government investigations. According to Handelsblatt, each incident is flagged for “technical review,” and employees regularly make clear that the reports are for “internal use only.”
‘Each entry also contains the note in bold type that information, if at all, may only be passed on “VERBALLY to the customer” … “Do not copy the report below into an email, text message or leave it in a voicemail to the customer,” it continues. Vehicle data should also not be released without permission. If, despite the advice, “a legal involvement cannot be prevented,” this must be recorded.’
Tesla appears to think no news is good news, but its Autopilot is terrible news for Tesla drivers and for everyone on the road with them.
Injured in a Vehicle Accident? Get Help From an Attorney You Can Trust
If you have questions or want legal representation by an experienced attorney after a recent vehicle accident, call The Fleck Firm at (270) 446-7000 for a free consultation. We’ll discuss the accident, your injuries, Kentucky law, and what you should do to proceed so you can be fully informed about your situation and all the hurdles that may be in your way. Insurance companies have lawyers. You should have one too.