Commentary: Companies oversell the self-driving capabilities of their cars, with horrific outcomes

Research shows that drivers rely on self-driving features when they believe the road conditions are easy enough for the car to handle. Credit: Shutterstock

In mid-February, Tesla announced the recall of over 350,000 vehicles, more than 20,000 of them in Canada, due to a problem with its "Full Self-Driving Capability" system. The feature was found to potentially cause vehicles to behave unsafely when entering intersections, or to exceed posted speed limits, posing a safety risk.

This is just another instance of vehicles equipped with automated driving technology falling short of safety expectations. In September 2022, a driver on the Queen Elizabeth Way near St. Catharines, Ont., was caught asleep at the wheel of a Tesla. The vehicle appeared to be operating under a semi-automated system with no monitoring from the driver, in direct violation of the requirements for these systems.

Numerous incidents involving Teslas have been reported in recent years, frequently enough that the U.S. National Highway Traffic Safety Administration opened a formal investigation in August 2021.

Early data show that in a 12-month period, 367 crashes were reported involving semi-automated driving systems. Of these, 273 involved Tesla vehicles. While these numbers are far lower than the crash totals for all road vehicles, they at least call these systems' purported safety into question.

Driver misconceptions

Human factors research is a cross-disciplinary field of study that draws from psychology, engineering and kinesiology. Applying this approach to analyzing and understanding the Tesla crashes reveals that there are several issues contributing to these incidents.

For years, the automotive industry has hinted that these systems are more capable than they actually are. For example, referring to them with misleading names, like "autopilot" or "self-driving," may lead drivers to believe a car can drive itself without human intervention when, in fact, it cannot.

Research also shows that, when purchasing a new vehicle, about a quarter of drivers never receive any information from the dealership about the assistance systems available on their vehicle. One study found that, of the customers who did receive such information, only nine percent were able to test drive the systems before taking the car home.

These issues ultimately lead drivers to form incorrect assumptions about the car's capabilities and, in turn, to use these vehicles believing they are more advanced and autonomous than they actually are.


Misconceptions about these systems' capabilities are more likely to lead to unsafe behaviors when drivers believe the road and traffic conditions are uneventful enough for the system to handle.

In an ongoing study that is among the first in Canada, our research shows that some drivers may "tune out" when operating these vehicles on seemingly uneventful roads with long straight stretches with relatively low traffic volumes. This is because drivers may be inclined to believe that these systems are sufficiently advanced to handle such simple driving tasks.

Regulating self-driving systems

Research has pointed to the unintended consequences of these vehicles for years, identifying how human and technological factors interact in semi-automated systems, sometimes with lethal consequences.

In Canada and abroad, governments are now starting to reckon with the sometimes dystopian reality of these vehicles. And they have recognized the need to advance legislative frameworks for the safe deployment and efficient regulation of these systems.

With the Tesla crashes, we are witnessing a push to introduce new technologies for no other reason than their availability, regardless of the impact. We have seen this in other areas, such as aviation.

Since 2014, Elon Musk has promised the arrival of self-driving cars. But they are not here yet. So, with this in mind, what can drivers do?

First, they need to be aware that none of the vehicles on the market today are actually self-driving, regardless of how pricey or advanced they seem. These vehicles still require active supervision from a human driver.

This means that eyes ought to stay on the road, hands must stay on the wheel and, more importantly, attention must be paid to the surrounding traffic and road environment at all times.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: Commentary: Companies oversell the self-driving capabilities of their cars, with horrific outcomes (2023, February 27) retrieved 23 September 2023 from
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
