
Opinion: Tesla recalls over 2 million vehicles, but it still needs to address confusing marketing

Credit: CC0 Public Domain

On Dec. 12, the U.S. Department of Transportation issued a recall regarding Autosteer, a feature included in Tesla's semi-autonomous suite Autopilot, because "there may be an increased risk of a collision."

The recall, which affects over two million vehicles in the United States, is a watershed moment in modern automotive history, as it affects nearly every Tesla on the road in the U.S.

Transport Canada extended the recall to 193,000 Tesla vehicles in Canada.

Tesla says only vehicles in the U.S. and Canada are affected by the recall.

Unlike technologies that can be defined as fully autonomous (like elevators, where a user steps in and pushes a button), Autosteer is not an autonomous system, despite what drivers may think.

A 2018 study found that 40 percent of drivers believed Tesla vehicles are capable of being fully self-driving. A similar study concluded that participants "rated [Autopilot] as entailing less responsibility for the human for steering than 'high automation,' and it was not different from 'autonomous' or 'self-driving'."

Instead, Tesla Autopilot falls into the category of Level 2, or semi-autonomous, systems. Such systems can handle steering and acceleration, but the human driver must stay vigilant at all times.

Confusing communication

In human factors research, believing that a system can do something it can't is referred to as mode confusion. Mode confusion not only misleads the user but also has direct safety implications, as in the 1992 Air Inter Flight 148 plane crash in France. That crash was the direct result of the pilots operating the aircraft's automation in a different mode than the one they believed it was in.

PBS covers the safety issues that led to the December recall of Tesla vehicles in the United States and Canada.

Safety researchers have sounded the alarm about risks inherent to semi-autonomous systems. In fully manual and fully autonomous modes, it is clear who's responsible for driving: the human and the robot driver, respectively.

Semi-autonomous systems represent a gray area. The human driver believes the system is responsible for driving, but as lawyers representing Tesla have already successfully argued, it is not.

A second important factor is the role of misleading information. The auto industry as a whole has, for years, tiptoed around the actual capabilities of autonomous vehicle technology. In 2016, Mercedes-Benz pulled a TV commercial off the air after criticism that it portrayed unrealistic self-driving capabilities.

More recently, Ashok Elluswamy, director of Autopilot software at Tesla, said the 2016 video promoting its self-driving technology was faked.

False sense of security

Thinking that a system is fully autonomous creates a false sense of security that drivers may act on by losing vigilance or disengaging from the task of supervising the system's functioning. Investigations on prior accidents involving Tesla Autopilot showed that drivers' overreliance on the semi-autonomous system indeed contributed to some reported crashes.

The recall is a logical, albeit long-awaited, effort by transportation agencies to regulate a problem that researchers have attempted to draw attention to for years.

In her 2016 study, Mica Endsley, a pioneer in the research on human interaction with automation, highlighted some potential safety risks of these systems. A more recent study published by my research group also shows the dangers that operating semi-autonomous systems pose to drivers' attention.

With the recall, Tesla will be releasing over-the-air software updates that are meant to "further encourage the driver to adhere to their continuous supervisory responsibility whenever Autosteer is engaged." These may include additional "visual alerts" and other additions to the system to help drivers stay vigilant while Autosteer is engaged.

In all, although this may be the first time regulators strike a direct, concrete blow at Tesla and its marketing, it likely won't be the last.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

