Focus on human factors in designing systems

A new study has found one of the challenges in designing systems that involve people interacting with technology is to tackle the human trait of overconfidence.

The study, published in IEEE Control Systems Magazine, takes a novel multidisciplinary approach to studying cyberphysical human systems, considering the relationship between people and technology from the perspectives of both control systems engineering and behavioral economics.

The research by QUT's Cyberphysical Systems Professor Daniel Quevedo, and Marius Protte and Professor René Fahr, both from Paderborn University in Germany, looks at the impact that human decision-making can have on an engineered system.

Professor Quevedo said control systems engineers generally did not examine the interaction between people and the systems they were part of, or how people's choices could affect those systems.

To explain how unpredictable human decisions can affect a controlled system, Professor Quevedo gave the example of planning a drive with a navigation system that offers alternative routes.

"I make my own based on the information and drive. And that affects the whole traffic system," Professor Quevedo said.

"There is this problem about what information does the car system give me so that I behave in one way or another.

"That's just for one car. With traffic, there are many cars. What information should we get so that we behave in one way or another? How do our actions work?"

While the system's designer expects drivers to take the fastest route, they might choose a different one. If enough people decided to take an alternative route, the system's traffic-flow predictions would need to be reconsidered.
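
As a rough illustration of that mismatch (a minimal sketch with made-up numbers, not anything from the study), the following Python snippet compares a planner's forecast, which assumes everyone follows the recommended route, with what happens when a fraction of drivers deviate:

```python
import random

# Toy sketch (not from the study): a planner assumes every driver follows
# its recommended fastest route, but a fraction deviate. Travel time grows
# with congestion, so the planner's forecast drifts from reality.
# The deviation rate and the linear delay model are illustrative assumptions.

def travel_time(cars_on_route, free_flow=10.0, delay_per_car=0.05):
    """Simple linear congestion model: minutes of travel for a given load."""
    return free_flow + delay_per_car * cars_on_route

def simulate(n_drivers=1000, deviation_rate=0.3, seed=42):
    random.seed(seed)
    on_fast_route = 0
    for _ in range(n_drivers):
        # The planner assumes everyone takes the recommended route,
        # but each driver makes their own decision.
        if random.random() >= deviation_rate:
            on_fast_route += 1
    predicted = travel_time(n_drivers)      # forecast: all drivers on the fast route
    actual = travel_time(on_fast_route)     # reality: deviators eased the congestion
    print(f"predicted fast-route time: {predicted:.1f} min")
    print(f"actual fast-route time:    {actual:.1f} min")

simulate()
```

Even this crude model shows the forecast breaking down once enough drivers make their own choices, which is the kind of human-induced error the study points to.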

Professor Quevedo said successful design of 'human-in-the-loop' control systems required an understanding of how humans behaved.

He said an interesting issue was that people, unlike machines, did not necessarily improve their performance through immediate and frequent feedback.

"Given the immense complexity of human behavior, there's no clear way to create appropriate models for human decision making," Professor Quevedo said.

In the study, the researchers examined how people behaved when given the task of piloting a drone and found that frequent feedback on the quality of their piloting decisions may lead to poorer performance.

"While more information is commonly considered to result in better decisions, human susceptibility for perceptual biases in response to high information supply must be considered," Professor Quevedo said.

"Otherwise, individuals might take unnecessarily high risks, rendering thoughtfully designed policies inefficient.

The study highlights that people often overestimate their ability at a task, such as believing they are better-than-average drivers, or succumb to the "hot hand fallacy" from basketball, which links a player's likelihood of scoring in the future to their past shooting successes.

"If you win, you think you're doing really well, you fall in love with yourself," Professor Quevedo said.

"As a control engineer, I always tended to assume that cooperative people somehow just do what they're told because they're part of a system.

"We need to incorporate a model of human behavior, but human behavior is a difficult thing.

"You don't want to overload people with information because they can't process all of it. But it's much more refined than that."

This multidisciplinary study of human behavior, through behavioral economics and control systems engineering, is a starting point for future research.

"Putting the worlds together is the first step for us. Now we want to continue," Professor Quevedo said.

"The current work exposes the human as an under observed source of errors in human-in-the-loop control systems.

"Future areas of research need to be how to design mechanisms on when to pass on information and how to pass on to human decision makers."

More information: M. Protte, R. Fahr and D. E. Quevedo, "Behavioral Economics for Human-in-the-Loop Control Systems Design: Overconfidence and the Hot Hand Fallacy," IEEE Control Systems Magazine, vol. 40, no. 6, pp. 57-76, Dec. 2020. DOI: 10.1109/MCS.2020.3019723
