Majority of people don't want use of their personal data to result in harm or corporate profit


Clear communication about how people's data is used won't necessarily alleviate their concerns about it, new research from the University of Sheffield has found.

The "Living With Data" project sought to understand people's perceptions of how data about them is collected, analyzed, shared and used ("data uses") in three public sectors: welfare, health and public service media.

The research found that people are concerned and often confused about commercial companies accessing, using and profiting from data initially gathered by public sector organizations like the NHS.

The research shows that clarity about commercial involvement in public sector data uses may reduce confusion, but it won't reduce concerns. In fact, the people who know most about data uses were found to be the most concerned about them.

Of particular concern was the involvement of big tech companies like Amazon and Palantir in the NHS COVID-19 Data Store. Only 5% of people support commercial companies profiting from the use of personal data, and only one in ten are not concerned about commercial companies being involved in providing public services like health or welfare.

Experts say this finding highlights the urgent need for public sector organizations to review their data-driven systems, especially those from which commercial companies can profit.

Professor of Digital Society Helen Kennedy, from the University of Sheffield, said, "One way to do this is for public sector data practitioners to consider alternative ways of delivering data services. This won't be easy due to the global monopoly on the provision of these services, but it's not impossible. Changes to the data ecosystem could give the public more confidence in the use of their personal data that policymakers and users are keen to see."

Different demographic groups have differing concerns about what their data is used for, demonstrating that demographic characteristics also play a role in shaping people's attitudes to data uses.

Disabled people were found to be more positive about the sharing of health data than people who did not have a disability, and white people trusted the police's data uses more than Black, Asian and other racially minoritized people.

The research also found differences in how much people trusted their GP compared with other health organizations, and that LGBTQ+ people trusted health organizations less than heterosexual cisgender respondents.

However, despite these differences, there were also similarities. The research found that people from different groups were aware that data uses can reinforce inequalities, and they don't want data uses to have negative consequences for people from disadvantaged or minority groups. For example, there is concern that people who don't have access to the relevant technology in their homes are excluded from using data-driven systems. They want data-driven systems to be inclusive "for all communities," as one participant in the research commented.

Professor Kennedy added, "Data policymakers and data practitioners need to acknowledge that there is widespread concern about the potentially discriminatory impacts of different data-driven systems. Then they need to address this problem. The way society uses data needs to change so it can eliminate harms and its use is in the public or social interest. Sometimes, in order to do these things, specific data uses need to stop, such as those that deepen inequalities. Regardless of how well data uses are currently communicated, the public will continue to be concerned if these changes are not made."

The report recommends that change should be driven by two goals: eliminating harms to people from disadvantaged and minority groups, and ensuring data uses are in the public or social interest.

Citation: Majority of people don't want use of their personal data to result in harm or corporate profit (2022, October 24) retrieved 15 July 2024 from
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
