Biases in algorithms hurt those looking for information on health

The Health Information National Trends Survey reports that 75% of Americans go to the internet first when looking for information about health or medical topics. YouTube is one of the most popular online platforms, with billions of views every day, and has emerged as a significant source of health information.

Several public health agencies, such as state health departments, have invested resources in YouTube as a channel for health communication. Patients with chronic health conditions especially rely on social media, including YouTube videos, to learn more about how to manage their conditions.

But recommendations on such sites could exacerbate preexisting disparities in health.

A significant fraction of the U.S. population is estimated to have limited health literacy, meaning a limited capacity to obtain, process and understand basic health information, such as the ability to read and comprehend prescription bottles, appointment slips or discharge instructions from health clinics.

Studies of health literacy, such as the National Assessment of Adult Literacy conducted in 2003, estimated that only 12% of adults had proficient health literacy skills. This has been corroborated in subsequent studies.

I'm a professor of information systems, and my own research has examined how platforms such as YouTube widen such health literacy disparities by steering users toward questionable content.

On YouTube

After extracting thousands of videos purporting to be about diabetes, I verified whether the information they presented conforms to valid medical guidelines.

I found that the most popular and engaging videos are significantly less likely to have medically valid information.

Users typically encounter videos on health conditions through keyword searches on YouTube. Among the top-ranked results, YouTube provides links to authenticated medical information, and several of these videos are produced by reputable health organizations.

Recently, YouTube has adjusted how search results are displayed, allowing results to be ranked by "relevance" and providing links to verified medical information.

While videos from sources like the CDC might be the most informative, they are not always the most popular.

However, when I recruited physicians to watch the videos and rate whether they would be considered valid and understandable from a patient education perspective, they rated YouTube's recommendations poorly.

I found that the most popular videos are the ones that tend to have easily understandable information but are not always medically valid. A study on the most popular videos on COVID-19 likewise found that a quarter of videos did not contain medically valid information.

The health literacy divide

This happens because the recommendation algorithms underlying these platforms are biased toward engagement and popularity.

Based on how digital platforms respond to search queries, a user with greater health literacy is more likely to find usable medical advice from a reputable health care provider, such as the Mayo Clinic. The same algorithm will steer a less health-literate user toward fake cures or misleading medical advice.
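To make this concrete, here is a minimal Python sketch of an engagement-only ranker. Everything in it is hypothetical: the video attributes, scoring weights and the two example videos are illustrative assumptions rather than YouTube's actual system. The point is only that scoring on views, likes and comments alone can place an engaging but medically invalid video above a validated one.

```python
# Illustrative sketch only: a toy engagement-based ranker.
# The weights and video attributes are hypothetical and do not
# reflect any real platform's recommendation system.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int             # popularity signal
    likes: int              # engagement signal
    comments: int           # engagement signal
    medically_valid: bool   # label a clinician reviewer might assign

def engagement_score(v: Video) -> float:
    """Score a video purely on popularity and engagement signals."""
    return 0.5 * v.views + 3.0 * v.likes + 5.0 * v.comments

videos = [
    Video("Miracle cure for diabetes!", views=900_000, likes=40_000,
          comments=8_000, medically_valid=False),
    Video("Managing type 2 diabetes (clinic guideline)", views=60_000,
          likes=2_000, comments=300, medically_valid=True),
]

# Ranking by engagement alone puts the questionable video first,
# regardless of what a clinician would say about its content.
for v in sorted(videos, key=engagement_score, reverse=True):
    print(f"{v.title}  valid={v.medically_valid}")
```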

This could be especially harmful for minority groups. Studies of health literacy in the United States have found that limited health literacy disproportionately affects minorities.

We do not have enough studies on the state of health literacy among minority populations, especially in urban areas. That makes it challenging to design health communication aimed at minorities and interventions to improve the use of existing health care resources.

There can also be cultural barriers regarding health care in minority populations that exacerbate the literacy barriers. Insufficient education and lack of self-management of chronic care have also been highlighted as challenges for minorities.

Algorithmic biases

Correcting algorithmic biases and providing better information to users of technology platforms would go a long way in promoting equity.

For example, a pioneering study by the Gender Shades project examined disparities in identifying gender and skin type across different companies that provide commercial facial recognition software. It concluded that companies were able to make progress in reducing these disparities once issues were pointed out.

According to some estimates, Google receives over a billion health questions every day. People with low health literacy are especially at risk of encountering medically unsubstantiated information, such as popular myths or active conspiracy theories that are not based on scientific evidence.

The World Economic Forum has dubbed health-related misinformation an "infodemic." Because anyone can engage on digital platforms, these platforms are also vulnerable to misinformation, which accentuates disparities in health literacy, as my own work shows.

Social media and search companies have partnered with health organizations such as the Mayo Clinic to provide validated information and reduce the spread of misinformation. To make information on YouTube more equitable, those who design recommendation algorithms would have to incorporate feedback from clinicians and patients as well as end users.
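As a hedged sketch of what folding in that feedback might look like, the short Python example below blends a hypothetical clinician-assigned validity rating into the ranking score. The field names, ratings and the 0.7/0.3 weighting are assumptions for illustration, not a documented platform method.

```python
# Hedged sketch: combining engagement signals with a clinician-assigned
# validity rating when ranking health videos. All fields, ratings and
# weights are hypothetical; this is not any platform's documented method.

videos = [
    {"title": "Miracle cure for diabetes!", "engagement": 0.95, "validity": 0.1},
    {"title": "Managing type 2 diabetes (clinic guideline)", "engagement": 0.30, "validity": 0.9},
]

def blended_score(video, validity_weight=0.7):
    """Reward clinician-rated validity and down-weight raw engagement.

    'engagement' and 'validity' are assumed to be pre-normalized to [0, 1];
    the 0.7/0.3 split is an arbitrary illustrative choice.
    """
    return (validity_weight * video["validity"]
            + (1 - validity_weight) * video["engagement"])

# With validity weighted in, the guideline-based video now ranks first.
for v in sorted(videos, key=blended_score, reverse=True):
    print(f'{v["title"]}: {blended_score(v):.2f}')
```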

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.
