Uber and Lyft overcharge riders going to and from disadvantaged areas

Thinking of calling a rideshare service? Uber and Lyft passengers going to and from low-income and non-white neighborhoods may pay higher prices, researchers at the George Washington University recently found.

Aylin Caliskan, an assistant professor of computer science in the School of Engineering and Applied Science, and doctoral student Akshat Pandey analyzed a public data set from the city of Chicago containing the times, pickup locations and destinations of more than 100 million rideshare trips. By comparing that information with data on Chicago's demographic makeup, the team discovered that riders were charged more per mile when they were traveling to or from neighborhoods with high percentages of nonwhite or low-income residents.

"It means people in these neighborhoods that are already disadvantaged are being further disadvantaged by having to pay more for their rides," Dr. Caliskan said.

The culprit isn't conscious prejudice on the part of individual drivers or the companies themselves, Dr. Caliskan said. Rather, it's societal bias baked into the decision-making that drives rideshare pricing.

Uber and Lyft use proprietary machine learning algorithms that set the price of each ride individually, a practice called dynamic pricing. Prices vary with time, location, traffic conditions and other factors the companies do not disclose. No two riders, even requesting a ride milliseconds apart in the same location, are likely to be charged exactly the same price.
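Because the companies' actual models are proprietary, any concrete example is necessarily speculative. The toy Python sketch below, with invented parameters (base_fare, per_mile, demand_ratio), only illustrates the general shape of dynamic pricing: a metered fare scaled by a demand-driven surge multiplier.

# Hypothetical illustration only: the real Uber/Lyft pricing models are
# proprietary. This toy function shows the general shape of dynamic
# pricing: a base fare plus per-mile and per-minute terms, scaled by demand.

def toy_dynamic_price(miles: float, minutes: float,
                      demand_ratio: float,
                      base_fare: float = 2.50,
                      per_mile: float = 1.20,
                      per_minute: float = 0.30) -> float:
    """Estimate a fare from trip length, duration, and local demand.

    demand_ratio is open ride requests divided by available drivers;
    values above 1.0 trigger a surge multiplier.
    """
    fare = base_fare + per_mile * miles + per_minute * minutes
    surge = max(1.0, demand_ratio)  # surge kicks in when demand > supply
    return round(fare * surge, 2)

# Two requests moments apart can already diverge if local demand shifts:
print(toy_dynamic_price(5.0, 14.0, demand_ratio=0.9))  # 12.70, no surge
print(toy_dynamic_price(5.0, 14.0, demand_ratio=1.4))  # 17.78, 1.4x surge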

In general, Dr. Caliskan found, speed and demand caused the biggest fluctuations in price—hence "surge pricing," the bane of the late-to-work commuter. But the demographic makeup of a neighborhood also had a statistically significant impact.
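To make "statistically significant impact" concrete: one standard way to quantify such a gap is an effect size like Cohen's d, the difference in group means divided by the pooled standard deviation. The Python sketch below runs that comparison on a tiny synthetic stand-in for the trip data (the columns fare, trip_miles and pct_nonwhite are invented here); the paper's actual iterative effect-size method is more involved.

# Sketch of the kind of group comparison the study describes, using
# pandas on synthetic stand-in data. Column names and values are
# invented for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1_000
trips = pd.DataFrame({
    "fare": rng.uniform(5, 30, n),
    "trip_miles": rng.uniform(1, 10, n),
    "pct_nonwhite": rng.uniform(0, 1, n),  # pickup-area demographic share
})
trips["fare_per_mile"] = trips["fare"] / trips["trip_miles"]

# Compare fare per mile for pickups in majority-nonwhite vs. other areas.
high = trips.loc[trips["pct_nonwhite"] > 0.5, "fare_per_mile"]
low = trips.loc[trips["pct_nonwhite"] <= 0.5, "fare_per_mile"]

# Cohen's d: standardized difference between the two group means.
pooled_sd = np.sqrt((high.var() + low.var()) / 2)
print(f"Cohen's d: {(high.mean() - low.mean()) / pooled_sd:.4f}")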

"Given a type of data, AI algorithms learn the patterns in that data—so if you give an algorithm data from the social domain, they end up learning the patterns of society," Dr. Caliskan said. "Since our society is biased, these models end up learning these biases as well. Then, when they're used in the to make decisions about human beings, they use that biased data to make decisions that not only perpetuates societal biases but even amplifies them."

Dr. Caliskan has been interested throughout her career in the relatively new science of algorithmic bias. She has studied the way language processing algorithms adopt—and then amplify—societal prejudice, for instance by associating women more with home and the arts and men with careers and the sciences. It's a hard problem to solve, she said, because AI tools have to be trained on existing content created by humans, and that content necessarily reflects the conscious and unconscious bias of its authors. Because the AI can't distinguish between biased and nonbiased linguistic input, it reflects those biases in its output.
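That earlier line of work measured bias as differences in similarity between word vectors. The Python sketch below illustrates the core idea with toy two-dimensional "embeddings" invented for this example; real association tests such as WEAT use pretrained vectors like GloVe.

# Minimal sketch of a word-association bias test: compare how close a
# word's vector sits to "career" words vs. "home" words. The vectors
# here are toy values for illustration, not real embeddings.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def association(word, attrs_a, attrs_b, vectors) -> float:
    """Mean similarity to attribute set A minus mean similarity to set B."""
    v = vectors[word]
    return (np.mean([cosine(v, vectors[a]) for a in attrs_a])
            - np.mean([cosine(v, vectors[b]) for b in attrs_b]))

vectors = {
    "he":     np.array([0.9, 0.1]),
    "she":    np.array([0.2, 0.8]),
    "career": np.array([1.0, 0.0]),
    "home":   np.array([0.0, 1.0]),
}
print(association("he", ["career"], ["home"], vectors))   # positive: leans career
print(association("she", ["career"], ["home"], vectors))  # negative: leans home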

Correcting those biases in an imperfect society is a complicated problem, Dr. Caliskan said. One possibility is closer regulation of decision-making algorithms, which are largely opaque about the data they collect and how they use it, even as they have ever larger effects on human life: not just in the price a person pays for a Lyft but in her likelihood of getting a house, being admitted to college or having a dangerous cancer detected. Studying datasets like the one released by Chicago can help track an AI's biases, which might then be corrected through programming changes or human ethical checkpoints.

"We need to assume we are dealing with biases and come up with methods and tools for mitigation," Dr. Caliskan said. "But it's quite complex, and it seems like this will be an increasingly important question."

More information: Akshat Pandey and Aylin Caliskan, Iterative Effect-Size Bias in Ridehailing: Measuring Social Bias in Dynamic Pricing of 100 Million Rides, arXiv:2006.04599 [cs.CY]. arxiv.org/pdf/2006.04599.pdf

Citation: Uber and Lyft overcharge riders going to and from disadvantaged areas (2020, July 8) retrieved 18 April 2024 from https://techxplore.com/news/2020-07-uber-lyft-overcharge-riders-disadvantaged.html
