Study: Uber and Lyft Charge More in Non-White Areas

REUTERS/ROBERT GALBRAITH

A recent study analyzing the pricing algorithms of ride-sharing services such as Uber and Lyft found that rides to or from predominantly non-white areas cost more per mile, leading to accusations of racial bias.

New Scientist reports that the algorithms used by ride-sharing companies such as Uber and Lyft to set prices for particular areas are being accused of racial bias. An analysis of transport and census data in Chicago by Aylin Caliskan and Akshat Pandey of George Washington University in Washington DC found that ride-sharing companies charge a higher price per mile for a trip if the pick-up point or destination is in a neighborhood with a higher proportion of ethnic minority residents.

Caliskan stated: “Basically, if you’re going to a neighborhood where there’s a large African-American population, you’re going to pay a higher fare price for your ride.” Ride-sharing services use dynamic fares, meaning that the price of a trip is calculated from a number of factors, including the length of the trip and local demand.

The researchers analyzed data from over 100 million trips taken in Chicago through ride-sharing apps between November 2018 and December 2019. Each trip record included information such as pick-up and drop-off locations, duration, cost, and whether the ride was an individual or shared trip. 68 million of the trips were taken by individual riders during this period.

The trip data was then compared against information from the Census Bureau’s American Community Survey, which provides aggregate statistics about neighborhoods such as population, ethnicity, and education levels. The analysis found that prices per mile were higher on average if the trip’s pick-up or drop-off location was in a neighborhood with a smaller proportion of white residents, a lower median house price, or lower average educational attainment.
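The comparison described above can be sketched in a few lines of code. The following is an illustrative example only, not the researchers’ actual code: all field names, census-tract identifiers, and fare figures are hypothetical, and the grouping (majority-white versus majority-non-white pick-up tracts) is a simplification of the study’s demographic analysis.

```python
# Hypothetical sketch of the study's core comparison: join trip records
# with census-tract demographics, then compare average price per mile.
# Field names and all figures below are invented for illustration.

def price_per_mile(trip):
    """Fare divided by distance for a single trip record."""
    return trip["fare"] / trip["miles"]

def average_price_per_mile(trips, tract_white_share, threshold=0.5):
    """Split trips by whether the pick-up tract's white population share
    exceeds `threshold`, and return each group's mean price per mile."""
    groups = {"majority_white": [], "majority_nonwhite": []}
    for trip in trips:
        share = tract_white_share[trip["pickup_tract"]]
        key = "majority_white" if share > threshold else "majority_nonwhite"
        groups[key].append(price_per_mile(trip))
    return {k: sum(v) / len(v) for k, v in groups.items() if v}

# Toy data: two hypothetical census tracts, four trips.
tract_white_share = {"17031010100": 0.82, "17031020200": 0.18}
trips = [
    {"pickup_tract": "17031010100", "fare": 12.0, "miles": 6.0},
    {"pickup_tract": "17031010100", "fare": 9.0, "miles": 4.5},
    {"pickup_tract": "17031020200", "fare": 14.0, "miles": 5.6},
    {"pickup_tract": "17031020200", "fare": 10.0, "miles": 4.0},
]

print(average_price_per_mile(trips, tract_white_share))
```

In this toy data the majority-non-white tract shows a higher average price per mile, which is the shape of the disparity the study reports; the real analysis, of course, drew on tens of millions of trips and controlled demographic variables from the American Community Survey.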

Os Keyes at the University of Washington in Seattle commented: “Even in the absence of identity being explicitly considered in how an algorithm’s results are decided, the structural and historical nature of racism and the way that it informs geography, opportunity and life chances mean that racial disparities can still appear.”

Keyes continued: “Chicago, the site of this analysis, is a case in point: as a result of – amongst other things – redlining practices, it remains highly geographically segregated… This should cause us to further question studies of ‘fairness’ and ‘bias’ in algorithms which promise to end algorithmic racism by simply not mentioning race.”

A Lyft spokesperson commented on the study, stating:

We recognize that systemic biases are deeply rooted in society, and appreciate studies like this that look to understand where technology can unintentionally discriminate. There are many factors that go into pricing – time of day, trip purposes, and more – and it doesn’t appear that this study takes these into account. We are eager to review the full results when they are published to help us continue to prioritize equity in our technology.

Noel Sharkey at the University of Sheffield, U.K., commented: “This study shows how algorithmic bias by postcode and race can creep into even the most unexpected places. It is yet another example in a long list of how ethnicity and race bias has found a new home in computer software. There is no excuse for automation biases and such systems should be shut down until such time as they can demonstrate fairness and equality.”

Read more at New Scientist here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address lucasnolan@protonmail.com
