By Donna Lu
The algorithms that ride-hailing companies, such as Uber and Lyft, use to determine fares appear to create a racial bias.
By analysing transport and census data in Chicago, Aylin Caliskan and Akshat Pandey at the George Washington University in Washington DC have found that ride-hailing companies charge a higher price per mile for a trip if the pick-up point or destination is in a neighbourhood with a higher proportion of ethnic minority residents than for trips in predominantly white neighbourhoods.
“Basically, if you’re going to a neighbourhood where there’s a large African-American population, you’re going to pay a higher fare price for your ride,” says Caliskan.
Unlike traditional taxis, ride-hailing services use dynamic fares, calculated from factors including the length of the trip and local demand – although it is unclear what else these algorithms take into account, because ride-hailing companies don't make all of their data available.
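As a rough illustration only – not Uber's or Lyft's actual formula, which isn't public – a dynamic fare of this kind might combine a base charge, per-mile and per-minute rates and a demand-driven surge multiplier. Every rate below is invented for the example.

```python
# Toy dynamic-pricing sketch. The structure (base + distance + time,
# scaled by surge) reflects how ride-hailing fares are commonly
# described; the specific rates are hypothetical.
def dynamic_fare(miles: float, minutes: float, surge: float = 1.0,
                 base: float = 2.50, per_mile: float = 1.75,
                 per_minute: float = 0.30) -> float:
    """Estimate a fare from trip distance, duration and local demand (surge)."""
    return (base + per_mile * miles + per_minute * minutes) * surge

# A 5-mile, 15-minute trip during a period of 1.4x surge pricing
print(dynamic_fare(5, 15, surge=1.4))
```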
The researchers analysed data from more than 100 million trips taken in Chicago through ride-hailing apps between November 2018 and December 2019. Each trip record included the pick-up and drop-off locations, duration, cost and whether the ride was individual or shared. The data includes no demographic details, such as the ethnicity of the rider.
In that period, 68 million trips were made by individual riders, and the majority of these used Uber.
The duo compared the trip data against information from the US Census Bureau’s American Community Survey, which provides aggregate statistics about neighbourhoods, including population, ethnicity breakdown, education levels and median house prices.
They found that prices per mile were higher on average if the trip pick-up or drop-off location was in a neighbourhood with a lower proportion of white residents, a lower median house price, or lower average educational attainment.
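To show the shape of such an analysis, here is a minimal sketch in Python using pandas: join per-trip fares to census-tract demographics, then correlate. The file and column names (chicago_trips.csv, pickup_tract, pct_white and so on) are hypothetical stand-ins; the study's actual pipeline hasn't been released.

```python
import pandas as pd

trips = pd.read_csv("chicago_trips.csv")   # one row per ride (hypothetical file)
tracts = pd.read_csv("acs_tracts.csv")     # American Community Survey aggregates

# Fare paid per mile travelled – the quantity the study compares
trips["price_per_mile"] = trips["fare"] / trips["trip_miles"]

# Attach neighbourhood demographics via the pick-up census tract
merged = trips.merge(tracts, left_on="pickup_tract", right_on="tract_id")

# Correlation between per-mile price and a tract's share of white residents;
# the study reports higher per-mile prices where this share is lower
print(merged["price_per_mile"].corr(merged["pct_white"]))
```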
“Even in the absence of identity being explicitly considered in how an algorithm’s results are decided, the structural and historical nature of racism and the way that it informs geography, opportunity and life chances mean that racial disparities can still appear,” says Os Keyes at the University of Washington in Seattle.
“Chicago, the site of this analysis, is a case in point: as a result of – amongst other things – redlining practices, it remains highly geographically segregated,” says Keyes. Redlining is a practice in which mortgage lenders refuse to offer loans in certain neighbourhoods.
“This should cause us to further question studies of ‘fairness’ and ‘bias’ in algorithms that promise to end algorithmic racism by simply not mentioning race,” says Keyes.
The researchers found no statistical link suggesting that neighbourhoods with higher proportions of ethnic minorities had higher demand for rides – higher demand being one factor that could otherwise have explained the elevated fares.
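In sketch form, a check of that kind could look like the following, reusing the hypothetical merged frame from the earlier example. It treats trips per census tract as a crude demand proxy – an assumption of this illustration, not the study's stated method.

```python
# Does ride demand (trips per tract) track a tract's demographics?
# Column names remain hypothetical, carried over from the sketch above.
demand = merged.groupby("tract_id").agg(
    n_trips=("fare", "size"),        # number of trips starting in the tract
    pct_white=("pct_white", "first"),  # tract demographics (constant per tract)
)

# A near-zero correlation here would echo the study's finding that
# demand does not explain the fare differences
print(demand["n_trips"].corr(demand["pct_white"]))
```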
“We recognise that systemic biases are deeply rooted in society, and appreciate studies like this that look to understand where technology can unintentionally discriminate,” said a Lyft spokesperson. “There are many factors that go into pricing – time of day, trip purposes, and more – and it doesn’t appear that this study takes these into account. We are eager to review the full results when they are published to help us continue to prioritise equity in our technology.”
Uber did not respond to a request for comment before publication.
Under US law, it is illegal to discriminate against an individual on the basis of protected attributes, including race. The study’s findings are problematic, says Caliskan. “Even though these algorithms are supposed to be fair and they are not using protected attributes, they seem to have a significant impact on these neighbourhoods.”
“This study shows how algorithmic bias by postcode and race can creep into even the most unexpected places,” says Noel Sharkey at the University of Sheffield, UK. “It is yet another example in a long list of how ethnicity and race bias has found a new home in computer software. There is no excuse for automation biases and such systems should be shut down until such time as they can demonstrate fairness and equality,” he adds.
Reference: arxiv.org/abs/2006.04599