Can software cure Uber’s racial bias when humans are to blame?


In 2017, racial discrimination still overshadows our country and weighs us down as a nation. But according to one expert, there may be a cure in combining education with data.

As our culture evolves alongside technical advancements, a different kind of bias rears its ugly head in online transactions driving the sharing economy. Recent research into Uber Technologies Inc. and Lyft Inc. exposed the inequity a person can experience with online transportation network services based on race.

The study, conducted in Seattle and Boston, revealed that each company’s drivers demonstrated a bias against black customers. “Both companies reached out to us quickly to do follow-up studies. These companies want to do better,” said Chris Knittel (pictured, right), professor of applied economics at Massachusetts Institute of Technology. Both Uber and Lyft issued statements saying they do not tolerate discrimination in any form.

During an interview at the MIT Expert Series 2017 held in Cambridge, Massachusetts, Knittel sat down with Rebecca Knight (left), host of theCUBE, SiliconANGLE Media Inc.’s mobile video studio, to talk about his working paper titled “Racial and Gender Discrimination in Transportation Network Companies.”

The state of discrimination: 1,500 trips

Both studies followed defined parameters: The control group consisted of eight research assistants, two African-American women, two African-American men, two white women and two white men, who made approximately 1,500 trips using the apps.

Moreover, all research assistants traveled the same point-A-to-point-B routes. The conditions tracked were how long it took a driver to “accept” the transaction and how long it took the driver to pick up the passenger.

It’s important to note that the committee overseeing the research did not allow the researchers to record drivers’ races, out of concern that drivers could face penalties if Uber or Lyft tried to identify them. However, future studies may include this information.

The Seattle study revealed that it took the black researchers 20 percent longer than their white counterparts to have their rides accepted. When it came to wait times, black researchers waited up to 35 percent longer than white researchers when using Uber.

As for Lyft, the wait times were about the same due to a difference in the platforms. “The Lyft driver sees the [client’s] name before they accept the transaction; in the case of the Uber driver, they see the name after [accepting] the transaction,” explained Knittel.

The Boston experiment added another element: names that were commonly white-sounding or black-sounding, based on the city’s birth records from the 1970s. Riders with black-sounding names received cancellations at twice the rate of those with white-sounding names. Additionally, males with black-sounding names were canceled by UberX three times more often than those using white-sounding names.

The research highlighted the disparity between the two races, but bias was reported on gender as well. In Boston, women were taken on longer rides from point A to point B. Knittel attributed the extra time partly to financial motives, with drivers trying to extract a higher fare, and partly to social ones, with drivers trying to score a date.

Mitigating discrimination through platform design

During the interview, Eva Millona, executive director of the Massachusetts Immigrant and Refugee Advocacy Coalition (also known as the MIRA Coalition), appeared in a video clip to discuss the same inequities that immigrants face.

“Given the impressive sample of the research, [it] leads to [the conclusion] that such discrimination is still out there,” Millona shared.

She believes it’s going to take political leadership to set the right tone along with all sectors making an effort to create a better society for everyone. “Uber and Lyft have an opportunity to provide leadership and come up with a promotion of policies that are welcoming to the newcomers [and] provide education and training,” asserted Millona.

From a technology standpoint, Knittel discussed how autonomous vehicles “would take the human element out of things,” noting that it is the drivers who are deciding to discriminate. “So providing you didn’t write the autonomous vehicle software to discriminate, you would know for sure that car is not going to discriminate,” he said.

There are other ways to tweak the technology. “Getting rid of names could be one way Uber and Lyft could eliminate discrimination,” Knittel remarked. He also suggested that delaying the moment a driver sees the name would add more of a commitment to picking up the passenger.

Knittel believes information will ultimately help eliminate discrimination. By suggesting a campaign that shows drivers the tip rates and bad-ride rates for different ethnicities, he hopes to help drivers internalize the differences among cultures and stop discriminating.

A solution could come from the government in the form of fines, but discrimination is already against the law, so the problem will most likely be resolved by the industry itself. Thanks to the study, Uber and Lyft are incentivized to fix the problem, and steps are already in progress.

The startup community contributing to the sharing economy can also become part of the solution. Knittel offers the industry two pieces of advice: first, be aware that discrimination happens; second, use the findings of the research as a head start in designing platforms that limit discrimination.

As for future studies, Knittel revealed that they are in the works. “I think we can do better for sure, and I would say we need more studies like we just performed to see how widespread [the problem] is. We only used two cities,” he said.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE and theCUBE’s coverage of the MIT Expert Series 2017.

Photo by SiliconANGLE