How dating app algorithms contribute to racial bias

Nikki Chapman remembers finding her now-husband through online dating website Plenty of Fish in 2008. Kay Chapman had sent her a message.

"I looked at his profile and thought he was really cute," Nikki Chapman said. "He asked me who my favourite Power Ranger was, and that is what made me respond to him. I thought that was kind of cool – it was something that was near and dear to me from when I was a kid." The Ilinois couple now have two kids of their own: son Liam is 7, and daughter Abie is 1.

Dating app algorithms can contribute to racial bias. Credit: Shutterstock

Looking back, Chapman recalls the dating site asking about race, which she doesn't think should matter when it comes to compatibility. It didn't for her; she is white, and Kay is African-American.

"Somebody has to be open-minded in order to accept somebody into their lives, and unfortunately not everybody is," she said.

Researchers at Cornell University looked to decode dating app bias in their recent paper, "Debiasing Desire: Addressing Bias and Discrimination on Intimate Platforms".

In it, they argue dating apps that let users filter their searches by race – or rely on algorithms that pair up people of the same race – reinforce racial divisions and biases. They said existing algorithms can be tweaked in a way that makes race a less important factor and helps users branch out from what they typically look for.

"There's a lot of evidence that says people don't actually know what they want as much as they think they do, and that intimate preferences are really dynamic, and they can be changed by all types of factors, including how people are presented to you on a dating site," said Jessie Taft, a research coordinator at Cornell Tech. "There's a lot of potential there for more imagination, introducing more serendipity and designing these platforms in a way that encourages exploration rather than just sort of encouraging people to do what they would normally already do."

Taft and his team downloaded the 25 most popular dating apps (based on number of iOS installs as of 2017). The list included apps like OKCupid, Grindr, Tinder and Coffee Meets Bagel. They looked at the apps' terms of service, their sorting and filtering features, and their matching algorithms – all to see how design and functionality decisions could affect bias against people of marginalised groups.

They found that matching algorithms are often programmed in ways that define a "good match" based on previous "good matches". In other words, if a user had several good Caucasian matches in the past, the algorithm is more likely to suggest Caucasian people as "good matches" in the future.

Algorithms also often take data from past users to make decisions about future users – in a sense, making the same decision over and over again. Taft argues that's harmful because it entrenches those norms. If past users made discriminatory decisions, the algorithm will continue on the same, biased trajectory.
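The feedback loop the researchers describe can be illustrated with a minimal, hypothetical sketch (the grouping field, function names and data here are invented for illustration, not taken from any real dating platform): a recommender that scores candidates by their resemblance to past "good matches" will keep reproducing whatever pattern those past matches already contain.

```python
from collections import Counter

def rank_candidates(past_good_matches, candidates):
    """Rank candidates by how often their group appeared among past
    'good matches' - a crude stand-in for a learned preference model."""
    group_counts = Counter(m["group"] for m in past_good_matches)
    return sorted(candidates,
                  key=lambda c: group_counts[c["group"]],
                  reverse=True)

# If the match history skews toward group "A"...
history = [{"group": "A"}, {"group": "A"}, {"group": "B"}]
pool = [{"id": 1, "group": "B"}, {"id": 2, "group": "A"}]

ranked = rank_candidates(history, pool)
# ...candidates from group "A" are ranked first, and each accepted
# match from that group skews the history further for the next user.
```

The point of the sketch is that nothing in the code mentions race explicitly; the bias enters entirely through the historical data the ranking is trained on, which is why Taft argues the decisions of past users get entrenched as norms.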

"When somebody gets to filter out a whole class of people because they happen to check the box that says (they're) some race, that completely eliminates that you even see them as potential matches. You just see them as a hindrance to be filtered out, and we want to make sure that everybody gets seen as a person rather than as an obstacle," Taft said.

"There's more design theory research that says we can use design to have pro-social outcomes that make people's lives better than just sort of letting the status quo stand as it is."

Other data shows that racial disparities exist in online dating. A 2014 study by dating website OKCupid found that black women received the fewest messages of all of its users. According to Christian Rudder, OKCupid co-founder, Asian men had a similar experience. And a 2013 study published in the Proceedings of the National Academy of Sciences revealed that users were more likely to respond to a romantic message sent by someone of a different race than they were to initiate contact with someone of a different race.

Taft said that when users raise these issues to dating platforms, companies often respond by saying it's simply what users want.

"When what most users want is to dehumanise a small group of users, then the answer to that issue is not to rely on what most users want… Listen to that small group of individuals who are being discriminated against, and try to think of a way to help them use the platform in a way that ensures that they get equal access to all of the benefits that intimate life entails," Taft said. "We want them to be treated equitably, and often the way to do that is not just to do what everybody thinks is most convenient."

He said dating sites and apps are making progress – some have revamped their community guidelines to explicitly state that their site is a discrimination-free zone (users who use hateful messaging are then banned). Others are keeping the race/ethnicity filter but also adding new categories by which to sort. Taft hopes the people making design decisions will read his team's paper and at least keep the conversation going.

"There's a lot of options out there," Nikki Chapman said. "I remember filling out on an app, 'What hair colour are you interested in? What income level? What level of education?' If you're going to be that specific, then you need to go build a doll or something because life and love doesn't work like that."

Chicago Tribune
