🔬 Research summary by Karim Nader, a graduate student at the University of Texas at Austin whose research focuses on the ethics of information & technology.
[Original paper by Karim Nader]
Overview: This essay explores ethical considerations that might arise from the use of collaborative filtering algorithms on dating apps. Collaborative filtering algorithms learn from the behavior patterns of users in general to predict preferences and build recommendations for a target user. But since users on dating apps show deep racial bias in their own preferences, collaborative filtering can exacerbate biased sexual and romantic behavior. Perhaps something as intimate as sexual and romantic preference should not be subject to algorithmic control.
Dating apps have allowed people from extremely different backgrounds to connect and are often credited with the rise of interracial marriage in the United States. However, people of color still experience substantial harassment from other users, including racial generalizations and even fetishization. This bias can extend from the users to the algorithm that filters and recommends potential romantic and sexual partners. Dating app algorithms are built to predict the intimate preferences of a target user and recommend profiles to them accordingly, but biased data leads to biased recommendations.
This research establishes that the data fed to the algorithms on dating apps reflects deep racial bias, and that dating apps can perpetuate this bias in their own recommendations. Further, since recommendations are extremely effective at altering user behavior, dating apps are influencing the intimate behaviors of their users. A look into the philosophy of desire further complicates the issue: intimate biases are often seen merely as personal preferences. But since users have little control over algorithmic filtering, dating apps can come between users and their romantic and sexual autonomy.
Collaborative filtering works by predicting the behavior of a target user from the behavior of similar users around them. For example, if a majority of users who buy chips also buy salsa, the algorithm will learn to recommend salsa to anyone who buys chips. This way, filtering algorithms can build recommendations that reflect general patterns of behavior. And it turns out that they are highly effective at doing so! However, collaborative filtering has a tendency to homogenize the behavior of users on a platform without necessarily increasing utility. Moreover, studies on YouTube’s recommender system show that, through algorithmic recommendation, reasonable searches can quickly lead a user to videos that promote conspiracy theories. Algorithmic filtering can thus normalize problematic patterns of behavior through gradual technological nudges and pressures. Is the same true of dating apps? To show that, we’d have to establish that dating app users themselves are feeding the algorithm biased data through their activity.
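The chips-and-salsa logic above can be made concrete with a minimal sketch of co-occurrence-based collaborative filtering. This is an illustrative toy, not the algorithm of any actual platform: the user names and purchase histories are hypothetical, and real recommender systems use far more sophisticated models. The key point it demonstrates is that recommendations are derived entirely from other users' behavior, so whatever patterns (or biases) exist in that behavior flow directly into the rankings.

```python
from collections import defaultdict

# Hypothetical purchase histories (names and items are made up for illustration).
histories = {
    "ana":  {"chips", "salsa", "bread"},
    "ben":  {"chips", "salsa"},
    "cara": {"chips", "salsa", "milk"},
    "dan":  {"chips"},  # target user: has only bought chips so far
}

def recommend(target, histories):
    """Score each item the target lacks by how often it appears in the
    histories of users who overlap with the target; rank by that count."""
    owned = histories[target]
    scores = defaultdict(int)
    for user, items in histories.items():
        if user == target:
            continue
        if owned & items:                 # this user shares an item with the target
            for item in items - owned:    # count items the target does not have yet
                scores[item] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("dan", histories))  # salsa ranks first: it co-occurs with chips most often
```

Because most chip-buyers in the data also bought salsa, salsa tops the ranking for "dan". Nothing about salsa itself made that happen; the recommendation is purely a reflection of the majority's pattern, which is exactly why biased majority behavior produces biased recommendations.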
Race and online dating
Christian Rudder, co-founder of OkCupid, explains that match scores (OkCupid’s algorithmically calculated compatibility score) are the best way to predict a user’s race. In other words, the match scores of users of different races show patterns distinct enough that we can identify the race of a profile simply by seeing which profiles the algorithm believes are a good match for it. Again, algorithms learn from user data, so what kind of data leads to this kind of racial algorithmic bias on dating apps? Well, it turns out that dating app users show distinct patterns of preference when it comes to race. Several empirical studies confirm those trends: users on online dating platforms seem to segregate themselves based on race and to prefer people of their own race. Most users exclude people of color from consideration, except those of their own race, and generally show a preference for white men and women. People of color are more likely to include the profiles of white users for consideration, but white people are not as likely to include the profiles of people of color. Since correlations lead to recommendations, users on dating apps will be recommended to other users of their own race and will receive more recommendations for white users.
Shaping sexual and romantic preferences
Now, we’ve established that the algorithms behind dating apps can exacerbate some kind of racial bias. The trouble is that it is not clear whether this is a problem that needs to be addressed. Surely the Spotify algorithm favors some artists over others, but when it comes to personal taste like music, bias is simply a preference. Sexual and romantic biases might similarly be simple preferences. However, sexual and romantic biases reflect larger patterns of discrimination and exclusion that are grounded in a history of racism and fetishization. And so, there might be some justification for raising a moral objection to the use of collaborative filtering on dating apps. After all, recommendations can and do change the behavior and preferences of users. Studies show that if two people are told they are a good match, they will act as if they are, regardless of whether they are truly compatible with each other. In any case, the issue might be that users have absolutely no control over the filtering that determines who they see on dating apps. Explicitly stated preferences are sometimes overridden by algorithmic predictions. Using collaborative data in the context of dating apps seems to undermine extremely personal sexual and romantic desires that should not be ‘predicted’ by an algorithm.
Between the lines
Most of the research on dating platforms has focused on dating websites that allow users to browse through a collection of profiles with little to no algorithmic intervention. However, dating platforms have evolved substantially and algorithmic suggestions play a powerful role in the experience of dating app users. This research brings attention to the reach of algorithmic bias on platforms that researchers often overlook.
While people of color anecdotally report lower success rates and occasional harassment and fetishization, those concerns are not taken seriously because personal romantic preferences are seen as outside the realm of moral evaluation. Philosophers and moral experts need to pay closer attention to biases that evade ethical scrutiny in this way.
While this research is an important step towards bringing race, romance and attraction into discussions of algorithmic bias, it is merely a conceptual, philosophical and ethical analysis of the question and more empirical work needs to go into understanding the algorithms behind dating apps and the experience of users on those platforms.