In a recent preprint study, researchers at the Eindhoven University of Technology, DePaul University, and the University of Colorado Boulder find evidence of bias in recommender systems like those surfacing movies on streaming websites. They say that as users act on recommendations and their actions are added back into the systems (a process known as a feedback loop), biases become amplified, leading to other problems such as declines in aggregate diversity, shifts in the representation of taste, and homogenization of the user experience.

Collaborative filtering (CF) is a technique that leverages historical data about interactions between users and items (for example, users’ TV show ratings) to generate personalized recommendations. But recommendations produced by CF generally suffer from bias against certain user or item groups, usually arising from biases in the input data and from algorithmic bias.
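
To make the idea concrete, here is a minimal sketch of user-based collaborative filtering in Python (an illustrative example, not the study’s implementation): users with similar rating histories are assumed to share tastes, and a prediction is a similarity-weighted average of other users’ ratings.

```python
import numpy as np

# Toy ratings matrix: rows = users, columns = items (e.g., TV shows).
# A value of 0 means "not yet rated".
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two users' rating vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def predict(user, item):
    """Predict a rating as a similarity-weighted average of the
    ratings other users gave the same item."""
    num = den = 0.0
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0:
            continue
        s = cosine_sim(ratings[user], ratings[other])
        num += s * ratings[other, item]
        den += abs(s)
    return num / den if den else 0.0

# Recommend user 0 their highest-predicted unrated item.
user = 0
unrated = [i for i in range(ratings.shape[1]) if ratings[user, i] == 0]
print(max(unrated, key=lambda i: predict(user, i)))
```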

It’s the researchers’ assertion that bias can intensify over time as users interact with the recommendations. To test this theory, they simulated the recommendation process by iteratively generating recommendation lists and updating users’ profiles, adding items from those lists based on an acceptance probability. Bias was modelled with a function that took into account the percent increase in the popularity of recommended items compared with the popularity of the items users had already rated.
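
A feedback-loop simulation in this spirit might look like the following sketch. The popularity-based recommender and the fixed acceptance_prob here are assumptions made for illustration, not the paper’s exact models or parameters.

```python
import random
from collections import Counter

def simulate(profiles, iterations=10, k=5, acceptance_prob=0.3):
    """Iteratively recommend items, let users accept some, and fold
    accepted items back into their profiles (the feedback loop)."""
    avg_popularity = []
    for _ in range(iterations):
        # Popularity = how many profiles currently contain each item.
        popularity = Counter(item for p in profiles for item in p)
        for profile in profiles:
            # A deliberately biased recommender: the k most popular
            # items the user has not yet rated.
            recs = [i for i, _ in popularity.most_common()
                    if i not in profile][:k]
            for item in recs:
                if random.random() < acceptance_prob:
                    profile.add(item)  # recommendation enters the profile
        # The growth of average profile popularity across iterations is
        # a rough proxy for the paper's bias measure, which compares the
        # popularity of recommendations against the popularity of the
        # items users originally rated.
        total = sum(len(p) for p in profiles)
        avg_popularity.append(
            sum(popularity[i] for p in profiles for i in p) / total)
    return avg_popularity

random.seed(0)
profiles = [set(random.sample(range(100), 10)) for _ in range(50)]
print(simulate(profiles))  # average popularity climbs iteration by iteration
```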

Experiments:

In experiments, the researchers analysed the performance of recommender systems on the MovieLens data set, a corpus of over 1 million movie ratings collected by the GroupLens research group. Even an algorithm that simply recommended the most popular movies to everyone (accounting for movies already seen) drifted away from users’ preferences over time as bias was amplified: its recommendations tended to be either more diverse than what users were interested in or over-concentrated on a few items. More problematically, the recommendations showed evidence of “strong” homogenization. Because the MovieLens data set contains more ratings from male than from female users, the algorithms caused female user profiles to edge closer over time to the male-dominated population, resulting in recommendations that deviated from female users’ preferences.
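
One simple way to quantify this kind of homogenization is to represent each user profile as an item vector and track how far a minority group’s average profile sits from the majority’s across iterations. This representation and metric are assumptions for illustration, not necessarily the paper’s.

```python
import numpy as np

def centroid(vectors):
    """Average profile vector for a group of users."""
    return np.mean(vectors, axis=0)

def group_distance(minority, majority):
    """Distance between the two groups' average profiles; a value
    that shrinks across iterations signals homogenization."""
    return float(np.linalg.norm(centroid(minority) - centroid(majority)))

# Toy binary "item in profile" vectors over 5 items (hypothetical data).
majority        = np.array([[1, 1, 1, 0, 0], [1, 1, 0, 1, 0]])
minority_before = np.array([[0, 0, 1, 1, 1]])
minority_after  = np.array([[1, 1, 1, 1, 0]])  # after accepting recommendations

print(group_distance(minority_before, majority))  # ~1.87
print(group_distance(minority_after, majority))   # ~0.71 -> drifted closer
```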

Like the co-authors of another study on biased recommender systems, the researchers suggest potential solutions to the problem. They suggest user grouping strategies based on average profile size and the popularity of rated items, as well as algorithms that control for popularity bias. They also advocate not restricting the re-rating of items already in users’ profiles, instead updating them in each iteration.
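
As one generic illustration of controlling for popularity bias (a common re-weighting approach, not the authors’ specific algorithm), a recommender can discount each item’s score by its popularity; the debiased_score function and alpha parameter below are hypothetical.

```python
import math

def debiased_score(base_score, item_popularity, alpha=0.5):
    """Discount an item's score by its popularity. `alpha` is an
    assumed tuning knob trading accuracy against long-tail exposure."""
    return base_score / (1.0 + alpha * math.log1p(item_popularity))

scores = {"blockbuster": 0.9, "indie": 0.7}
popularity = {"blockbuster": 5000, "indie": 20}

ranked = sorted(scores,
                key=lambda i: debiased_score(scores[i], popularity[i]),
                reverse=True)
print(ranked)  # ['indie', 'blockbuster'] -- the long-tail item now wins
```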

“The influence of the feedback loop is generally stronger for the users who belong to the minority group,” the researchers wrote. “These results emphasize the importance of algorithmic solutions to tackle popularity bias and increase diversity in recommendations, since even a small bias in the current state of a recommender system could be greatly amplified over time if it is not addressed properly.”