Poster in Workshop: Algorithmic Fairness through the Lens of Time
Fairness in link analysis ranking algorithms
Ana-Andreea Stoica · Augustin Chaintreau · Nelly Litvak
In this paper, we study fairness in link analysis algorithms on evolving networks. In particular, we formally show that minority groups can become under-represented in rankings produced by algorithms such as HITS and PageRank when networks evolve over time. This under-representation does not arise out of nowhere: biased networks can produce even more biased rankings. Using an evolving network model with multiple communities, we show that homophily plays a central role in amplifying bias against minority groups in HITS-based rankings. We derive a theoretical approximation showing that bias increases in more homophilic networks: the authority scores produced by the HITS algorithm push minorities even further down the ranking than the degree ranking does. The use of evolving networks is important in two ways: (1) such algorithms are deployed not on static content but on ever-evolving nodes and links with a temporal aspect; (2) the scores that link analysis algorithms output are often used as features in learning-to-rank algorithms, so biased features will have a lasting effect on the fairness of many ranking schemes. We illustrate our theoretical analysis on both synthetic and real datasets.
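The effect described above can be probed empirically. The sketch below is a minimal toy experiment, not the paper's actual model or parameters: a directed network is grown node by node with preferential attachment and a tunable homophily parameter (all function names, the minority fraction, and the acceptance rule are illustrative assumptions), and the minority's share of the top-k is then compared under an in-degree ranking versus a HITS authority ranking.

```python
import random

def grow_homophilic_network(n=500, minority_frac=0.2, homophily=0.8, seed=0):
    """Grow a directed network node by node (a toy stand-in for an
    evolving-network model with two communities). Each new node links to
    an existing node chosen by preferential attachment on in-degree; the
    link is accepted with probability `homophily` if the two nodes share
    a group, and 1 - homophily otherwise."""
    rng = random.Random(seed)
    # group 1 is the minority, drawn with probability minority_frac
    groups = [1 if rng.random() < minority_frac else 0 for _ in range(n)]
    edges = []
    in_deg = [1] * n  # +1 smoothing so early nodes are attachable
    for u in range(2, n):
        while True:
            targets = list(range(u))
            weights = [in_deg[v] for v in targets]  # preferential attachment
            v = rng.choices(targets, weights=weights, k=1)[0]
            accept = homophily if groups[u] == groups[v] else 1 - homophily
            if rng.random() < accept:
                edges.append((u, v))
                in_deg[v] += 1
                break
    return groups, edges, in_deg

def hits_authority(n, edges, iters=100):
    """Plain power-iteration HITS: auth <- A^T hub, hub <- A auth,
    with L1 normalization after each sweep."""
    hub = [1.0] * n
    for _ in range(iters):
        auth = [0.0] * n
        for u, v in edges:
            auth[v] += hub[u]
        hub = [0.0] * n
        for u, v in edges:
            hub[u] += auth[v]
        s = sum(auth) or 1.0
        auth = [a / s for a in auth]
        s = sum(hub) or 1.0
        hub = [h / s for h in hub]
    return auth

def minority_share_topk(scores, groups, k=50):
    """Fraction of minority (group 1) nodes among the top-k by score."""
    top = sorted(range(len(scores)), key=lambda i: -scores[i])[:k]
    return sum(groups[i] for i in top) / k

groups, edges, in_deg = grow_homophilic_network()
auth = hits_authority(500, edges)
print("minority share, top-50 by in-degree:", minority_share_topk(in_deg, groups))
print("minority share, top-50 by authority:", minority_share_topk(auth, groups))
```

Sweeping the `homophily` parameter toward 1 in this toy setup is one way to observe how a ranking by authority scores can treat the minority differently from a ranking by raw in-degree, which is the comparison the theoretical analysis formalizes.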