Dating app Breeze obligated to prevent discrimination in its algorithm

INTRODUCTION

Recently, the Netherlands Institute for Human Rights ruled on the dating app Breeze and the algorithm behind it. On its own initiative, Breeze had asked the institute whether it would be allowed to adjust its algorithm so that people with a dark skin colour and of non-Dutch origin are introduced to other users as often as people with a light skin colour and of Dutch origin. According to the Institute for Human Rights, this is not only allowed but also obligatory. What is interesting here is that, in order to do so, Breeze will most likely have to process sensitive personal data, such as data revealing ethnicity. And that is, in principle, not allowed.

CASE INFORMATION

Breeze's algorithm:

  • is self-learning and introduces users to each other based on a "matching probability," which indicates how likely it is that two users would like each other;
  • works fully automatically, without human intervention;
  • uses information from user profiles and users' "like" behaviour as input; the exact impact of this like behaviour on the matching probability is unknown, but the algorithm is suspected to factor users' preferences into its calculation (a minimal sketch of such a model follows below).
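
To make the "matching probability" concrete, here is a minimal sketch in Python of how such a score could be computed. This is our own illustration under assumptions, not Breeze's actual model: the feature names, weights, and logistic form are all hypothetical.

    import math

    def matching_probability(profile_a: dict, profile_b: dict,
                             like_weights: dict) -> float:
        """Estimate how likely two users are to like each other (0..1).

        Hypothetical model: weights learned from historical "like" behaviour
        score shared profile features, and a logistic function maps the sum
        to a probability. Such learned weights can encode ethnically
        homogeneous preferences even if ethnicity is not itself a feature.
        """
        score = 0.0
        for feature, weight in like_weights.items():
            if feature in profile_a and feature in profile_b:
                # add the weight when the two users share this feature
                score += weight * float(profile_a[feature] == profile_b[feature])
        return 1.0 / (1.0 + math.exp(-score))

    # made-up feature weights, e.g. learned from past like-behaviour
    weights = {"city": 0.8, "education": 0.5, "music_taste": 1.2}
    print(matching_probability({"city": "Utrecht", "education": "hbo"},
                               {"city": "Utrecht", "education": "wo"}, weights))

With these made-up weights, two users who share a city but differ in education would get a matching probability of about 0.69.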

Users of the app have noted that the suggested candidates show little diversity. Breeze suspects that users with a darker skin colour or of non-Dutch origin may use the app less because of limited matches, which suggests that the algorithm calculates lower matching probabilities, on average, for this group.

According to Breeze, scientific articles indicate that people may be more likely to prefer partners with a similar ethnic background when looking for a date or partner. Breeze suspects that its algorithm not only picks up on these preferences but also reinforces them. Despite Breeze's efforts to promote diversity, its user base remains largely homogeneous, consisting mostly of users with a light skin colour and of Dutch descent. Breeze therefore wants to change its algorithm to create more diversity in the app: it wants to suggest people with a dark skin colour and of non-Dutch origin to other users as often as people with a light skin colour and of Dutch origin.
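
The suspected reinforcement can be illustrated with a toy simulation, again our own construction with made-up numbers and deliberately crude assumptions: if a group starts with a slightly smaller share of suggestions, it collects fewer likes, and the retrained model then shows it even less.

    def minority_exposure_share(rounds: int = 5) -> None:
        """Toy feedback loop: a group's share of suggestions shrinks each round.

        Crude assumptions for illustration only: both how often a group is
        shown and the chance each suggestion earns a like scale with the
        group's current visibility, and retraining on the collected likes
        sets the next round's exposure share.
        """
        share = 0.476  # assumed initial share of suggestions for the minority group
        for step in range(1, rounds + 1):
            liked = share * share              # likes collected by the minority group
            other = (1 - share) * (1 - share)  # likes collected by the majority group
            share = liked / (liked + other)    # retrained ranking redistributes exposure
            print(f"round {step}: minority exposure share = {share:.3f}")

    minority_exposure_share()  # 0.452, 0.405, 0.317, 0.177, 0.044

Under these assumptions, an initial gap of a few percentage points snowballs within a handful of retraining rounds; that is the kind of reinforcement Breeze suspects.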

THE INSTITUTE'S VERDICT

The institute finds that the algorithm is likely to discriminate, because the app is less likely to suggest people with a dark skin colour or a non-Dutch background to other users. While personal preferences for certain ethnic backgrounds in dating are not necessarily discriminatory, even taking these preferences into account, the matching scores of people with a dark skin colour or of non-Dutch origin should presumably be higher than they currently are. This amounts to indirect discrimination.

According to the institute, Breeze is not only authorised to adjust the algorithm but also obliged to do so, as discrimination is prohibited. This holds even though the algorithm itself causes these effects and Breeze does not fully understand its exact operation: Breeze bears responsibility for the algorithm it uses and for the discriminatory effects resulting from it.

The institute does not regard these measures to prevent indirect discrimination as preferential policies (meaning that, in the case of equal suitability, someone of a specific ethnic origin would be favoured over others without that origin). The measures would not give users with a dark skin colour or of non-Dutch origin a privileged position but would only compensate for a disadvantage that would otherwise occur. As a result, these users would not be presented more often than others, but equally often.
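
One way to read "equally often" is exposure parity: rescale matching scores so that each group is, on average, suggested as often as any other, while preserving the ranking within each group. The sketch below is our own interpretation, not a method prescribed by the institute, and the group labels it needs are precisely the sensitive data discussed in the next section.

    from collections import defaultdict

    def equalise_exposure(candidates):
        """candidates: list of (user_id, group, raw_score) tuples.

        Returns (user_id, adjusted_score) with equal average scores per group:
        dividing by the group mean removes the group-level gap; multiplying
        by the overall mean keeps scores on their original scale.
        """
        totals, counts = defaultdict(float), defaultdict(int)
        for _, group, score in candidates:
            totals[group] += score
            counts[group] += 1
        group_mean = {g: totals[g] / counts[g] for g in totals}
        overall_mean = sum(totals.values()) / sum(counts.values())
        return [(uid, score / group_mean[g] * overall_mean)
                for uid, g, score in candidates]

    # made-up scores: group "b" averages 0.45 against 0.60 for group "a"
    print(equalise_exposure([("u1", "a", 0.62), ("u2", "a", 0.58),
                             ("u3", "b", 0.41), ("u4", "b", 0.49)]))

After the adjustment, both groups average 0.525, so neither is suggested more often than the other, while better matches still rank higher within each group.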

ADJUSTING THE ALGORITHM

GDPR

In adjusting the algorithm, Breeze faces the challenge of doing so without processing sensitive personal data, such as origin or skin colour, in line with the General Data Protection Regulation (GDPR). After all, the GDPR in principle prohibits the processing of sensitive personal data such as ethnicity, although such data can be processed under certain safeguards.

For example, you could argue that the processing is necessary for reasons of substantial public interest. The Dutch GDPR Implementation Act (Uitvoeringswet AVG) provides that ethnicity may in some cases be processed in order to implement a temporary preferential policy for underrepresented groups. It is doubtful, however, whether this argument would succeed here: as noted above, the institute does not regard Breeze's measures as a preferential policy in the first place.

This prohibition makes it complex even to establish disparities between user groups. In its ruling, the institute did not consider how the algorithm could in fact be adjusted in a GDPR-compliant way.

AI Act

To some extent, and in strictly necessary cases, the proposed AI Act allows providers of high-risk AI systems to process special categories of personal data under specific conditions. However, the proposed exception is not sufficiently linked to the safeguards under the GDPR and is not specific enough to function as an effective legal basis. Even if the AI Act were finalised and had entered into force, it would not provide a sufficient legal basis to tackle this problem.

IAMA

To minimise risks, algorithm developers may consider conducting a Human Rights and Algorithms Impact Assessment (IAMA, after its Dutch abbreviation). This allows developers to test their algorithms' compliance with human rights during the design phase and thus prevent this kind of problem. With an IAMA, human rights risks can be assessed and the necessary adjustments to the algorithm can be made. We offer guidance on conducting an IAMA.

OTHER CASES

The issue of discrimination prevention is not unique to Breeze; it applies to many other algorithm developers. Are you facing the same problem, and do you want tailored advice on how to adjust your algorithm ethically? Don't hesitate to contact us or email wijst@bg.legal.

 
