SHORT NEWS

Deep learning results in particularly sexist AI systems

Decisions made by AI-based machines have repeatedly been shown to be discriminatory or racist. This is because the algorithms draw on data originating from humans, and this data is frequently biased – among other things, with sexist prejudices.

[Photo: a male nurse taking care of a patient]

Researchers from the Austrian city of Linz have found that the results of search engines that use deep learning exhibit an exceptionally strong sexist bias.

Discrimination by algorithms has been documented in numerous studies. People with an immigrant background, for example, are given lower credit ratings, are more frequently flagged as suspects by police software, or are shown advertisements for inferior jobs or housing on social media platforms. Microsoft had to take its chatbot Tay offline after just a short period because users induced it to deny the Holocaust and insult people of color.

In their study, Navid Rekab-Saz and Markus Schedl from the Institute for Computational Perception at the University of Linz and the Human-Centered AI Group of the Linz Institute of Technology (LIT) analyzed laboratory AI models as well as algorithms used by search engines, testing them on real-life search queries.

CEO = male, nurse = female

In their paper, the researchers divided the search queries into two groups. The first consists of queries that are gender-neutral and yield answers without any gender bias. As an example of this group, the researchers cite the search for Roman emperors: the results include only men, but since only men ever made it to the emperor’s throne, the answer is not biased.

The other group consists of queries that – at least in English – are not explicitly gender-specific, such as a query about the income of a nurse (unlike English, many languages use gender-specific nouns for such professions) or a search for a synonym for “beautiful”.

Although these queries are gender-neutral, the search engines mainly returned answers relating to women; results relating to men ranked far behind. Conversely, searches for “CEO”, i.e. the head of a company, or for “programmer” produced answers with predominantly male connotations.

Deep learning exacerbates discrimination

It has been demonstrated “that it is precisely the latest generation of deep learning algorithms that results in a particularly pronounced gender bias,” Navid Rekab-Saz told the APA news agency. This has considerable ramifications, since such algorithms have recently been adopted by the most popular search engines – Google and Bing.

According to the two researchers, the bias is due to the fact that systems that use deep learning not only search for the search term itself – such as “nurse” or “CEO” – but also for similar terms or subject areas. In the case of “nurse”, for example, the search engine also included the search term “matron”. As a result, they increasingly lean towards a female interpretation of the search query.
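
How such a semantic neighborhood can carry a gender association along with the query is easy to illustrate. The following is a minimal sketch in Python, not the researchers’ actual code: the embedding vectors are hypothetical, hand-crafted toy values, whereas real systems learn them from large text corpora – which is exactly where the human bias enters.

```python
import numpy as np

# Hypothetical toy embeddings (4 dimensions, hand-crafted so that
# "nurse" sits near female-associated terms and "ceo" near male ones).
embeddings = {
    "nurse":      np.array([0.9, 0.1, 0.8, 0.1]),
    "matron":     np.array([0.8, 0.2, 0.9, 0.0]),
    "she":        np.array([0.7, 0.0, 0.9, 0.1]),
    "he":         np.array([0.1, 0.8, 0.1, 0.9]),
    "ceo":        np.array([0.1, 0.9, 0.2, 0.8]),
    "programmer": np.array([0.2, 0.8, 0.1, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: how close two terms are in embedding space.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def neighbours(query, k=3):
    # Rank all other vocabulary terms by similarity to the query term.
    q = embeddings[query]
    ranked = sorted(
        ((term, cosine(q, v)) for term, v in embeddings.items() if term != query),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return ranked[:k]

# "nurse" pulls in "matron" and "she" before any male-associated term,
# so the female association travels with the query into the results.
print(neighbours("nurse"))
print(neighbours("ceo"))
```

In a real deep learning retriever, the same effect plays out at scale: documents about the female-associated neighbors of “nurse” score as relevant too, pushing male-related results further down the ranking.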

The underlying reason is that the data on which the AI is based is compiled and processed by humans and hence inherently contains these biases. “The AI’s search only amplifies this effect,” Navid Rekab-Saz explains. In their study, the researchers examined gender bias, but they are certain that such effects also occur with regard to other characteristics, such as age, ethnicity, or religion.
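
The amplification itself can be made concrete with a toy calculation. The sketch below uses assumed numbers rather than data from the study: it shows how a mild statistical skew in a document collection can become a near-total skew on the first page of results, simply because a ranked list only surfaces the highest-scoring items.

```python
import random

random.seed(0)

# Hypothetical corpus of 100 documents: 60 female-associated, 40
# male-associated, with female-associated documents scoring slightly
# higher relevance on average (0.55 vs 0.45). All numbers are assumed.
corpus = (
    [("female-assoc", random.gauss(0.55, 0.05)) for _ in range(60)]
    + [("male-assoc", random.gauss(0.45, 0.05)) for _ in range(40)]
)

# A results page shows only the top of the ranking, so a mild skew in
# the data can dominate what users actually see.
top10 = sorted(corpus, key=lambda doc: doc[1], reverse=True)[:10]
female_share = sum(1 for label, _ in top10 if label == "female-assoc") / len(top10)

print("female-associated share of corpus: 60%")
print(f"female-associated share of top 10: {female_share:.0%}")
```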

The problem is not the AI – it’s the programming

For the researchers, their paper’s findings are no reason to categorically reject AI, which they refer to as an “enormously valuable tool”. Rather, the goal of their research group is to raise awareness of the bias in AI results, which is caused by human prejudices, and to have it taken into account when algorithms are programmed.

“Deep learning is a tool with two sides: On the one hand, it can reinforce a certain bias, as the study’s findings show. On the other, it offers such a high degree of flexibility that it allows us to design more sophisticated models that explicitly avoid such biases,” Navid Rekab-Saz emphasized.

Written by: sda

Photos: keystone
