International Women’s Day: Why Many Algorithms Are Sexist

The facts date back to 2014. The e-commerce giant Amazon, already an expert in computerized logistics optimization, wanted to apply the same recipe in its human resources department. The firm decided to secretly test artificial-intelligence software designed to sort the “good” CVs from the “bad” in record time. “Everyone wanted this ‘holy grail’ […]. Amazon wanted it to be an engine in which you put 100 CVs and it automatically brings out the top five for recruitment,” a source familiar with the matter told Reuters at the time.

But the experiment ended just as discreetly. The problem was sizeable: the software, which relied on AI and algorithms, turned out to be sexist, almost systematically rejecting women’s CVs for developer positions and other technical roles, among the most sought-after at Amazon.

The reason? The multinational’s program was trained on a database of CVs received over a ten-year period, most of which came from men. Despite a technical adjustment to the software, the problem persisted, forcing the e-commerce giant to abandon the project entirely in early 2017.
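A minimal sketch can illustrate the mechanism described above. The data below is entirely invented for illustration: a crude scorer rates each CV term by how often it appeared in past hires. Because the historical data is skewed, a term like “women’s chess club” ends up penalized even though gender is never an explicit input.

```python
from collections import Counter

# Hypothetical historical hiring records: (cv_terms, was_hired).
# The terms and outcomes are invented; the skew mirrors a dataset
# dominated by male hires.
history = [
    ({"java", "chess club"}, True),
    ({"java", "chess club"}, True),
    ({"python", "chess club"}, True),
    ({"java", "women's chess club"}, False),
    ({"python", "women's chess club"}, False),
    ({"python", "chess club"}, True),
]

hired_counts, total_counts = Counter(), Counter()
for terms, hired in history:
    for term in terms:
        total_counts[term] += 1
        hired_counts[term] += hired

def term_score(term):
    """Fraction of past CVs containing `term` that led to a hire."""
    return hired_counts[term] / total_counts[term]

def cv_score(terms):
    """Average the per-term hire rates -- a crude 'implicit' model."""
    return sum(term_score(t) for t in terms) / len(terms)

# The model has silently learned to penalize the word "women's".
print(cv_score({"java", "chess club"}))          # ~0.83
print(cv_score({"java", "women's chess club"}))  # ~0.33
```

Nothing in the code mentions gender; the discrimination emerges purely from the skewed history the scorer was built on.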

A digital world full of biased algorithms

According to Aurélie Jean, who holds a doctorate in science and has written several books on the ethics of algorithms, this story reveals a problem that is still relevant today: “The digital tools that are created today, when they are poorly developed and poorly tested, can actually generate technological discrimination based on gender, by treating women unfairly.”

Journalist Mathilde Saliou, author of the investigative book “Technofeminism: How Digital Technology Aggravates Inequalities,” published this month, reaches the same conclusion. According to her research, sexism runs through the entire digital world.

“The facial recognition algorithms in phones recognize women’s faces less well than men’s,” cites Saliou, who is also a journalist at the online media outlet Next INpact, as an emblematic example. She adds: “It means that, despite an equal purchase price, the smartphone does not work as well depending on your profile.”

Another striking example: advertising algorithms. “It has been shown that on LinkedIn and Facebook, job offers for positions of responsibility are shown less to women,” she says.

The same logic appears in image recognition “algos”: “If we feed them a dataset in which women are 33% over-represented in kitchen scenes, the system will not only reproduce these stereotyped roles but amplify them, producing images of women in the kitchen 68% of the time.”
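The amplification effect Saliou describes can be sketched with toy numbers (invented here, not taken from the study she cites): a naive model that simply predicts the most frequent gender seen in training pushes a 66/34 skew all the way to 100%.

```python
from collections import Counter

# Hypothetical training labels: who appears in "kitchen" images.
# Roughly two thirds women -- a skewed but not absolute distribution.
train_labels = ["woman"] * 66 + ["man"] * 34

# A naive model that, for the "kitchen" context, always predicts the
# most frequent gender seen in training.
majority = Counter(train_labels).most_common(1)[0][0]
predictions = [majority for _ in range(100)]

train_rate = train_labels.count("woman") / len(train_labels)
pred_rate = predictions.count("woman") / len(predictions)

print(f"women in training data: {train_rate:.0%}")  # 66%
print(f"women in predictions:   {pred_rate:.0%}")   # 100% -- amplified
```

Real models are subtler than a majority vote, but the principle is the same: optimizing accuracy on skewed data rewards leaning into the skew, so the output distribution ends up more stereotyped than the input.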

Behind the algorithm, humans and their biases

How can computer code that appears neutral produce such discrimination? Aurélie Jean, who alongside her research runs In Silico Veritas, a consulting agency in the field, answers: “An algorithmic model can be explicit, that is to say its operating logic is chosen by the designers. It can also be implicit, in which case the logic is defined implicitly by training on datasets.”

Mathilde Saliou continues the explanation: “It must be understood that an algorithm is a tool built by humans, and therefore a tool impregnated with errors and biases (sexist, racist, etc.), however brilliant the people who designed it may be. Yet men who went through coding or engineering schools are over-represented among developers. Their social homogeneity means they share the same oversights and the same blind spots.”

Following this logic, the problem with Amazon’s software could have been avoided if its designers had statistically tested their data or their algorithm’s responses beforehand. Such vigilance, according to her, is more frequent when there is real diversity within a software team, in particular the presence of women trained on the subject.


In their book “Artificial Intelligence, Not Without Them,” published in March 2019, Aude Bernheim and Flora Vincent, doctors of science and co-founders of the feminist association WAX Science, identified the same cause. “When the data is biased, the algorithm reproduces the sexist stereotypes of our society,” Aude Bernheim explained to “Echos Start.”

According to her, Google’s translation software follows the same logic. “If the word ‘engineer’ appears 1,000 times in the reference data, and the majority of those occurrences refer to men, the algorithm will assign a masculine gender to this profession,” the researcher points out.
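Bernheim’s point boils down to a majority vote over corpus statistics. The counts below are invented to match her 1,000-occurrence example; real translation systems are far more complex, but the frequency-driven default she describes looks like this in miniature:

```python
from collections import Counter

# Hypothetical corpus statistics: the grammatical gender observed
# alongside "engineer" in the reference data (toy numbers).
observations = ["masculine"] * 800 + ["feminine"] * 200

def assign_gender(counts):
    """Pick whichever gender is most frequent -- a simple majority vote."""
    return counts.most_common(1)[0][0]

counts = Counter(observations)
print(assign_gender(counts))  # "masculine": the corpus skew becomes the rule
```

The skew in the references becomes the system’s default, which is why translating from a gender-neutral language tends to output “he is an engineer” unless the model is explicitly corrected.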

A threat to democracy

“Ultimately, discrimination in the digital world reflects the evils found in society,” summarizes Mathilde Saliou. In the same way as sexism, other forms of discrimination, such as racism or homophobia, are also rampant throughout the digital space. “And that represents a real democratic threat,” underlines the specialized journalist.

“Women are 27 times more likely to be targeted by cyberbullying than men, according to the European Women’s Lobby. These campaigns of violence are dangerous for public debate and democracy because they contribute to silencing minorities: women, but also Black people, LGBT+ people, etc.,” warns Mathilde Saliou.

Tech companies on the front line

It falls to tech companies, the main actors on the subject, to act. Aurélie Jean agrees: “I see that they are doing it more and more, for ethical, economic and soon legal reasons. Ethically, because discrimination of any type is unacceptable. Economically, because excluding individuals through technological discrimination means excluding users, and therefore potential income. Finally, legally, with regard to upcoming European legislation on the development and use of artificial intelligence technologies.”

The researcher recommends the following best practices: building diverse software teams, making the operation of algorithms transparent, testing and optimizing data to avoid bias, and establishing rigorous algorithmic governance.

“The plurality of visions enriches what we create together. As I often say, we must build algorithms for everyone and by everyone,” concludes the scientist, whom “Forbes” magazine ranked in 2019 among the 40 most influential French women in the world.
