The Global Gender Gap Report 2024 by the World Economic Forum estimates that it will take 134 years to achieve gender equality, a worse outlook than the 2023 estimate of 131 years.
There is a risk that the day of gender equality will not come any closer but will in fact recede further if artificial intelligence, and generative AI in particular, continues not only to reproduce but also to multiply gender stereotypes.
The latest example, OpenAI's new GPT-4o, does not inspire optimism. The system was given a voice, called Sky, that closely resembles that of Scarlett Johansson, who played the operating system in the 2013 film ‘Her’. The company claimed that Sky’s synthetic voice was not modelled on the actress, but then removed it as a precautionary measure. The matter will be debated in the US Congress, which has summoned the actress to testify.
Beyond this particular case, there has been controversy for some time over female voices and female names being given to voice assistants such as Apple’s Siri and Amazon’s Alexa. Although a male version exists in some cases, it is not the default option. A 2019 UNESCO report condemned how this model perpetuates the image of a servile, docile woman who is always available and acts on command.
Some positive steps have been taken in the world of voice assistants. On this year’s International Women’s Day, Amazon, in collaboration with ActionAid, introduced a modification. In response to certain insults and offenses, Alexa no longer remains silent: it reacts to verbal violence in a sarcastic voice, asking users to be more humane and less degrading. The initiative does not stop there. Simply by saying “Alexa, speak up”, users can proactively choose to listen to information and insights about verbal violence.

In contrast, GPT-4o takes us back to the idea of a woman in a highly stereotyped and sexually connotated role, ready to respond to commands with a flirtatious, seductive voice.
Sam Altman, the CEO of OpenAI, does not seem to care much about the debate on gender bias in AI, and most probably neither does the company’s management. Recall that of the 702 (out of 750) employees who signed a letter at the end of last year calling for Sam Altman’s reinstatement, more than 75% were men. After Altman’s return, OpenAI’s new board of directors was composed exclusively of white men, a situation further exacerbated by the predominance of men among the executives. Only in March this year were three women appointed to the board, but a strongly male-oriented structure probably still prevails in the company’s direction.
On a pragmatic level, the choice of which voice to attribute to GPT-4o was probably based on criteria of pleasantness and uniformity with other voice assistants, without considering the risk of perpetuating gender stereotypes that are culturally harmful to achieving gender equality. The problem is not just that stereotypes and prejudices are reproduced from skewed real-world data shaped by the gender gap; the risk is that these stereotypes and prejudices will be multiplied. The algorithm that Amazon used from 2014 to 2018 to recruit programmers is a classic and oft-cited example of how AI can act as a discrimination multiplier. With a large number of resumes of male programmers as its source data, the algorithm could only continue to select men, deeming them preferable based on past experience. In this way, however, the algorithm contributed to widening the gap between men and women, preferring male hires, until the company realised that it was implementing a discriminatory process.
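The feedback loop described above can be sketched in a few lines of code. The following is a purely illustrative toy simulation, not the real Amazon system: all numbers are hypothetical, and the "model" is deliberately naive (it scores candidates by how often their gender appears among past hires). Even so, it shows how retraining on a system's own skewed output widens the gap round after round.

```python
# Toy simulation (illustrative only): a naive model trained on skewed
# historical hiring data amplifies the imbalance with each hiring round.
# All figures are hypothetical, not taken from the real Amazon case.

def select_hires(history, candidates, n_hires):
    """Score each candidate by how often their gender appears among
    past hires, then hire the top-scored candidates."""
    male_share = history.count("M") / len(history)
    scores = {"M": male_share, "F": 1 - male_share}
    ranked = sorted(candidates, key=lambda g: scores[g], reverse=True)
    return ranked[:n_hires]

# Historical hires are 80% male; the candidate pool is perfectly balanced.
history = ["M"] * 80 + ["F"] * 20
candidates = ["M"] * 50 + ["F"] * 50

for round_ in range(3):
    hires = select_hires(history, candidates, n_hires=10)
    history.extend(hires)  # new hires feed back into the training data
    male_share = history.count("M") / len(history)
    print(f"round {round_ + 1}: male share of all hires = {male_share:.0%}")
```

Because the majority group always scores higher, every round of hires is all-male, so the male share climbs from 80% toward 100% despite a balanced applicant pool: discrimination in the data becomes discrimination multiplied by the loop.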
This speaks volumes about how strategically important it is at this crucial time to have mixed AI working groups where gender difference is represented along with other differences such as birthplace or sexual orientation.
On gender, there is a considerable gap to close. The presence of women in STEM fields is very low. In ICT in particular, the percentage of female graduates remains around 16%, and this figure changes little when we look at women in tech jobs.
We need to break down the cultural barriers that lead girls onto paths that exclude scientific disciplines, paths that begin to take shape in the first years of life, from kindergarten onwards.
But it is crucial to understand that a significant part of the future gender equality game is being played out on this very field. Many organisations dealing with the gender gap still do not consider this, or they regard it as outside their scope of action.
Raising awareness and advocacy on these topics could make a difference. But the approach should not just be defensive or alarmist. It should make use of the opportunities offered by artificial intelligence to reveal and counter stereotypes.
That would be a realistic way to try to reduce the number of years that separate us from gender equality.

