Sofía Cetrulo had no inkling, when she arrived from her native Uruguay, that she would not only pursue a double university degree but also take her first steps in research. And what a way to begin that journey: the recent International Student Congress gave an award to her work on gender neutrality in virtual assistants.
Her study questions gender roles in smart-technology software; in other words, it asks why the virtual assistants from Apple, Microsoft, Amazon and Google speak with female voices. Its main conclusion is unequivocal: there are gender biases and social prejudices, a product of the lack of diversity in the sector that develops these assistants.
The research grew out of her innate curiosity. "In the HEAVEN they gave the students an Alexa. One day, while driving, I found myself wondering why virtual assistants were mostly 'women'," she explains. After discussing it with her teacher Maite Pastor, she embarked on the adventure. That innocent question led her to a timely and relevant research area, and she began to see herself as both student and researcher.
The first thing she did was put the phenomenon in context. According to the research firm Juniper Research, by the end of 2023 there will be more virtual assistants in the world than people, and the assistants from Apple, Microsoft, Amazon and Google concentrate 90% of the market, in both volume and frequency of use. Moreover, whatever the software, they all share one trait: their names are female and their voices are female. "Their own creators even refer to these assistants with the pronoun 'she', further reinforcing the collective image of the female virtual assistant," warns Sofía.
She therefore set out to determine why most virtual assistants are exclusively, or by default, "female"; to establish the relationship between the feminization of virtual assistants and gender biases; and to expose the consequences of this association with the female gender.
Through content analysis, her research verifies a direct link between the adjectives used to describe virtual assistants and the gender stereotypes commonly associated with women. "Female voices in virtual assistants are therefore not a question of tone of voice; they derive from a culture of stereotyping women."
The work also points to the role of these assistants' algorithms. "When algorithms are fed information loaded with social prejudices, they learn from them and have the power to reproduce and amplify them," she warns. "What's more," she adds, "at present there are no optimization algorithms that allow us to identify these biases."
Consequently, according to Sofía, these assistants become a powerful tool for the proliferation of social prejudice, "affecting the way in which people, and children in particular, internalize and understand gender roles." In the opinion of this young researcher, the situation can improve, just as great strides have been made in the broader fight against social prejudice. "Generating a change in the culture of technology is vitally important, to make it more inclusive and respectful of society as a whole," she points out.
Beyond expanding her knowledge, Sofía sees other benefits in this work. "For example, preparing for future situations in which I will again have to defend my work before academic panels." In fact, the most satisfying part of the experience has been researching a topic she is passionate about and finds relevant. "Having received recognition for it is a plus that increases my happiness and my desire to continue on this path," she concludes.
Eddie is an Australian news reporter with over nine years in the industry and has published on Forbes and TechCrunch.