The problem with this, says the report, is that it ‘reflects and reinforces’ the idea that assistants – acting in a support role – are female …

The report is titled I’d blush if I could, after what used to be one of Siri’s responses to being called a slut.

The report is particularly concerned about the subconscious message sent to children, who are exposed to intelligent assistants (IAs) from a young age, when those assistants are female by default.

Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’. The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.

As voice-powered technology reaches into communities that do not currently subscribe to Western gender stereotypes, including indigenous communities, the feminization of digital assistants may help gender biases to take hold and spread. Because Alexa, Cortana, Google Home and Siri are all female exclusively or female by default in most markets, women assume the role of digital attendant, checking the weather, changing the music, placing orders upon command and diligently coming to attention in response to curt greetings like ‘Wake up, Alexa’.

A secondary issue, it argues, is the way that IAs respond flirtatiously to offensive comments.

Professor Noble says that the commands barked at voice assistants – such as ‘find x’, ‘call x’, ‘change x’ or ‘order x’ – function as ‘powerful socialization tools’ and teach people, in particular children, about ‘the role of women, girls, and people who are gendered female to respond on demand’. Constantly representing digital assistants as female gradually ‘hard-codes’ a connection between a woman’s voice and subservience.

According to Calvin Lai, a Harvard University researcher who studies unconscious bias, the gender associations people adopt are contingent on the number of times people are exposed to them. As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase dramatically.

According to Lai, the more that culture teaches people to equate women with assistants, the more real women will be seen as assistants – and penalized for not being assistant-like. This demonstrates that powerful technology can not only replicate gender inequalities, but also widen them.

Siri, for example, used to respond to ‘You’re a slut’ with replies that included ‘I’d blush if I could’ and ‘Well, I never!’ Apple has since changed the response to ‘I don’t know how to respond to that.’

In 2017, Quartz investigated how four industry-leading voice assistants responded to overt verbal harassment and discovered that the assistants, on average, either playfully evaded abuse or responded positively. The assistants almost never gave negative responses or labelled a user’s speech as inappropriate, regardless of its cruelty.

Siri defaults to a female voice in most countries, though, curiously, there are some exceptions where a male voice is the default.

An Indiana University study back in 2017 found that both men and women prefer a female voice, describing it as welcoming, warm and nurturing – though preferences varied by content in rather stereotypical ways.
