
Are smart assistants perpetuating misogyny?

Artificially intelligent virtual assistants are usually characterised as female, and experts say this paves the way for sexism

Smart assistants have made their way into our lives seamlessly. By default, these assistants are characterised as young, heteronormative, amicable women. Assistants like Siri, Alexa, Cortana, and even Google Assistant are feminised AI devices present in many households.

Videos and articles of people cussing at, probing, and passing suggestive remarks to their smart assistants have been doing the rounds for years now. This points to a larger question: are smart assistants normalising inappropriate behaviour towards women?

“When they have these friendly feminine serving personalities, users perceive that they are more open to abuse. The problem is not that we abuse the device, but what we’re doing is abusing a sort of feminine form,” said Jenny Kennedy and Yolande Strengers, researchers who wrote the recent book The Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot, in an interview with Mashable. While the assistants themselves aren’t women, their personalities are those of helpful, charming, and submissive “smartwives” who are ready to perform all the labour required to help the house.

Multiple studies, including a detailed UNESCO report titled ‘I’d blush if I could’, show how these assistants respond to slurs and sexual remarks. “I’d blush if I could” was the response Siri gave when told that she was a slut. After the #MeToo movement began, multiple updates changed how these assistants respond to slurs, but this doesn’t entirely solve the problem of the embedded patriarchy and sexualisation these “smartwives” come with.

[Image: Siri. Credit: dem10/istock.com]

The popularity of these devices is increasing in India too. The Electronics spoke to a psychologist about their probable impact.

“For many heterosexual people, preferring a voice of the opposite sex is normal. It is also normal to project feelings onto, and seek fulfilment from, movies and now technology. The problem arises when this behaviour becomes repetitive, when someone is predisposed to dysfunction, and when they start using these devices to avoid human interaction,” said Dr Manoj Kumar Sharma, professor of clinical psychology and coordinator at the Service for Healthy Use of Technology (SHUT) Clinic, National Institute of Mental Health and Neuro-Sciences (NIMHANS).

“The anonymity of these interactions makes people disinhibited, and they are more likely to say things they wouldn’t say publicly. Of course, these interactions might have changed over the lockdown period. With uncertainty and isolation, people are likely to use these devices to cope sexually,” he added.

Smart assistants also receive flak for creating sexualised expectations of women. After the Harvey Weinstein scandal, a petition on the social network Care2 garnered thousands of signatures, asking makers of smart assistants like Siri and Alexa to help shut down sexual harassment.

“We know that human beings have created these devices, and the implication is that the jobs of assistants, receptionists, homemakers and secretaries are usually for females. There is a disturbing stereotyping involved here. It perpetuates patriarchy, and girls and boys will grow up thinking that this is how women respond and behave,” said Swethlana S, a 22-year-old student and freelance writer.

In recent updates, these assistants have begun responding to abuse and sexual remarks more firmly, saying, “I will not respond to that” or “I don’t think that’s appropriate”.

A survey conducted during the lockdown by We-Vibe, a sex-toy manufacturer, found that 14% of men have sexual fantasies about their smart assistants. “The pandemic is a period of extreme uncertainty and can cause people to behave in ways they wouldn’t otherwise,” said Pavithra Vijaygopal, a consulting psychologist in Mumbai.

“There is a psychological impact to this. You always have a way of getting what you want, however you want it. It certainly changes the way we look at society. Maybe people are lonelier than ever. It gets problematic when you use technology as an extension of yourself,” she added.

Recently, it was announced that Amitabh Bachchan’s voice will be available for Alexa in India. It remains to be seen whether people will behave differently with smart assistants personalised as men. Although a male voice option has been available for a while, users have largely stuck with the female characters, which are also the default setting.
