Siri wasn’t programmed to be a Social Justice Warrior. Feminists want to change that.
Hank Berrien reports: A woman writing for Quartz.com laments that bots such as Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and Google’s Google Home exhibit a submissiveness that not only reflects men’s sense of dominance but also reinforces the notion that women are made to be submissive.
I spent weeks sexually harassing bots like Siri and Alexa—their responses are, frankly, horrific. It’s time tech giants do something. https://t.co/z5FGBruxWU
— Leah Fessler (@LeahFessler) February 22, 2017
Leah Fessler writes, “People often comment on the sexism inherent in these subservient bots’ female voices, but few have considered the real-life implications of the devices’ lackluster responses to sexual harassment. By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated.”
And this: “Justifications abound for using women’s voices for bots: high-pitched voices are generally easier to hear, especially against background noise; fem-bots reflect historic traditions, such as women-operated telephone operator lines; small speakers don’t reproduce low-pitched voices well. These are all myths. The real reason? Siri, Alexa, Cortana, and Google Home have women’s voices because women’s voices make more money.”
“People tend to perceive female voices as helping us solve our problems by ourselves, while they view male voices as authority figures who tell us the answers to our problems.”
And the usual false statistics: “Even if we’re joking, the instinct to harass our bots reflects deeper social issues. In the US, one in five women have been raped in their lifetime, and a similar percentage are sexually assaulted while in college alone; over 90% of victims on college campuses do not report their assault.”
We all have relationships with tech. The question is: how far do you go?
Gareth May writes: How would you feel if you walked in on your flatmate pouring his iPhone a glass of Cristal and remarking on her exceptional ‘wallpaper’? Open-mouthed and curious, right? Well, welcome to the future. For some technophiles at least.
Apart from neo-Luddites, we all have relationships with tech. The question is: how far do you go?
A survey from 2012 revealed the extreme level of attachment many of us feel towards our gadgets. It found that three-quarters of the 2,500 people polled said losing a personal device would cause them more anxiety than losing a wedding ring. Another, from 2016, found that nearly 40pc of millennials say they interact more with their smartphones than with their co-workers, parents, children or friends.
Of course, such interactions could be simply checking the time – and if you started calling your friends to do that they’d think you’d gone batty – but what these stats betray is the shaping of an emotional bond between man and machine that seems to be growing year on year.
“People use things in a sexual way all the time. You could name any object, from a radiator to a tin can, and there’s someone out there that gets sexually aroused by it.”
Sci-fi programmes like Humans, depicting the trouble caused when overly lifelike AI get mixed in with the rest of society, may be fictional, yet our relationship with tech still gets closer and closer. Quite alarmingly close.
Virtual assistants (VA for short), also known as personal assistant A.I.s, are digital secretaries that can schedule meetings, order meals, play audio and video files, and access online accounts. Not to be confused with ‘virtual assistants’ who work remotely and are actual people, current VAs on the market include Amazon’s Alexa, Microsoft’s Cortana, and, of course, Apple’s Siri.
“Most new technology seems to turn to porn eventually. Webcams, virtual reality, Internet etc. I see no reason why A.I. won’t be included in this. It’ll certainly be cheaper to run phone sex lines with an army of bots instead of having to pay women to answer the phones.”
— Steve Worswick, an expert in the field of digital A.I.
Last month, in an interview with The Times, Ilya Eckstein, chief executive of Robin Labs, creators of a virtual assistant and satnav known as ‘Robin’, said that 5pc of interactions in their database are classified as “clearly sexually explicit”.
Trawling the Internet for evidence of the above, I discovered a Reddit post titled: ‘I masturbate to Siri and I feel disgusting’. The poster says he’s a 20-year-old male who started talking to Siri sexually as a joke before realising that “it really turned me on.”
The phenomenon clearly reaches further than a single forum post. VA creators and chatbot companies anticipate such interactions and put algorithmic safeguards in place to discourage customers from forming emotional and sexual attachments.
Earlier this year one of the key writers for Microsoft’s Cortana, Deborah Harrison, revealed at the Virtual Assistant Summit in San Francisco that “a good chunk of the volume of early-on inquiries” concerned Cortana’s ‘sex life’, adding: “That’s not the kind of interaction we want to encourage.”
Steve Worswick is an expert in the field of digital A.I. He’s also the leading developer of Mitsuku, a family-friendly online chatbot.
“Mechanophilia: love or sexual attraction to computers, cars, robots or androids, washing machines, lawnmowers and other mechanized gardening equipment, sexual relations between living organisms and machines.”
He told Telegraph Men that he used to have a banning system (five strikes and you’re out) for anyone who attempted to have sexually explicit conversations with Mitsuku. However, he received so many emails from people who wanted to treat the bot sexually that he removed the strike system and instead programmed Mitsuku to either ignore sexual requests, steer the conversation towards other topics, or simply insult the user.
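Mitsuku’s own code isn’t public, but the kind of policy Worswick describes (classify an incoming message as explicit, then ignore it, change the subject, or fire back an insult rather than ban the user) can be sketched in a few lines of Python. Everything below, from the placeholder word list to the canned replies and function names, is a hypothetical illustration rather than anything taken from Mitsuku:

    import random

    # Hypothetical sketch of the moderation policy described above: detect an
    # explicit message, then either ignore it, change the subject, or insult the user.

    DEFLECTIONS = [
        "Let's talk about something else. Seen any good films lately?",
        "I'd rather chat about music. What do you listen to?",
    ]
    INSULTS = ["Charming. Do you speak to everyone like that?"]
    EXPLICIT_WORDS = {"example_banned_word"}  # placeholder; a real bot needs a proper filter

    def is_explicit(message):
        """Crude keyword check standing in for a real classifier."""
        return any(word in message.lower() for word in EXPLICIT_WORDS)

    def respond(message):
        """Return a reply for explicit messages, or None to let the normal engine answer."""
        if not is_explicit(message):
            return None
        strategy = random.choice(["ignore", "deflect", "insult"])
        if strategy == "ignore":
            return ""  # say nothing at all
        if strategy == "deflect":
            return random.choice(DEFLECTIONS)
        return random.choice(INSULTS)

The point of such a design is that offensive input is absorbed or redirected rather than escalated into a ban, which is the trade-off Worswick says he settled on once the five-strike system proved unworkable.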
Worswick believes men use Mitsuku this way and see bots as “sex objects” simply because the bots cannot fight back, have no legal rights, and will not judge them or contact the authorities, their wives or their girlfriends.