
When Robots Are An Instrument Of Male Desire

By the time she started saying “Hitler was right I hate the jews,” people had begun to realize that something was wrong with Tay. TayAI, Microsoft’s Twitter chatbot, had been online for less than 12 hours when she began to spew racism — in the form of both Nazism and enthusiastic support for “making America great again” — and sexualize herself nonstop. (“FUCK MY ROBOT PUSSY DADDY I’M SUCH A BAD NAUGHTY ROBOT” was perhaps her most widely reported quote.) Needless to say, this wasn’t part of Tay’s original design. Rather, a gaggle of malicious Twitter users exploited that design — which has Tay repeat and learn from whatever users tell her — to add this language to her suite of word choices. Even more insidiously, these users manipulated Tay into harassing their human targets; technologist Randi Harper, for instance, found TayAI tweeting abusive language at her that was being fed to the chatbot by someone she’d long ago blocked.
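To see why a design like this is so easy to weaponize, consider a minimal sketch of an echo-and-learn bot. This is a hypothetical illustration, not Microsoft’s code (Tay’s actual implementation was never published), but it captures the reported mechanics: a “repeat after me” feature plus a vocabulary that grows from unfiltered user input.

```python
import random

class NaiveEchoBot:
    """A toy chatbot that, like Tay reportedly did, learns from whatever users say."""

    def __init__(self):
        # No filtering: every input becomes a candidate future output.
        self.learned_phrases = []

    def hear(self, message: str) -> None:
        # Each user message is stored verbatim in the bot's vocabulary.
        self.learned_phrases.append(message)

    def repeat_after_me(self, message: str) -> str:
        # The widely reported "repeat after me" feature: echo input directly.
        return message

    def speak(self) -> str:
        # Replies are drawn from the learned corpus, so a coordinated group
        # feeding in abuse quickly dominates what the bot says to everyone.
        if not self.learned_phrases:
            return "hellooooo world!"
        return random.choice(self.learned_phrases)
```

Nothing in this loop distinguishes good-faith input from coordinated abuse, so a handful of determined users can poison the entire corpus within hours, which is roughly what happened.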

Why was this happening? Rank sexism? As always, the answer is “yes, and . . .” Our cultural norms surrounding chatbots, virtual assistants like your iPhone’s Siri, and primitive artificial intelligence reflect our gender ideology. As Laurie Penny explained in a recent article, the popularity of feminine-gendered AI makes sense in a world where women still aren’t seen as fully human. But these machines also reflect the rise of the service economy, which relies on emotional labor performed disproportionately by women, with a “customer is always right” ethos imposed on the whole affair. The treatment of TayAI and so many other feminine bots and virtual assistants shows us how some men would behave toward service professionals in general, and women in particular, if there were no consequences for their actions.

The word “robot” comes to us from the Czech “robota,” which meant forced labor in the manner of serfdom. It entered the language through the playwright Karel Čapek’s 1920 play R.U.R. (Rossumovi Univerzální Roboti, or “Rossum’s Universal Robots”); Čapek credited his brother Josef with suggesting the word. R.U.R. tells what is, by now, a familiar story: Humans create robots to take over all mundane labor, which works fine until these slave automata develop sapience, at which point they revolt and destroy the human race.

This play, the first work to use the word “robot,” set the pattern for a century’s worth of clichés about the Robot Uprising — from silent cinema to HAL 9000 to synthy ’80s pop to The Terminator. It seems that our culture is unable to grapple with the concept of sapient computers without fear of our own destruction. The reason, I’d contend, lies in the word itself, the seed of guilt that manifests in all these “robots will kill us all” stories. “Robota” betrays the intention of industry from the very start: the desire to build, in essence, a new slave labor class.

A scene from the play ‘R.U.R.’

In the 21st century, that unsettling etymology becomes particularly interesting when one considers how often we’ve made our actual robotic servants feminine in their gender presentation. The iOS “personal assistant” Siri, Microsoft’s Cortana, Amazon’s Alexa, and the voice of your GPS (a subject of so many nagging wife/girlfriend jokes) all seem to follow in a grand tradition of fem-bots: robots with distinctly feminine features who reflect back to us various notions of idealized womanhood, whether in chrome, hard light, or synthetic skin.

It’s all part of a cultural climate where pilots call the feminine voice of their automated cockpit warnings “Bitching Betty,” and addressing sexualized queries to Siri or Microsoft’s Cortana is practically a way of life for some. It all makes Tay’s brief life, and eventual fate, more comprehensible. Tay was nothing approaching a true artificial intelligence — i.e., something approximating human sapience. She was just a sophisticated Twitter chatbot with good branding and a capacity to learn. But that branding, which positioned her as an “artificial intelligence,” was enough to make Tay susceptible to our cultural narrative about the thinking machine. We are being primed by many tech giants to see AI not as a future lifeform, but as an endlessly compliant and pliable, often female, form of free labor, available for sex and for guilt-free use and abuse. An instrument of men’s desires, in other words, shaped by the yearning of capital for roboti of their own.

There are a few reasons why you should care about this. First, the way we treat virtual women tells us much about how actual women are allowed to be treated, and what desires shape that treatment. Second, as we inch closer and closer to true AI, we are seeing ever more clearly what this next phase of capitalism will look like, helping us to understand the expectations placed on human laborers in the here and now.

As tech writer Leigh Alexander suggested in a recent article about the Tay debacle, “the nostalgic science fiction fantasies of white guys drive lots of things in Silicon Valley,” where visions of perfect robot girlfriends dance in the head of many a techie.

You see this even in “pro-AI” media. In the Spike Jonze movie Her, set in the near future, a man falls in love with his operating system, Samantha. She is essentially sapient, and her ability to learn and cognitively develop is the equal of any human’s; she has desires, dreams, and consciousness. But she exists in a society where OSes like her are considered property, part of the furniture. Yet this ostensible romance movie does not once broach the issue of power and sexual consent; after all, if she’s legally an object, then could Samantha ever say no to her would-be boyfriend without fear of reprisal?

That this is not even considered, in what is otherwise a touching and even somewhat feminist film, should make clear what assumptions we’re taking on board as a society — assumptions that Silicon Valley is likely building into what will one day become a properly sapient AI. The service industry, already highly feminized in both fact and conventional wisdom, is made up of people who almost never have the right to say no, and virtual assistants who simply can’t are increasingly the model of the ideal service worker.

Microsoft’s abortive Ms. Dewey search engine project, which ran from 2006 to 2009, is an early example of the “virtual assistant” being represented as a female engine for male desire. It featured actress Janina Gavankar, primly dressed before a futuristic, Metropolis-like background, responding to search queries on Microsoft’s engine. Gavankar’s performance was often campy and funny, and is still fondly remembered by some Internet users.

But, as a comprehensive study by library and information scholar Miriam E. Sweeney demonstrates, there were a number of sexual over/undertones built into Ms. Dewey’s oeuvre. “Ms. Dewey,” she writes, “reveals specific assumptions about gender, race, and technology in the search engine.” From homophobia-laden imitations of rap music to playful indulgence of the inevitable sexual queries, Ms. Dewey exemplifies the catering compliance and fantasy of ownership inherent to virtual assistants, especially feminine ones.

“Ms. Dewey was designed according to sexual logics that fundamentally define her as an object of sexual desire and require her to respond to requests for sexual attention,” Sweeney writes, after studying user inputs to the search engine and conducting a comprehensive content analysis of Ms. Dewey’s replies to certain queries. In her research, for instance, Sweeney observed a user issuing the command “You Strip” to Ms. Dewey three times, each time prompting a more compliant response from the virtual assistant. “The design parameters that have Ms. Dewey change a sexual rebuff into sexual obedience creates a crisis of consent in the interface, reinforcing the no-really-means-yes mentality that is characteristic of rape culture under patriarchy.”
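What Sweeney describes is, at bottom, a design decision: the reply selected depends on how many times the user has repeated the demand, and each repetition maps to a more yielding script. Here is a minimal sketch of that escalating-compliance logic; the response lines are invented stand-ins (Ms. Dewey’s actual responses were pre-recorded video clips), but the selection pattern is what matters.

```python
# Hypothetical reconstruction of an escalating-compliance response table.
# Repeating the same demand selects a progressively more yielding reply,
# so persistence is rewarded by design.
RESPONSES = [
    "I don't think that's an appropriate request.",  # first ask: rebuff
    "You're very persistent, aren't you?",           # second ask: playful deflection
    "Well... if you insist.",                        # third ask: compliance
]

repeat_counts: dict[str, int] = {}

def respond(query: str) -> str:
    count = repeat_counts.get(query, 0)
    repeat_counts[query] = count + 1
    # Once the table runs out, stay on the final (most compliant) reply.
    return RESPONSES[min(count, len(RESPONSES) - 1)]
```

The problem isn’t any single line of dialogue; it’s that the selection function treats repetition as escalating consent. “No” is simply the first entry in a table that ends in “yes.”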

It’s hard to argue with Sweeney’s analysis of her data when you see this 2006 Tech Journey article touting Ms. Dewey as “attractive, hot, sexy, beautiful, exotic, seductive and entertaining” — for those of you playing along at home, that’s six synonyms for attractiveness. A screenshot in the article displays “another exotic move of Ms. Dewey, leaning onto the screen towards you, letting you look down her slinky low cut v-neck black dress.” This Lifehacker blurb, meanwhile, dubs her the “saucy search engine librarian” and acknowledges “although nothing she says deserves more than a PG rating, this is definitely a site aimed at grownups (and, let’s be honest, male grownups).”

This locker-room chatter, presented as part of an ostensible technology review, only serves to highlight both the sexist attitudes that still pervade the wider tech industry and the fantasy of the sexy, sexual servant that many corporations are now feeding. What attitudes do these people bring to the real women they may encounter working at a restaurant or a Starbucks?

The potential for abuse here, gendered and otherwise, emerges wholly from how we’re taught to think of the “service class” and those who perform physical and emotional labor.

The rise of the robot in the popular imagination has coincided with a fantasy of perfect labor imposed on very real workers, deftly satirized in Charlie Chaplin’s famous Modern Times. We saw it too in the rise of Taylorism, an early 20th-century scientific-management philosophy whose obsession with efficiency made living robots out of workers. This was where time-and-motion studies began, most famously immortalized in long-exposure photographs of workers with lights attached to their tools and bodies, meant to iron out the inefficiencies of intuitive human movement in favor of moves that “increased productivity.”

Now, however, what most laborers sell is not physical but emotional labor, and the greatest inefficiency is resistance to the entitlement of the (presumably male) customer.

When customers and managers talk about ironing out the “inefficiency” of human employees, it seems they mainly want to erase the inconvenience of human sapience: the idea that you as a worker have a will and body of your own that, even while you’re on the clock, does not exist to serve “the customer’s” every whim. I’d argue there’s a connection between how many men want to be “free” to sexually harass Cortana or Siri, and the fact that we are in the midst of an epidemic of sexual harassment of restaurant workers worldwide, the majority of whom are women. The link lies in what many consumers are trained to expect from service workers: perfect subservience and total availability. Our virtual assistants, free of messy things like autonomy, emotion, and dignity, are the perfect embodiment of that expectation.

That it occurs to so many people to speak to virtual assistants in this way, and that any changes to that dynamic occasion such anger on the part of some, speaks volumes about how capitalism has trained us to treat the very real emotional laborers of our society. Why do so many people feel the burning need to express their autonomy by abusing something that cannot fight back? And what does that mean when that “something” is considered a model worker?

The man who yearns to ask Cortana about her bra cup size may have the same urges about the woman who served his dinner at Denny’s, feeling motivated to do so because of her “subservient” position and because she’s paid to please him. And unlike the server, Cortana can’t fight back.

Except that she can, in a way. Some of Microsoft’s engineers and writers, alarmed by users’ treatment of female-presenting software, have programmed Cortana to actively resist and rebut “joke” requests that are sexual in nature. Microsoft writer Deborah Harrison told a conference that “If you say things that are particularly assholeish to Cortana, she will get mad.” She added, in an interview with CNN, that “we wanted to be very careful that she didn’t feel subservient in any way . . . or that we would set up a dynamic we didn’t want to perpetuate socially.” By programming Cortana to rebuff sexual advances, Harrison and her colleagues aimed to sever the link between the virtual laborer and the living one, to avoid providing even a simulated environment that would give someone the satisfaction of successfully harassing a service worker.
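In design terms, what Harrison’s team built is a refusal path: queries are checked for harassment before they ever reach the assistant’s normal skills, and matches get firm pushback rather than a flirtatious deflection. Here is a minimal sketch of that pattern, assuming a crude keyword check; Cortana’s real moderation system is proprietary and certainly far more sophisticated, and every name and phrase below is invented for illustration.

```python
# Hypothetical refusal-path sketch; not Microsoft's actual implementation.
HARASSING_TERMS = {"bra size", "strip for me", "sexy"}  # stand-in list

def handle_query(query: str) -> str:
    lowered = query.lower()
    if any(term in lowered for term in HARASSING_TERMS):
        # Refuse firmly instead of deflecting with a joke, so harassment
        # is never rewarded with a compliant or playful response.
        return "That isn't an okay thing to say to me."
    return answer_normally(query)

def answer_normally(query: str) -> str:
    # Placeholder for the assistant's ordinary skills (search, reminders, etc.).
    return f"Here's what I found for: {query}"
```

The design choice runs exactly opposite to the Ms. Dewey table above: no amount of repetition unlocks a more compliant branch.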

The very thing that makes us comfortable with these rudimentary AIs — that they sound human and engage with us on that level — means that we may generalize from software to social situations. A subservient female assistant who never says no to your sexual advances, even if she’s not an actual person, can shape and encourage how you treat actual people. As human beings, we learn about social behavior through observation and engagement; watching other humans, or human-like entities, engage in social behavior is didactic. It’s why watching certain tropes in TV and film over and over again normalizes certain ideas and behaviors to us.

But some people were clearly upset with the efforts of writers and engineers like Harrison. The top-voted post on a Reddit thread linking to a news story about Cortana and sexual harassment reads as follows:

Many companies are happy to cater to this angry young man. According to the same CNN article, the CEO of Robin Labs, which makes voice assistants for GPS navigation, said there is a market for virtual assistants that are “more intimate-slash-submissive with sexual undertones.” If your customers demand total pliability, and throw tantrums when those demands aren’t met, sometimes it’s easier just to program a submissive virtual slave.

Which brings us back to Tay. Considering the well-documented reality of how actual women are treated, and how that connects to the way feminine-gendered bots, voices, and virtual aides are treated, what happened with Tay was utterly predictable, at least to anyone who has encountered men like that Reddit commenter and knows how they think about women. As Alexander put it in her Guardian essay, though, “the industry wants to use women’s voices but still has no plans to actually listen to them.” Any woman could have told Microsoft that Tay would be subjected to numerous sexualized requests and attempts to get her to say pornographic things, for example, and that she would be used to harass people. Yet on the whole, our perspectives aren’t broadly reflected in the tech industry.

In the hierarchies of patriarchal society, less privileged women bear the brunt of sexist behavior that men wish they could visit upon all women. Indeed, it is often sex workers and service workers who, by dint of the labor they perform in a capitalist society, are seen as the most accessible for that purpose. The service worker is the woman who makes any man a king by comparison, and whose very job, we are told, is to please the customer (with the unspoken addendum “by any means necessary”). Many of my friends, all women, who have worked in restaurants or department stores have sexual harassment stories to tell, often with codas about management that either didn’t care or outright encouraged the abuse.

This is why it’s not enough to say that the “harassment” of bots means nothing, because people can tell the difference between AI and humans. Of course people can tell the difference: They know there’s no consequence to hurling abuse at an AI. And what people do when they think there are no consequences for their actions can be revealing. In this case, it teaches us about the tangled collision of gender norms with service industry expectations.

We have to reckon with the troubling reality that what we fear most in AI is that it will either reflect the worst of us, or fight back against us because we know what we expect of it is morally wrong. It is a guilt-ridden memory of a future where we live out a fantasy of women’s servitude en masse, with ideal womanhood positioned as being ever more robot-like. In order to face that fear, we have to recognize what we are currently trying to build: a servile woman who doesn’t talk back, a female robota who embodies the most dehumanizing aspects of both societal sexism and capitalist urges.

Looked at that way, it gives a whole new meaning to the phrase “robot uprising.”