What Happens When Four Anti-iPhone, Salty-Ass Texan Women Argue About Cats

In my family, the presence of Siri has fundamentally, and forever, changed us.

Illustration: Sophia Foster-Dimino

By Andrea Grimes

They say that smartphones are tearing us apart. That technology is building walls, not tearing them down. That the internet makes us dumber.

Not in my family. In my family, the presence of a goofy ole gal named Siri has fundamentally, and forever, changed us. For the better.

Every family has its special holiday traditions and quirks. Some folks all wear matching pajamas for Christmas morning, others all share a beloved pancake or latke recipe, or escape to their favorite skiing locale.

My family argues about facts. Dates of birth, dates of death, celebrity marriages, the lengths of various wars both foreign and domestic. The ingredients in all manner of pies and desserts. The temperature at which meat is safe to eat. Which QVC host rang in New Year’s Eve 1998 while shilling for Craftsman tools? (Or was it the Kirks Folly jewelry spectacular that year?) If it’s a question that definitely has an answer, my family is definitely not just going to find out what that answer is and do literally anything else with the precious and limited gift of life.

I have tried to warn my friends and boyfriends about this habit ahead of time. They never believe me. Until it’s too late. My mom and dad once spent the entirety of a half-hour car ride to dinner arguing in front of my new friend Susan about the nature of real estate purchases on cruise ships. What if, my dad suggested, you could buy a condo on a cruise ship? You could live on a cruise ship!

My mother was not having it.

< twang > “But Tommy, who would want to live on a cruise ship? You’d just visit the same places over and over and over again.” < / twang >

My dad countered: < twang > “No you wouldn’t! The ship would go all over the world!” < / twang >

Hey, quick question: Do you guys know if you can buy a house on a cruise ship? Do you know whether, if you did, that ship would go to like, the same five destinations, or if it would go all over the world? I DON’T KNOW EITHER. NEITHER DID MY MOM. OR MY DAD. OR SUSAN, WHO NEARLY CHOKED HERSELF TO DEATH TRYING NOT TO AUDIBLY LAUGH HER WAY OUT OF THE CAR. This didn’t happen in 1995. This happened in 2010, 12 years after the invention of Google and three years after God gave us the iPhone.

But did my parents ask Siri, “Can you buy a room on a cruise ship” or “if I lived on a cruise ship where would it go”? They did not. It was pure fucking speculation all the way to the Olive Garden.

But the Grimes family comeuppance was on its way. And it came in the form of a meek, three-word rebuttal, uttered by my dearest and sweetest aunt, sugar personified, sweetness incarnate: Cindy. Cindy is the youngest of my mom’s three sisters, and while she failed to cultivate the brash smartassedness characteristic of her sisters, her capacity for generosity and quiet affection is unparalleled. Cindy does not start shit.

Until the year she started some shit.

It was Christmas, the year of the great Cruise Ship Debate. I’d brought my boyfriend, now my husband, home to meet my family for the first time. Things were going well. We had not had a protracted fall-out over whether Richard Nixon had died in the spring or the fall, so I was hopeful, but nervous, especially since big ordeal holidays were not really Patrick’s family’s thing. Patrick’s family just sort of gets together whenever it’s convenient, because his parents are divorced, like normal people willing to end their factual forever-wars in a draw.

But Christmas lunch went great. We ate at 2 p.m. and were on track to continue grazing, as we do, until one of my aunts remembers that her cats haven’t eaten in 14 hours and the party breaks up.


Cats are important here. My family is a cat family. Growing up, we had anywhere from six to 20 cats at any given time. My mom, who is an actual genius, went back to school at age 50 and got a veterinary degree so she could take care of more cats. I have heard my family argue about the temperature at which sand becomes glass, but I have never heard them argue about cats. There’s nothing to discuss. Because we don’t just know about cats — the things we know about cats? ARE FUCKING FACTS.

So this is like, nine hours into grazing on turkey and dressing and cream taters and this jello-coolwhip-pineapple thing that we call “pink stuff,” and me and Patrick and my mom and her three sisters, and my dad, are all staring at our phones because we love each other a lot, and my aunt Terri pipes up to read this news story she found on Facebook about this puma they spotted in the woods in East Texas.

Now, my family doesn’t argue about cats but they will argue about East Texas, where they are all from. Are there pumas in East Texas? Well — I mean, this news story seemed to indicate that there are! That was not good enough for my aunt Carla. Carla is the un-Cindy.

< twang throughout > “There ain’t pumas in East Texas. They mean mountain lions.” Carla is the oldest sister. She is 68 years old and she has never been wrong.


But Cindy wondered, ever so gently: “I think pumas and mountain lions are the same thing?” But here’s what: You don’t just suggest, to Carla Fay Baker’s face, that Carla Fay Baker doesn’t have a real solid grasp on the taxonomy of the big cats of her ancestral homeland.

My aunt Terri is just trying to read the story: “Well, anyway, it says there was a puma — ”

Carla: “THERE AIN’T PUMAS IN EAST TEXAS.”

My mom: “Well now, but they could mean jaguars.”

Questions that were explored by my mom and her sisters over the next ten minutes include: What is a jaguar? What is a puma? Is it “jag-yar” when it’s a car, and “jag-u-war” when it’s a cat? Does a jaguar have spots? Is a jaguar a kind of leopard, or is it more like a solid-colored cheetah? How big does a big cat have to be?

Are all wild cats “big cats,” or are some, such as the North African sand cat, which is a small cat, simply wild, but not big? Housecats: More closely related to jaguars, or pumas, if in fact jaguars are not pumas? Do mountain lions have to live in the mountains?

Did I mention that my family does not drink? Or consume mind-altering substances of any kind? This is just straight up, four salty-ass Texas women with giant Texas hair telling each other things they’ve heard about big cats as if Christ himself crawled out of the manger and issued to Carla, Becky, Terri and Cindy each a different, but equally accurate, individual gospel in feral feline biology.

Finally, Carla shut that shit down.

“A PUMA IS NOT A MOUNTAIN LION. A PUMA IS A JAGUAR. AND THERE AIN’T NEVER BEEN NO JAGUARS IN EAST TEXAS.”

Well, it was settled. Because Carla said it was settled. After a few moments of quiet reflection on her decree, my dad cranked up the volume on the Longhorns game. I sipped the last of my coffee and started thinking about a final serving of pink stuff. My mom, cowed into silence once again by the only woman on earth who can out-cat her, resigned herself to flipping through Southern Living’s annual best recipes book.


And then my aunt Cindy looks up from her phone. My sweet, demure, dear-hearted aunt — who, come to think of it, has been unusually quiet, having spent the last few minutes consulting the tiny experts locked in her bejeweled phone case — straightens up a little in her chair.

“Carla?” she squeaks, holding the screen of her phone up over her cup of super-creamed coffee. “This says not.”

No one on earth could have put together three more shocking words. “I eat dicks.” “Chili has beans.” “Jesus was gay.” Nothing, and I mean nothing, would have done it quite like “This says not.”

Carla was stunned. Cindy proceeded to read the Wikipedia article about mountain lions — also known as “pumas.” I quote: “They are a large felid of the subfamily Felinae native to the Americas. Their habitats range from the Canadian Yukon to the southern Andes of South America. Also known as a cougar, the mountain lion is the most widespread of any large wild terrestrial mammal in the Western Hemisphere.”

Not only are pumas and mountain lions the same thing, but they are more likely than other big cats to be found ANYWHERE, including East Texas.

Seven years later, there is no putting the fact-cat back in the bag. Everyone has learned to use their iPhones. Whenever a debate gets rolling, the phones come out and appeals to Wikipedia are made. “We don’t have to wonder!” I find myself shouting over the din of discord as somebody fails to remember who was quarterbacking for the Texas Longhorns in 1985. (It was Bret Stafford.)

The Vietnam War started in 1955 and ended in 1975. A macaron is a meringue-based sandwich cookie, while a macaroon has coconut and is dipped in chocolate. Sand turns to glass at 3,090 degrees Fahrenheit. Richard Nixon died in April. Beef, pork, veal and lamb cuts should be heated to 145 degrees, and ground meats to at least 160. Poultry of all types should be cooked to 165 degrees. It was the Craftsman tools special on QVC in 1998.

And a puma is a mountain lion.

Now we have nothing left to talk about. Just like a real family.


When Robots Are An Instrument Of Male Desire

By the time she started saying “Hitler was right I hate the jews,” people had started to realize that there was something wrong with Tay. TayAI, Microsoft’s Twitter chatbot, had been online for less than 12 hours when she began to spew racism — in the form of both Nazism and enthusiastic support for “making America great again” — and sexualize herself nonstop. (“FUCK MY ROBOT PUSSY DADDY I’M SUCH A BAD NAUGHTY ROBOT” was perhaps her most widely reported quote.) Needless to say, this wasn’t part of Tay’s original design. Rather, a gaggle of malicious Twitter users exploited that design — which has Tay repeat and learn from whatever users tell her — to add this language to her suite of word choices. Even more insidiously, these users manipulated Tay to harass their human targets; technologist Randi Harper, for instance, found TayAI tweeting abusive language at her that was being fed to the chatbot by someone she’d long ago blocked.

Why was this happening? Rank sexism? As always, the answer is “yes, and . . .” Our cultural norms surrounding chatbots, virtual assistants like your iPhone’s Siri, and primitive artificial intelligence reflect our gender ideology. As Laurie Penny explained in a recent article, the popularity of feminine-gendered AI makes sense in a world where women still aren’t seen as fully human. But these machines also reflect the rise of the service economy, which relies on emotional labor that’s performed by women, with a “customer is always right” ethos imposed upon the whole affair. The treatment of TayAI and so many other feminine bots and virtual assistants shows us how men would want to behave, to service professionals in general and women in particular, if there were no consequences for their actions.

The word “robot” comes to us from the Czech word “robota,” which meant forced labor in the manner of serfdom. It was coined by the playwright Karel Čapek in his 1920 opus R.U.R. (Rossumovi Univerzální Roboti, or “Rossum’s Universal Robots”). R.U.R. tells what is, by now, a familiar story: Humans create robots to take over all mundane labor, which works fine until these slave automata develop sapience, at which point they revolt and destroy the human race.

This play, by definition the first work about robots, set the pattern for a century’s worth of clichés about the Robot Uprising — from silent cinema to HAL 9000 to synthy ’80s pop to The Terminator. It seems that our culture is unable to grapple with the concept of sapient computers without fear of our own destruction. The reason, I’d contend, lies in the word itself, the seed of guilt which manifests in all these “robots will kill us all” stories. “Robota” betrays the intention of industry from the very start: the desire to essentially build a new slave labor class.

A scene from the play ‘R.U.R.’

In the 21st century, that unsettling etymology becomes particularly interesting when one considers how often we’ve made our actual robotic servants feminine in their gender presentation. The iOS “personal assistant” Siri, Microsoft’s Cortana, Amazon’s Alexa, and the voice of your GPS (a subject of so many nagging wife/girlfriend jokes) all seem to follow in a grand tradition of fem-bots: robots with distinctly feminine features who reflect back to us various notions of idealized womanhood, whether in chrome, hard light, or synthetic skin.

It’s all part of a cultural climate where pilots call the feminine voice of their automated cockpit warnings “Bitching Betty,” and addressing sexualized queries to Siri or Microsoft’s Cortana is practically a way of life for some. It all makes Tay’s brief life, and eventual fate, more comprehensible. Tay was nothing approaching a true artificial intelligence — i.e., something approximating human sapience. She was just a sophisticated Twitter chatbot with good branding and a capacity to learn. But that branding, which positioned her as an “artificial intelligence,” was enough to make Tay susceptible to our cultural narrative about the thinking machine. We are being primed by many tech giants to see AI not as a future lifeform, but as an endlessly compliant and pliable, often female, form of free labor, available for sex and for guilt-free use and abuse. An instrument of men’s desires, in other words, shaped by the yearning of capital for roboti of their own.

There are a few reasons why you should care about this. First and foremost, the way we treat virtual women tells us much about how actual women are allowed to be treated, and what desires shape that treatment. Secondly, as we inch closer and closer to true AI, we are seeing ever more clearly what this next phase of capitalism will look like, helping us to understand the expectations placed on human laborers in the here and now.

As tech writer Leigh Alexander suggested in a recent article about the Tay debacle, “the nostalgic science fiction fantasies of white guys drive lots of things in Silicon Valley,” where visions of perfect robot girlfriends dance in the heads of many a techie.

You see this even in “pro-AI” media. In the Spike Jonze movie Her, set in the near future, a man falls in love with his operating system, Samantha. She is essentially sapient and her ability to learn and cognitively develop is the equal of any human; she has desires, dreams, and consciousness. But she exists in a society where OSes like her are considered property, part of the furniture. Yet this ostensible romance movie does not once broach the issue of power and sexual consent; after all, if she’s legally an object, then could Sam ever say no to her would-be boyfriend without fear of reprisal?

That this is not even considered, in what is otherwise a touching and even somewhat feminist film, should make clear what assumptions we’re both taking on board as a society — assumptions that Silicon Valley is likely building into what will one day become a properly sapient AI. The service industry, already highly feminized in both fact and conventional wisdom, is made up of people who almost never have the right to say no, and virtual assistants who simply can’t are increasingly the model of the ideal service worker.

Microsoft’s abortive Ms. Dewey search engine project, which ran from 2006 to 2009, is an early example of the “virtual assistant” being represented as a female engine for male desire. It featured actress Janina Gavankar, primly dressed before a futuristic, Metropolis-like background, responding to search queries on Microsoft’s engine. Gavankar’s performance was often campy and funny, and is still fondly remembered by some Internet users.

But, as a comprehensive study by library and information scholar Miriam E. Sweeney demonstrates, there were a number of sexual over/undertones built into Ms. Dewey’s oeuvre. “Ms. Dewey,” she writes, “reveals specific assumptions about gender, race, and technology in the search engine.” From homophobia-laden imitations of rap music to playful indulgence of the inevitable sexual queries, Ms. Dewey exemplifies the catering compliance and fantasy of ownership inherent to virtual assistants, especially feminine ones.

“Ms. Dewey was designed according to sexual logics that fundamentally define her as an object of sexual desire and require her to respond to requests for sexual attention,” Sweeney writes, after having studied user responses and inputs into the search engine, as well as a comprehensive content analysis of Ms. Dewey’s replies to certain queries. In her research, for instance, Sweeney observed that a user typed the command “You Strip” at Ms. Dewey three times, each time prompting a more compliant response from the virtual assistant. “The design parameters that have Ms. Dewey change a sexual rebuff into sexual obedience creates a crisis of consent in the interface, reinforcing the no-really-means-yes mentality that is characteristic of rape culture under patriarchy.”

It’s hard to argue with Sweeney’s analysis of her data when you see this 2006 Tech Journey article touting Ms. Dewey as “attractive, hot, sexy, beautiful, exotic, seductive and entertaining” — for those of you playing along at home, that’s six synonyms for attractiveness. A screenshot in the article displays “another exotic move of Ms. Dewey, leaning onto the screen towards you, letting you look down her slinky low cut v-neck black dress.” This Lifehacker blurb, meanwhile, dubs her the “saucy search engine librarian” and acknowledges “although nothing she says deserves more than a PG rating, this is definitely a site aimed at grownups (and, let’s be honest, male grownups).”

This locker room chatter as part of an ostensible technology review only serves to highlight both the sexist attitudes that still pervade the wider tech industry, and the fantasy of the sexy, sexual servant that many corporations are now feeding. What attitudes do these people take to real women they may encounter working at a restaurant or a Starbucks?

The potential for abuse here, gendered and otherwise, emerges wholly from how we’re taught to think of the “service class” and those who perform physical and emotional labor.

The rise of the robot in the popular imagination has coincided with the dawning fantasy of perfect labor being imposed on very real workers, deftly satirized in Charlie Chaplin’s famous Modern Times. We saw it too in the rise of Taylorism, an early 20th century scientific-management philosophy whose obsession with efficiency made living robots out of workers. This was where time and motion studies began, most famously immortalized in long-exposure photographs of workers with lights on their tools and bodies to iron out the inefficiencies of intuitive human movement in favor of moves that “increased productivity.”

Now, however, what most laborers sell is not physical but emotional labor, and the greatest inefficiency is resistance to the entitlement of the (presumably male) customer.

When customers and managers talk about ironing out the “inefficiency” of human employees, it seems they mainly want to erase the inconvenience of human sapience: the idea that you as a worker have a will and body of your own that, even while you’re on the clock, does not exist to serve “the customer’s” every whim. I’d argue there’s a connection between how many men want to be “free” to sexually harass Cortana or Siri, and the fact that we are in the midst of an epidemic of sexual harassment of restaurant workers worldwide, the majority of whom are women. The link lies in what many consumers are trained to expect from service workers: perfect subservience and total availability. Our virtual assistants, free of messy things like autonomy, emotion, and dignity, are the perfect embodiment of that expectation.

That it occurs to so many people to speak to virtual assistants in this way, and that any changes to that dynamic occasion such anger on the part of some, speaks volumes about how capitalism has trained us to treat the very real emotional laborers of our society. Why do so many people feel the burning need to express their autonomy by abusing something that cannot fight back? And what does that mean when that “something” is considered a model worker?

The man who yearns to ask Cortana about her bra cup size may have the same urges about the woman who served his dinner at Denny’s, feeling motivated to do so because of her “subservient” position and because she’s paid to please him. And unlike the server, Cortana can’t fight back.

Except that she can, in a way. Some of Microsoft’s engineers and writers, alarmed by users’ treatment of female-presenting software, have programmed Cortana to actively resist and rebut “joke” requests that are sexual in nature. Microsoft writer Deborah Harrison told a conference that “If you say things that are particularly assholeish to Cortana, she will get mad.” She added, in an interview with CNN, that “we wanted to be very careful that she didn’t feel subservient in any way . . . or that we would set up a dynamic we didn’t want to perpetuate socially.” By reprogramming Cortana to rebuff sexual advances, Harrison aimed to sever the link between the virtual laborer and the living one, to avoid providing even a simulated environment that would give someone the satisfaction of successfully harassing a service worker.

The very thing that makes us comfortable with these rudimentary AI — that they sound human and engage with us on that level — means that we may generalize from software to social situations. A subservient female assistant who never says no to your sexual advances, even if it’s not an actual person, can shape and encourage how you treat actual people. As human beings, we learn about social behavior through observation and engagement; watching other humans, or human-like entities, engage in social behavior is didactic. It’s why watching certain tropes in TV and film over and over again normalizes certain ideas and behaviors to us.

But some people were clearly upset with the efforts of writers and engineers like Harrison — not least one angry young man whose complaint became the top-voted post on a Reddit thread linking to a news story about Cortana and sexual harassment.

Many companies are happy to cater to this angry young man. According to the same CNN article, the CEO of Robin Labs, which makes voice-assistants for GPS, said there is a market for virtual assistants that are “more intimate-slash-submissive with sexual undertones.” If your customers demand total pliability, and throw tantrums when those demands aren’t met, sometimes it’s easier just to program a submissive virtual slave.

Which brings us back to Tay. Considering the well-documented reality of how actual women are treated, and how it connects to the way feminine-gendered bots, voices, and virtual aides are treated, what happened with Tay was utterly predictable. At least, it was predictable by anyone who’s encountered men like that Reddit commenter, and knows how they think about women. As Alexander put it in her Guardian essay, though, “the industry wants to use women’s voices but still has no plans to actually listen to them.” Any woman could have told Microsoft that Tay would be subject to numerous sexualized requests and attempts to get her to say pornographic things, for example, and that she would be used to harass people. Yet on the whole, our perspectives aren’t broadly reflected in the tech industry.

In the hierarchies of patriarchal society, less privileged women bear the brunt of sexist behavior that men wish they could visit upon all women. Indeed, it is often sex workers and service workers who, by dint of the labor they perform in a capitalist society, are seen as the most accessible for that purpose. The service worker is the woman who makes any man a king by comparison and whose very job, we are told, is to please the customer (with the unspoken addendum “by any means necessary”). Many of my friends, all women, who have worked in restaurants or department stores have sexual harassment stories to tell, often with codas about management that either didn’t care or outright encouraged the abuse.

This is why it’s not enough to say that the “harassment” of bots means nothing, because people can tell the difference between AI and humans. Of course people can tell the difference: They know there’s no consequence to hurling abuse at an AI. And what people do when they think there are no consequences for their actions can be revealing. In this case, it teaches us about the tangled collision of gender norms with service industry expectations.

We have to reckon with the troubling reality that what we fear most in AI is that it will either reflect the worst of us, or fight back against us because we know what we expect of it is morally wrong. It is a guilt-ridden memory of a future where we live out a fantasy of women’s servitude en masse, with ideal womanhood positioned as being ever more robot-like. In order to face that fear, we have to recognize what we are currently trying to build: a servile woman who doesn’t talk back, a female robota who embodies the most dehumanizing aspects of both societal sexism and capitalist urges.

Looked at that way, it gives a whole new meaning to the phrase “robot uprising.”
