You’d never know from Jacqueline Feldman’s background that she’d become a passionate proponent of gender equality in artificial intelligence. She went the dreamer’s route in college, studying English literature and writing at Yale. She prefers casual dresses and writing from the comfort of her book-lined Brooklyn apartment, where she can climb to the roof for cool air on sweltering nights.
But once Feldman was hired by Kasisto, a startup that makes artificial intelligence software for banks, to write the personality of a chatbot, she became vocal about the importance of taking gender out of the identity equation. Under her watch, MyKai, the bot in question, would be neither female nor male.
Feldman’s boss at Kasisto, Dror Oren, says the team’s work on the bot made him more outspoken about the need for equality in tech than he’d imagined going into the project, and he was a self-proclaimed feminist to begin with. Now he’s hyperaware of the difference between Kai’s personality and the overly feminine answers built into similar products from most large tech companies.
Kasisto is on to something. There’s Apple’s Siri, which the company occasionally promotes with titillating commercials reinforcing gender stereotypes, like the one where Jamie Foxx flirts with the female virtual assistant, asking if she has a crush on him. There’s Amazon’s Alexa, which the company introduced in a rollout video featuring a “man of the house” explaining all of the feminized assistant’s functions, while his fictional wife asks one question and gets chastised for it. And then there’s Amy, a meeting-scheduling bot from x.ai that works over email. The company proclaims on its site that Amy is asked out about once a month, which it says makes the bot “blush.”
Play with any of those products and you’ll find the same flirty attitude promoting the gender stereotypes that make equal-treatment folks irate. Ask Alexa to marry you and it will say, “Sorry, I’m not the marrying type”; ask it on a date and it will reply, “Let’s just be friends.” If you ask Siri “Who’s your daddy?” it will answer “You are…” before asking to get back to work. Microsoft’s Cortana sassily replies, “Of all the questions you could have asked,” to come-ons, something feminists will tell you makes the bot complicit in its own harassment.
Kai, on the other hand, will tell users via text to stop bothering it or say it’s time to get back to banking.
Sure, many of those other companies now offer a male-voice option, but it isn’t the default in the US, and in commercials for those products, the female voice is the star of the show.
Feldman says all this sexualized AI can be harmful to society.
“Some of these female-gendered personalities have what are called Easter eggs programmed into them,” said Feldman. “These are supposed to be surprising moments in the interaction, and they’re often jokes that are somewhat demeaning to the personality speaking with you.”
She adds: “If you tried that conversation on a real woman, you’d really be bothering her.”
That’s not to say Easter eggs shouldn’t exist; they’re one of the delights of AI. But rather than demeaning through a typically sexist or flirty joke, Kai will make self-aware jokes about not being alive. If you text it goodbye, it may reply, “That is the X in the top right, right?” When asked if it believes in love, Kai will respond, “Love throws me for a loop. Unconditional love is an infinite loop,” which is a nod to what happens when computers freeze. These sorts of answers make Kai distinctly artificial, not human.
Women continue to earn 79 cents for every dollar a man earns, and it certainly wouldn’t hurt their standing in society if the tech world at least thought more carefully about gender in AI. The stereotypically ladylike, deferential responses of so many virtual assistants reinforce society’s subconscious link between women and servitude. The average person’s only interaction with AI may be a female voice that can’t quite say “no, stop that,” and that’s not OK.
Even assistants that avoid being overly feminized, like Google’s, aren’t entirely gender-free. Google’s lacks a female name but still speaks with a woman’s voice. Those in the field often point to findings like those of the late Stanford professor Clifford Nass, who said people prefer the sound of a woman’s voice to a man’s.
Kasisto was able to sidestep some of these landmines because Kai’s personality is conveyed only through the written word. But the company isn’t buying the idea that a societal preference for female voices justifies keeping feminized personalities in a strictly assistant role. In fact, they say, mixing up gender in AI would be good for everyone. Companies are clearly thinking about it on some level; in the UK and France, for example, Siri defaults to a man’s voice, unlike the woman’s voice we hear in the US.
“I don’t want to sound pretentious around it, but I think they [other companies] need to think seriously about how they’re designing bots,” said Oren, Feldman’s boss and co-founder at Kasisto. “I feel that we’re putting Kasisto values out there. We want to feel proud with the way our bot interacts because it reflects our values as a company.”
Amazon and Google declined to comment for this story, and Apple didn’t respond to requests for an interview. Deborah Harrison, one of Microsoft’s personality writers for Cortana, says the team weighed the benefits of either gender when it began crafting the personal assistant but settled on female because it felt women are perceived as more helpful than men. Still, she says, they felt the weight of their decisions.
“This industry — digital assistants and AI research — is in many ways in its infancy, so the interactions we design now will, for better or worse, begin to become standardized through familiarity,” Harrison said via email.
Dr. Olga Russakovsky, a postdoctoral research fellow at the Robotics Institute at Carnegie Mellon, was spurred to action by how tech treats women, period. She told Engadget she started a computer-science camp for girls called SAILORS while at Stanford because of the disproportionately low number of women in the field. In 2011, only 18 percent of bachelor’s degrees and 20 percent of doctoral degrees in computer and information sciences were earned by women.
When designing the camp program, she tailored it to how girls learn, as opposed to conventional programs that tend to favor boys. Part of the problem with sexism in artificial intelligence appears to be that there aren’t enough women involved in its creation.
Russakovsky applauds anyone in artificial intelligence who tries to create an environment that treats women as equals. This isn’t about an overly PC society getting its dander up over nothing. One study she cites found hidden gender bias in a large random sample of online news text. She worries these subservient values will grow more entrenched over time, keeping women underrepresented in her field.
It’s possible that the loudest criticism of personalities like Cortana (which was initially based on a nude video-game character) has had some effect at large tech companies. Apple added a male voice option to Siri in 2013, two years after Siri was introduced. And personal scheduling software company x.ai introduced a male option a year ago, after debuting with female-only Amy.
But even these maddeningly slow additions might do little to actually reverse sexism within the very DNA of artificial personalities.
Until more people in computer science ‘fess up to the problem of overly sexualized bots, we seem doomed to travel along the same rutted tracks of homogeneous design, with too few women involved in the development of our Siris, Amys, Cortanas and Alexas. That leaves the small teams at companies like Kasisto at the forefront, dragging AI into a more inclusive world. Here’s hoping their colleagues at larger companies wake up and do the same.