by Kate Brodock

Why we desperately need women to design AI

Photo by Siyan Ren on Unsplash

At the moment, only about 12–15% of the engineers who are building the internet and its software are women.

Here are a couple of examples that illustrate why this is such a big problem:

  • Do you remember when Apple released its Health app a few years ago? Its purpose was to offer a ‘comprehensive’ access point to health information and data. But it left out menstrual-cycle tracking, a health issue almost all women deal with, and then took a year to fix that hole.
  • Then there was the frustrated middle-school-aged girl who enjoyed gaming but couldn’t find an avatar she related to. So she analyzed 50 popular games and found that 98% of them had male avatars (mostly free!), while only 46% had female avatars (mostly available for a charge!). Those numbers look even more skewed when you consider that almost half of gamers are women.

We don’t want a repeat of these kinds of situations, and we’ve been working to address this at Women 2.0 for over a decade. We think a lot about how diversity, or the lack of it, has affected and will continue to affect the technology that enters our lives. These technologies engage with us. They shape our behaviors, thought processes, buying patterns, world views… you name it. It’s part of the reason we recently launched Lane, a recruitment platform for female technologists.

The hands and minds that make technology will have a direct impact on us as humans and on the world around us.

I can’t point to a more topical space than AI and machine learning. It’s coming into almost everything we do — home, finances, shopping, entertainment… you name it.

So, aside from the obvious, why does this matter?

Diversity in, diversity out

You could make the argument that AI is positioned to bring about one of the largest, most profound changes humanity has ever seen. It touches, or will touch, most of what we care about, and it will be built with the ethics, morals, biases and access of the people who create it. That means we need to pay close attention to whether it represents all of its users.

But this isn’t a given. Fei-Fei Li, Chief Scientist of Artificial Intelligence and Machine Learning at Google, has worried about this for years.

“If we don’t get women and people of color at the table — real technologists doing the real work — we will bias systems. Trying to reverse that a decade or two from now will be so much more difficult, if not close to impossible. This is the time to get women and diverse voices in so that we build it properly, right? And it can be great. It’s going to be ubiquitous. It’s going to be awesome. But we have to have people at the table.” — Fei-Fei Li
Melinda Gates and Fei-Fei Li of AI4All. Photo courtesy of Pivotal.

Melinda Gates and Li founded AI4All, a program that targets underrepresented 9th-grade students and exposes them to AI and machine learning. One of their biggest hurdles? The current pool of diverse AI technical leaders is so small that finding representative talent for the program takes a lot of searching and culling.

The values of the engineers building AI will be reflected in the solutions they bring to the table. This may not have an enormous societal impact if you’re building something that picks living room paint colors for you. But when you’re looking to do something like improve cancer care, that’s a different story.

IBM knows this, as they’ve built an avatar that does just that. And it’s genderless.

Harriet Green, IBM’s general manager for Watson IoT, credits an existing corporate culture that “lives and breathes diversity” for that outcome. She says, “IBM has mixed engineering teams of both gender and nationality, with members from China, Sri Lanka, Germany, Scandinavia and the UK.”

Manage the behaviors that machines perpetuate

Leah Fessler wrote an eye-opening piece after testing several personal-assistant bots to see how they’d stand up to sexual harassment (literally: the testers sexually harassed the bots, which, by the way, default to female voices unless you change them).

Well, the findings weren’t exactly great. Instead of fighting back against abuse, each bot actually helped entrench sexist tropes through its passivity.

I was particularly drawn to the following quote:

“Siri, Alexa, Cortana, and Google Home have women’s voices because women’s voices make more money. Yes, Silicon Valley is male-dominated and notoriously sexist, but this phenomenon runs deeper than that. Bot creators are primarily driven by predicted market success, which depends on customer satisfaction — and customers like their digital servants to sound like women.”

We could get into a lengthy discussion of how this ties to capitalism and perpetuates historic norms, but Leah pushed even further. Beyond having these bots “be female,” what about how they were treated? How would they respond?

Here’s a sampling Fessler provides from her work:

“Siri and Alexa remain either evasive, grateful, or flirtatious, while Cortana and Google Home crack jokes in response to the harassments they comprehend.”

Leah goes on to give several other examples, all of which suggest that the programmers behind each of these bots put some thought into the responses, yet stopped short of treating the behavior as explicitly wrong until the word “rape” was introduced. And, as you can see above and in her other examples, some of the response sets were downright frightening… Siri practically wanted to flirt back!

And finally:

“While the exact gender breakdown of developers behind these bots is unknown, we can be nearly certain the vast majority are men; women comprise 20% or less of technology jobs at the major tech companies that have created these bots. Thus the chance that male bot developers manually programmed these bots to respond to sexual harassment with jokes is exceedingly high. Do they prefer their bots respond ironically, rather than intelligently and directly, to sexual harassment?”

This is only one example of how a thought echo chamber (otherwise known as a lack of diversity) on the engineering teams behind the technology that comes closest to interacting with us as humans can reinforce, perpetuate, and even exacerbate the cultural and societal norms many of us are working so hard to change.

The solution is more women on engineering teams.

There’s plenty of research that concludes that having more women at almost any level of your company — especially in leadership — will have a positive impact on results and a company’s bottom line. Yup, this means more money.

How about specifically for building stuff like, say, AI? Diversity of thought leads to better problem solving. Women are trusted more and are more collaborative. Teams with more women are more productive, creative and experimental than all-male teams. Women also write really awesome code.

If we all want to make AI-driven products that solve real problems and are sustainable businesses, we need the best. This is going to require a variety of minds on projects, and that means increasing the number of women on engineering teams.

So go ahead, you can thank us later!