Does AI have gender? We have a tendency to anthropomorphise technology. We name cars, boats, and sometimes even washing machines. We humanise the technical actors in our world, giving them human characteristics – fussy, moody, happy, sad. And we’re doing the same with AI.

And when it comes to gender, this counts. It counts because it both reflects current social norms and can, in part, shape how we view gender going forward.

I have a longstanding interest in this topic, going back to my PhD days, when my work focused on gender and the history of science and technology. That’s why I took up the chance to hear from Professor Gina Neff of the Oxford Internet Institute. Her talk examined how explicitly thinking through gender can help designers build AI systems that make better and fairer decisions.

How AI gets imagined

Expectations of what it is to be masculine or feminine are culturally embedded, and it starts from a young age. Go into any toy shop and you’ll see what I mean – girls’ toys tend to be relatively passive, while boys’ toys tend to be focused on activity, building, and mechanics. While there are lots of great products on the market that challenge these norms (see this ad, for instance), there are not enough.

So what’s this got to do with AI? Does AI have gender? In images, AI is often imagined as embodied, and commonly that body is female. The voices too – Alexa, Siri – are female: helpful, advisory, suggesting compliance. When the voices are male, they are authoritative and directing, and when embodied, battle-hardened and protective.

Giving AI human and gendered attributes is unhelpful. And as Gina notes, these representations have a real impact, continually replicating what I think of as tired and tedious gender norms – virtual assistants are female; AI in law and finance is male. It seems we’re yet again building systems that cast authority as a masculine attribute.

And there’s another problem

In the US in 1984, women made up around 34% of computer science majors. By 2014 it was closer to 18%. I don’t think there is any simple explanation for why this happened. It’s likely a combination of factors – coding increasingly being seen as something ‘boys’ and ‘men’ do, tech being marketed directly at boys, and so on.

And this shift is very much culturally specific. It is not the case in India or Iran, where the proportion of women in IT is much higher.

There is a clear recognition of the importance of diversity within AI. It has been reflected in pretty much every UK government and think tank report I’ve seen [you can find some key reports in my blog post here]. And I’m seeing more diverse teams at many of the events I go to.

For me, while having more women in AI is important as a matter of equality, it is not sufficient. Teams need to be made up of women and men from different backgrounds and from different disciplines. And it makes better business sense: in one piece of research on this, diverse teams made better business decisions 87% of the time, made decisions twice as fast with half the meetings, and delivered results that were 60% better.

The decision-makers

A theme – or, as I like to think of it, a deep-seated concern – that I keep coming back to is: who will make the decisions going forward? Will we get lazy? What if the computer says ‘no’ when it should say ‘yes’?

For Gina, we need to pay attention to how AI is rolled out into existing organisations, and how it becomes an actor within them. Will AI undermine judgement and voice, particularly those of women? Who will have the power to question decisions and provide feedback? This is where issues not only of transparency but also of explainability come in, and I think GDPR will have a positive impact here.

I’d welcome your thoughts – will AI provide different opportunities for questioning power and hierarchies?

It will be interesting to see.

Get in touch

If you have a question, you’re interested in working with me, or you’d just like a chat, drop me a message via my contact page.

Categories: Data & AI, Ethics