Leila Seith Hassan
Apr 8, 2021

AI must acknowledge all sectors of society to serve everyone fairly

As the industry increasingly embraces AI, it must examine the gender bias within it.

While some mistakes generate funny images, others produce results that are damaging for individuals, sections of society and brands. Often, the benefits are assumed to outweigh the risks. But I challenge the idea that these mistakes are an acceptable price to pay, or too expensive to fix – we need to have the courage to say when systems are broken, or should never have been released for commercial use.

So, let’s look at how this happens and why it’s often considered too hard to fix.

Simplistically, algorithms – often referred to generically as AI – are built by applying maths to historical data in order to determine an outcome. They rely on what we feed them. That historical data is vast and messy, and it is full of bias: sexist outcomes, racist outcomes and more, reflecting a society that has not been particularly inclusive. What's more, the maths itself is complex.
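
To make that concrete, here is a minimal sketch – with entirely invented, synthetic data – of how a model trained to imitate biased historical decisions reproduces that bias, even though the maths itself is neutral:

```python
# A minimal, illustrative sketch: all data here is synthetic and invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical historical hiring records: skill is the only job-relevant
# signal, but past human decisions also favoured men (gender: 1 = man).
skill = rng.normal(0.0, 1.0, n)
gender = rng.integers(0, 2, n)
hired_in_past = (skill + 0.8 * gender + rng.normal(0.0, 1.0, n)) > 0.5

# Train a model to imitate those past decisions.
X = np.column_stack([skill, gender])
model = LogisticRegression().fit(X, hired_in_past)

# For two candidates with identical skill, the model's predicted chance
# of being hired differs purely by gender -- the past bias is learned.
woman, man = [[0.0, 0]], [[0.0, 1]]
print("P(hire | woman):", model.predict_proba(woman)[0, 1].round(2))
print("P(hire | man):  ", model.predict_proba(man)[0, 1].round(2))
```

The point is not this toy model but the pattern: the model is faithful to the past it was shown, prejudice included.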

Though we can't change our past, we can stop letting it dictate how inclusive our decisions and outcomes are today.

To make AI less sexist, we need to stop tolerating the following. 

First, stop objectifying women in images.

One of the latest issues involves image databases, which have long been problematic because of how images are labelled and how datasets are composed. One such AI completes cropped images of people. Fed a head-and-shoulders image of a man, it adds a suit and tie. Fed a head-and-shoulders image of Alexandria Ocasio-Cortez, the US Representative, it completed her body in a bikini. Algorithms like these are used in video-based candidate assessment and in marketing and advertising, among many other applications.

Second, stop perpetuating sexist and derogatory stereotypes.

A search for doctors usually returns men, while nurses are women; headmasters are male and teachers are female. Women also seem to live a narrower existence: 76% of the top 200 adjectives belong to men. Women, it seems, are all the same. But having this representation online impacts society. It shapes what boys and girls perceive the world to be. And so the cycle continues.
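
This is easy to see in the word embeddings that underpin many search and language systems. A small sketch using publicly available pre-trained vectors – the exact neighbours returned depend on which embedding you load, so treat it as an illustrative probe, not a definitive test:

```python
# Probe a public word embedding for gendered associations.
# Requires: pip install gensim (downloads the vectors on first run).
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")  # pre-trained GloVe vectors

# The classic analogy probe: "man is to doctor as woman is to ...?"
# Embeddings trained on web text often surface stereotyped answers here.
print(vectors.most_similar(positive=["doctor", "woman"],
                           negative=["man"], topn=3))
```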

Third, don’t forget that, *ahem*, women also use products.

While voice technology continues to expand into everything from car systems to in-home devices and even toys, voice recognition still struggles to recognise women's voices. We just aren't considered users of products. Given the lack of women in the tech world, this risks becoming harder and harder to rectify.
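
Measuring the gap is not hard, which makes it all the more frustrating that products ship with it. A hypothetical harness – the transcripts and groupings below are invented – could compare word error rates by speaker group:

```python
# Hypothetical check: compare a recogniser's word error rate (WER) by
# speaker group. The transcripts below are invented; in practice you
# would use real reference/hypothesis pairs from your own test set.
from jiwer import wer  # pip install jiwer

results = {
    "women": [
        ("turn on the kitchen lights", "turn on the kitchen flights"),
        ("play the news briefing", "play the new briefing"),
    ],
    "men": [
        ("turn on the kitchen lights", "turn on the kitchen lights"),
        ("play the news briefing", "play the news briefing"),
    ],
}

for group, pairs in results.items():
    references = [ref for ref, _ in pairs]
    hypotheses = [hyp for _, hyp in pairs]
    print(f"{group}: WER = {wer(references, hypotheses):.0%}")
```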

Finally, the default to ‘male’ needs to be actively readjusted.

Lenders use mountains of data to decide who gets which loans and at what rates. When you get a mortgage, buy insurance or open an account, have you ever noticed whose name goes first, regardless of who the primary earner on the application is? Even when gender is removed from applications, consider how these models have been built on historical – and still prevalent – assumptions about the male being the dominant party. The result is AI-driven outcomes in which women receive vastly poorer credit limits, credit scores and loan options.
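
And simply deleting the gender field doesn't fix it, because other fields act as proxies. Another synthetic sketch, showing how a model that never sees gender can still learn gendered approvals through a correlated feature (here an invented 'title' field):

```python
# Synthetic illustration: removing the gender column does not remove bias
# when another feature (here, a 'title' field) is correlated with gender.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000

gender = rng.integers(0, 2, n)           # 1 = man (never given to the model)
income = rng.normal(30.0, 5.0, n)        # income in £000s, same for everyone
title_mr = gender                        # proxy: 'Mr' ticked on the form
# Historical approvals favoured men at equal income.
approved = (0.1 * income + 1.0 * gender + rng.normal(0.0, 1.0, n)) > 3.5

# The model sees only income and the title field -- no gender column.
X = np.column_stack([income, title_mr])
model = LogisticRegression().fit(X, approved)

same_income = 30.0
print("P(approve | Ms):", model.predict_proba([[same_income, 0]])[0, 1].round(2))
print("P(approve | Mr):", model.predict_proba([[same_income, 1]])[0, 1].round(2))
```

The gender column is gone, but the outcome gap is not.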

We need to pay attention to these problems in order to move forward.

Although some have found a million creative ways to express misogyny online, algorithms themselves are not misogynistic – maths has no issue with gender. And, while I hope those working on them are not wilfully misogynistic, it’s important to remember that the results of this work translate in ways that can cause harm. 

We need to work to ensure that all sectors of society – across gender, race, economic circumstance and more – are included and acknowledged when designing datasets and algorithmic systems. Crucially, though, everyone in this industry can make more noise as companies continue to move forward without enough regard for the harmful outcomes that result. From challenge comes change.


Leila Seith Hassan is head of data science and analytics at Digitas UK

Source: Campaign UK
