Is artificial intelligence riddled with bias?

Katja Forbes

Katja Forbes is an Australian pioneer in the field of experience design and all of its components – research, emerging technology, service design, customer experience (CX) and user experience (UX). She is the managing director of Designit Australia and New Zealand, a strategic design firm that works with ambitious brands to create high-impact products, services, systems and spaces that people love. Katja is proud to be International Director on the Global Board of the Interaction Design Association (IxDA). Back in 2014, Katja founded syfte, a specialist business in research and experience design. In late 2018, her business was acquired by the international firm Wipro, and she was appointed Australian Managing Director of Designit. Katja was also a co-founding member of IxDA Sydney. Together with Joe Ortenzi, she has built a community of over 1600 designers in Sydney, providing them with learning opportunities via lecture-based meetups that draw a crowd of 150 people each time, a mentoring program and workshops. One of Katja’s personal motivations is to inspire other women, especially in her industry, to reach toward their own definition of professional success. She is a sought-after keynote speaker for organisations and associations including Women In Design, Leaders in Heels, Women In Commerce, Code like a Girl (Melbourne Knowledge Week), the inaugural CMO Conference 2017, Telstra and Macquarie Bank. Katja was recognised as one of the Top 10 Australian Women Entrepreneurs 2018 by My Entrepreneur Magazine and named one of the 100 Women of Influence by Westpac and the Australian Financial Review in 2016.


The purpose of artificial intelligence (AI) has always been to take over the menial and repetitive tasks we do each day, in every sector, so that we can concentrate on doing what we do best. Saving time and money has certainly been a welcome outcome as AI spreads through the business landscape. However, we are now starting to see problems that cause major issues in practice.

One of the first widely reported cases of bias in an AI solution dates back to 2014, when Amazon began building an AI tool to evaluate job seekers’ resumes. The tool worked well aside from one major flaw: it turned out to be biased towards men.

In other words, the algorithm Amazon staff had spent years building housed a fatal flaw. In the end, the only option was to abandon the tool altogether, as Amazon could not guarantee it would not continue to favour male applicants.

The discrimination shown by the AI tool puzzled many at first. Obviously, it was not deliberately built to discriminate against any job seeker on the basis of sex, age or nationality. Rather, the discrimination emerged organically. The tool’s developers had trained it on the resumes of applicants who had secured positions over the previous ten years, and, because the tech industry has long been male-dominated, most of those successful applicants were men.

This example demonstrates the problem with building AI tools that use the past as the template for a machine-operated future. The aim is clearly a future in which menial tasks are automated and staff can focus on more important, skilled work. However, unwanted biases can easily creep in while these tools are being built, such as a strong preference for male applicants’ resumes, simply because the model was trained on past successful employees and those employees happened to be mostly male.
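To make that mechanism concrete, the minimal sketch below (a hypothetical illustration, not Amazon’s actual system) trains an ordinary classifier on invented, historically skewed hiring data and shows it reproducing the skew for two equally qualified candidates. The data, features and column meanings are all assumptions for this example.

```python
# Hypothetical sketch: a model trained on skewed historical hiring data
# learns to prefer one group. All data and features are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical decisions: each row is [is_male, years_experience].
# In this toy history, men were hired more often at the same experience level.
X = np.array([
    [1, 5], [1, 3], [1, 2], [1, 4],   # male applicants
    [0, 5], [0, 3], [0, 2], [0, 4],   # female applicants
])
y = np.array([1, 1, 1, 1, 1, 0, 0, 0])  # past hiring outcomes (1 = hired)

model = LogisticRegression().fit(X, y)

# Two candidates who differ only by sex, not by experience:
male_candidate = np.array([[1, 3]])
female_candidate = np.array([[0, 3]])

print("P(hire | male):  ", model.predict_proba(male_candidate)[0, 1])
print("P(hire | female):", model.predict_proba(female_candidate)[0, 1])
# The male candidate receives a noticeably higher predicted hire
# probability purely because the training history was skewed.
```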

The thing is, AI cannot afford to be biased, yet exposing it to biased data all but guarantees the bias will be taken on board.

We are now left with a dilemma. We all want artificial intelligence that assists us in our workplaces and produces consistent, objective results. However, the only material available to teach it is data that already exists, and that data is often far from objective.

Since the data we have access to when building AI tools is biased in nature, is there a way to strip that bias out and ensure our AI tools start from a clean slate, free from discrimination?

Because AI bias is a real concern that simply cannot be accepted, a number of tools have been created to detect and reduce it. For example, IBM has recently launched a tool that scans AI algorithms for bias and recommends adjustments.

Named AI Fairness 360, this open-source Python package includes a comprehensive set of metrics for testing datasets and models for bias, explanations of those metrics and, importantly, algorithms to mitigate bias in datasets and models. Tools like this are increasingly in demand as AI becomes more widespread and starts making more important decisions on our behalf.
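As a rough illustration of how such a package can be used, the sketch below builds a tiny, invented hiring dataset, measures the disparate impact between two groups with AI Fairness 360, and applies its reweighing algorithm to mitigate the imbalance. The data and column names are assumptions made for this example, not taken from IBM’s documentation.

```python
# Minimal sketch using IBM's AI Fairness 360 on a toy hiring dataset.
# The data and column names are invented for illustration.
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric
from aif360.algorithms.preprocessing import Reweighing

# Toy resume-screening history: sex (1 = male, 0 = female),
# years of experience, and whether the applicant was hired.
df = pd.DataFrame({
    "sex":        [1, 1, 1, 1, 0, 0, 0, 0],
    "experience": [5, 3, 4, 2, 5, 3, 4, 2],
    "hired":      [1, 1, 1, 0, 1, 0, 0, 0],
})

dataset = BinaryLabelDataset(
    df=df,
    label_names=["hired"],
    protected_attribute_names=["sex"],
    favorable_label=1.0,
    unfavorable_label=0.0,
)

privileged = [{"sex": 1}]
unprivileged = [{"sex": 0}]

# Measure bias: a disparate impact below 1.0 means the unprivileged
# group is hired at a lower rate than the privileged group.
metric = BinaryLabelDatasetMetric(
    dataset, unprivileged_groups=unprivileged, privileged_groups=privileged
)
print("Disparate impact before mitigation:", metric.disparate_impact())

# Mitigate: reweigh the training examples so both groups carry equal weight.
rw = Reweighing(unprivileged_groups=unprivileged, privileged_groups=privileged)
dataset_transf = rw.fit_transform(dataset)

metric_transf = BinaryLabelDatasetMetric(
    dataset_transf, unprivileged_groups=unprivileged, privileged_groups=privileged
)
print("Disparate impact after reweighing:", metric_transf.disparate_impact())
```

Reweighing adjusts instance weights before any model is trained, which is why it is classed as a pre-processing mitigation; the package also offers in-processing and post-processing alternatives for cases where the training data cannot be altered.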

We simply cannot rely on AI that has shown bias in any way, shape or form, lest our entire democracy be eroded for the sake of a technological solution we created to make our lives easier.

 

 
