Is artificial intelligence riddled with bias?

Katja Forbes

Katja is an Australian pioneer in the field of experience design and all its components. Back in 2014, Katja founded syfte, a specialist business in research and experience design acquired by Wipro in 2018. She was then appointed Australian MD of Designit. Katja was also a co-founding member of the Interaction Design Association (IxDA) board in Sydney, helping build a community of over 1600 designers. Today, Katja is international director on IxDA’s global board. Katja is a sought-after speaker for organisations including Women In Design, Leaders in Heels, Women In Commerce, Code like a Girl, Telstra and Macquarie Bank. Katja was recognised as a Top 10 Australian Women Entrepreneurs 2018 by My Entrepreneur Magazine and one of the 100 Women of Influence by Westpac and the Australian Financial Review in 2016.


The purpose of Artificial Intelligence (AI) has always been to replace the menial and repetitive tasks we do each day in every sector, so that we can concentrate on doing what we do best. Saving time and money has certainly been a decent outcome as AI infiltrates the business landscape. However, we are now starting to see problems that cause major issues in practice.

One of the first major cases of bias in an AI solution was recorded back in 2014, when Amazon created an AI tool to evaluate job seekers’ resumes. The tool worked perfectly aside from one major flaw – it turned out the tool was biased towards men.

In other words, it turned out the algorithm Amazon staff had spent years creating housed this fatal flaw. In the end, the only thing that could be done was to abandon the AI altogether, as Amazon couldn’t guarantee it wouldn’t favour male applicants.

The discrimination shown by the AI tool puzzled many at first. Obviously, it wasn’t specifically created to be discriminatory toward any job seeker, regardless of their sex, age or nationality. Instead, the discrimination emerged organically. The tool’s developers realised they had trained it on the resumes of everyone who had secured positions over the past 10 years, and more of those successful applicants had been men, simply because this is how this male-dominated industry has always been skewed.
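To see how this can happen, here is a deliberately simplified sketch – the data, tokens and scoring function are all invented for illustration, and the real Amazon system was far more complex. A naive resume scorer "trained" purely on historically skewed hires ends up rewarding whatever tokens correlate with past successful, mostly male, applicants, without anyone coding the bias in explicitly:

```python
# Toy sketch with invented data: a naive resume scorer "trained" on
# historical hires. Because past successful hires skew male, tokens that
# correlate with male applicants receive higher weights automatically.

from collections import Counter

# Hypothetical tokenised resumes of past successful hires (skewed sample)
historical_hires = [
    ["engineer", "rugby", "captain"],
    ["developer", "chess", "captain"],
    ["engineer", "chess", "rugby"],
    ["developer", "womens", "netball"],  # only one such resume in the data
]

# "Training": weight each token by how often it appears among past hires
weights = Counter(tok for resume in historical_hires for tok in resume)

def score(resume):
    """Sum the learned weights of a candidate's tokens."""
    return sum(weights[tok] for tok in resume)

# Two equally qualified applicants, differing only in one gendered signal
applicant_a = ["engineer", "rugby"]
applicant_b = ["engineer", "womens"]

print(score(applicant_a), score(applicant_b))  # applicant_a scores higher
```

The bias here comes entirely from the skewed sample, which is exactly the mechanism the article describes.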

This example demonstrates the problem with creating AI tools that use the past as a template for a machine-operated future. The main aim is clearly to create a future where menial tasks don’t factor and staff can focus on more important skilled work. However, it is easy to see how unwanted nuances can creep in while these tools are being created, such as a strong preference for male applicants’ resumes simply because the tool has been exposed to successful past employees, and these turned out to be male.

The thing is, AI cannot afford to be biased, but exposing it to biased data will simply ensure it takes that bias on board.

We are now left with the following dilemma. We all want to have access to artificial intelligence that will assist in our workplaces and create consistent and objective results. However, to assist with its learning, all we have is data that already exists, and this data is often far from objective.   

Since the data we have access to in order to create various AI tools is biased in nature, is there a way we can strip that bias out to ensure our AI tools have a clean slate and are free from discrimination?

As AI bias is a real concern that simply cannot be accepted, a number of tools have been created to detect and mitigate it. For example, IBM has recently launched a tool that will scan for bias in AI algorithms and recommend adjustments in real time.

Named AI Fairness 360 and delivered as a Python package, it includes a comprehensive set of metrics for datasets and models to test for biases, explanations for these metrics, and, importantly, algorithms to mitigate bias in datasets and models. Tools like this are increasingly in demand as AI grows in use and starts making more important decisions on our behalf.
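As an illustration of the kind of metric such toolkits report, the sketch below computes "disparate impact" – the ratio of favourable-outcome rates between an unprivileged and a privileged group – from scratch on invented hiring data. AI Fairness 360 offers this metric, among many others, through its own dataset and metric classes; the data and helper function here are hypothetical stand-ins, not the toolkit’s API:

```python
# Minimal from-scratch illustration of one common fairness metric,
# "disparate impact". All records below are invented for illustration.

def disparate_impact(records, group_key, outcome_key,
                     unprivileged, privileged):
    """Ratio of favourable-outcome rates: unprivileged / privileged.
    A value near 1.0 suggests parity; well below 1.0 is a red flag."""
    def rate(group):
        members = [r for r in records if r[group_key] == group]
        favourable = [r for r in members if r[outcome_key] == 1]
        return len(favourable) / len(members)
    return rate(unprivileged) / rate(privileged)

# Hypothetical historical hiring outcomes (1 = hired, 0 = rejected)
applicants = [
    {"gender": "male", "hired": 1}, {"gender": "male", "hired": 1},
    {"gender": "male", "hired": 1}, {"gender": "male", "hired": 0},
    {"gender": "female", "hired": 1}, {"gender": "female", "hired": 0},
    {"gender": "female", "hired": 0}, {"gender": "female", "hired": 0},
]

di = disparate_impact(applicants, "gender", "hired", "female", "male")
print(f"Disparate impact: {di:.2f}")  # 0.25 / 0.75 = 0.33
```

A score this far below 1.0 is exactly the kind of result that would prompt a toolkit like AI Fairness 360 to recommend a mitigation step, such as reweighting the training data.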

We simply cannot rely on AI that has shown bias in any way, shape or form. Otherwise, our entire democracy is eroded simply for the sake of a technological solution that we created to make our lives easier.