Data scientist: Black box algorithms should not be applied to human outcomes

Algorithms must be transparent, accountable, and interpretable, says University of Sydney data science lecturer and expert

Where human outcomes are the goal, black box algorithms should not be allowed, a data scientist from the University of Sydney believes.

Speaking to CMO ahead of the University’s Ethics of Data Science conference next week, Dr Roman Marchant, lecturer and data scientist at the University of Sydney, said that while it is common practice for big companies to use black box algorithms to come up with an output, they should not be applied to the lives of humans.

This is because you cannot ‘open’ the black box and discover how the algorithm is coming up with the outcomes it does, so it is not transparent, accountable, or interpretable, he said.

“It is common for companies to use black box algorithms, otherwise known as deep learning or neural networks, to spit out an output,” Dr Marchant said. “We do use them for some problems, but it’s very tricky to use them on human data, and we are completely against using them when the result affects the lives of humans, because you cannot open the box and uncover what the algorithm is doing in the background.

“The type of problems where we do apply these models is in applications where no humans can be affected, such as for automated machinery and so on. But, when it comes to humans, we believe these models shouldn’t be allowed.”

While data and its analysis are key to improving the customer journey and personalisation, they are far from the silver bullet many marketers believe them to be, as most data contains inherent bias.


Beware of AI's inherent biases and take steps to correct them

According to the University of Sydney, algorithms are a fundamental tool in everyday machine learning and artificial intelligence, but experts have identified a number of ethical problems. Models built with biased and inaccurate data can have serious implications and dangerous consequences, ranging from the legal and safety implications of self-driving cars and incorrect criminal sentencing, to the use of automated weapons in war.

Before data is used, Dr Marchant advised that it be evaluated by an interdisciplinary team to ensure it is not riddled with errors, both to avoid stereotyping and to improve the customer experience.

“The models that are built are very generic and could be used for whatever outcome you want. All the models are fairly similar, except for the response variable. But there is a concern around biased data, and most companies don’t know what the existing bias is in the data,” he explained.

“For example, we all have the right to be offered the same product. Usually you wouldn’t discriminate and only offer a product to a certain subset of the population. However, if a data set has an internal bias suggesting that males are more likely to buy a product than females, you may end up only marketing that product to men, even though that apparent preference may simply reflect that all your marketing to date has been aimed at men. That means only men are buying it, which in turn biases the data further."

Context around the data is very important, Dr Marchant continued. "Companies need to be able to quantify and correct that bias, which is tricky to do," he said. “To do so, we have to consider the decisions that have been made in the past, and take them into account when building the models.

“Companies must start taking into account other explanatory variables, and they need to be constantly revising models and thinking about what they are doing. Causal effects need to be uncovered, not just correlations and predictions with black box models."
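The feedback loop Dr Marchant describes can be made concrete with a small sketch. Below is a hypothetical campaign log (the data, field names, and numbers are all invented for illustration): looking only at raw purchase rates suggests men buy more, but conditioning on who was actually shown the ad, a past targeting decision, tells a different story.

```python
# Hypothetical campaign log: (gender, was_shown_ad, purchased).
# Men were targeted far more often, so raw rates inherit that decision.
log = [
    ("M", True, True), ("M", True, False), ("M", True, True),
    ("M", True, False), ("F", True, True), ("F", False, False),
    ("F", False, False), ("F", False, False),
]

def purchase_rate(records):
    """Raw purchase rate: purchases divided by all members of the group."""
    return sum(1 for _, _, bought in records if bought) / len(records)

def exposed_purchase_rate(records):
    """Purchase rate among only those who were actually shown the ad."""
    exposed = [r for r in records if r[1]]
    if not exposed:
        return None  # the group was never targeted, so we have no evidence
    return purchase_rate(exposed)

for gender in ("M", "F"):
    group = [r for r in log if r[0] == gender]
    print(gender, purchase_rate(group), exposed_purchase_rate(group))
# M: raw 0.5, exposed 0.5 — F: raw 0.25, but exposed 1.0
```

In this toy data the raw rates (0.5 vs 0.25) would justify marketing only to men, yet among women who actually saw the ad, the purchase rate is higher. This is the sense in which past decisions must be taken into account before the data is trusted.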

This is why transparency, so a third party can assess which algorithm a company is using with personal data, is vital. "Interpretability means the algorithm needs to be understandable, to enable someone to understand why the algorithm came up with the prediction it did. It means you can make the algorithm accountable for existing problems, like bias,” Dr Marchant said.

“This is not only for the protection of a customer, it also means you can open the box and understand how an algorithm is coming up with predictions and therefore understand your customer in a better way."
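One way to read "opening the box" is that every prediction can be decomposed into per-feature contributions a third party can audit. The sketch below is a minimal, hypothetical illustration (the feature names and weights are invented, not from any company's model): a linear score whose breakdown shows exactly why a customer was scored as they were.

```python
# Hypothetical weights for a transparent customer score.
WEIGHTS = {"past_purchases": 0.6, "email_opens": 0.3, "tenure_years": 0.1}

def score(customer):
    """Linear score with an auditable per-feature breakdown.

    Returns the total score plus each feature's contribution, so an
    auditor can see which inputs drove the prediction.
    """
    contributions = {f: WEIGHTS[f] * customer.get(f, 0) for f in WEIGHTS}
    return sum(contributions.values()), contributions

total, breakdown = score(
    {"past_purchases": 3, "email_opens": 10, "tenure_years": 2}
)
print(total)      # 5.0
print(breakdown)  # email_opens contributes 3.0, past_purchases 1.8, tenure 0.2
```

A deep neural network offers no such decomposition out of the box, which is the contrast Dr Marchant is drawing between interpretable models and black boxes.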

The University of Sydney also recommends companies engage one of several institutes and data centres, such as The Gradient Institute, or an internal team of research engineers who can consult with third parties, to analyse the algorithms used and the way they make decisions, and to extract patterns from the data that indicate whether it is fair.

“Ultimately, if a big company has a lot of data and is using it to make decisions, it should have an internal team that does that, plus an external data team to evaluate that everything is transparent and done according to law," Dr Marchant said.

The government also has a responsibility to work on laws that protect both companies and users and help resolve conflicts around the use of data, Dr Marchant said. As a step forward, he noted the Australian Human Rights Commission has already assembled a group to examine how AI affects human rights.
