ANZ data chief: Marketers must get a better handle on data ethics
- 31 August, 2018 11:09
Building your organisation’s data muscle successfully is about keeping consumer data safe, not losing it, and being ethical in how you apply it to deliver end-user value.
Those are the three key pillars of good data culture detailed by ANZ chief data officer, Emma Gray, who shared her views on what it takes to harness data for marketing and experience enhancement in today’s GDPR-regulated, customer-fuelled environment at this year’s ADMA Global Forum.
Gray told attendees the capacity to utilise data in consumer engagement is commonly perceived to be virtually limitless, provided organisations have the imagination to apply it and the ability to adapt and learn. However, she stressed the need for integrity in how we use data, and said marketers must take responsibility for how data utilisation unfolds or risk severe consumer and brand repercussions.
“Marketers couldn’t be happier – real-time insights are driving real-time interactions with customers at that vital moment when we’re most captive. It’s a fabulous place to be,” Gray said. “Big data is awesome. We’re effortlessly reaching out to customers, leveraging your customers – whose locations I buy from someone else – and all at scale. It seems to leave a lot of time for lunch, right?
“Wrong. It actually represents a sea change in the nature of marketing in terms of skillset requirements, interactions across businesses and with regards to the role of ethics in the lexicon. It’s not a nice-to-have, it’s a fundamental backbone in the new commercial reality.”
Any utilisation of data has to start with consent, something that’s been highlighted by the introduction of Europe’s GDPR laws in May this year, Gray said. She noted GDPR’s impact is being felt everywhere, either through specific legislation or its influence.
“It’s a seismic shift. The intent is to give consumers the ability to control their data and the right to be forgotten. If any of us doubted we are in the age of the digital consumer, it’s definitely a wake-up call,” she said. “If you’re using my data for anything, you better be giving me some value… if you’re sharing data with another, you better have my permission. And if you fail on basic recognition, I’ll punish you, and if you give that data to someone else without my permission, I’ll end my relationship with you.”
Gray then stressed three aspects of data use of particular concern: the ‘black box’ effect, ‘people like you’, and unintended consequences.
The black box effect refers to algorithms written by companies that keep the logic behind them secret, she explained. “Maybe we don’t have the confidence or training to interrogate these, or IP comes with opacity – we trust then execute against outcomes that may not be fair,” she said. Examples include the US judicial system handing down sentences 22 per cent longer on average for black men than white men, based on unfair bias in datasets.
‘People like you’ refers to inferred data sets, combined through algorithms to make smart, predictive assumptions. This could range from how to allocate advertising, to where to deploy police on the street based on demographics, to the likelihood of an individual committing a crime based on people like them. These tangential data sets are designed to help us get as close to perfect as we can using information through inference, but can lead to negative consequences for organisations that tap them blindly, Gray said.
The third is unintended consequences. An example Gray pointed to is the Strava data visualisation heatmap, which showed the activity of all users. “They never imagined 13 trillion data points would be a significant threat to US military security,” she said.
All of us have a role to play in the responsible use of data as big data advances, and Gray emphasised the importance of ethical data use as a critical muscle for all organisations to build. So how do you get there?
“First up, ask: what data sources can we use to create a proposition? Are they ethical and fair within the specific context? What data points are actual and historic versus tangential and inferred?” Gray asked. “Have we interrogated the algorithm for unintended bias? And does it pass the ‘sunshine test’ – are we happy to share with customers the information we have on them and what we do with it?”
Good ethics don’t date, Gray continued. “Some information that looks innocent, such as height, could be seen as a proxy for gender when capturing gender isn’t allowed,” she said.
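Gray’s proxy-variable warning can be made concrete with a small sketch. The snippet below is purely illustrative (not ANZ’s method or data): it generates synthetic records where heights are drawn from different distributions per gender, then shows that a naive height-only rule recovers gender far better than chance. The distribution parameters and cutoff are assumptions chosen for illustration.

```python
# Illustrative sketch: an "innocent" feature (height) leaking a protected
# attribute (gender). All data here is synthetic; means, spread and the
# cutoff are assumptions for demonstration only.
import random

random.seed(42)

# Synthetic records: women's heights ~ N(162, 7) cm, men's ~ N(176, 7) cm.
records = []
for _ in range(1000):
    gender = random.choice(["F", "M"])
    height = random.gauss(162 if gender == "F" else 176, 7)
    records.append((height, gender))

# A height-only rule: guess "M" above the midpoint of the two means.
cutoff = 169
correct = sum(
    1 for height, gender in records
    if (gender == "M") == (height > cutoff)
)
accuracy = correct / len(records)
print(f"Gender recovered from height alone: {accuracy:.0%}")
```

Run as written, the rule recovers gender from height alone at well above the 50 per cent chance rate, which is exactly why a model trained on height can quietly discriminate by gender even when gender was never captured.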
“We need to hold ourselves accountable for the business outcomes, but there are social and human outcomes and we need to hold ourselves accountable for those too.
“We need to create clear guardrails to keep customers and employees safe. And who writes the guardrails? All of us should be contributing – there are no shortcuts, and if you don’t build privacy into design from the get-go it can mean multimillion-dollar remediation bills.”
Gray also suggested marketers who think they’ve got the skills to deal with the whole data strategy right now are “deluding themselves”. “How we deal with data really challenges the role of marketing and I see so many marketers not equipped to move into this space,” she said.
“If I was a CMO of a company, I’d want to make sure the people around me had significant amounts of training in not just statistics, but in data science, had been to coding camps and gone through their paces on ethical data use.”
In terms of how ANZ is coping with the need for everyone to better understand data use, Gray pointed to the organisation-wide adoption of Agile, aimed at shortening timeframes from back to front and improving cross-functional collaboration. Another big step forward is embracing human-centred design.
Ultimately, data’s best use is going to come down to customer context, Gray concluded. “One of the challenges is understanding the basis of the relationship with customers – usually within a certain context,” she said.
“It’s always going to be about test and learn to see if the customer is giving permission to go further. As a bank, we’re risk averse and afraid to ask customers how much further we can go with them. But based on context, asking if we can take it one step further is a very healthy practice. Of course, if you don’t understand the context in the first place, you get it wrong.”