CMO

Industry weighs up need for regulation as allegations of child porn on YouTube emerge

Latest brand safety issue sees Australian and international brands suspend advertising with Google and reignites debate into the role and responsibility of digital platform providers

Brands have immediately suspended advertising and the industry is calling for fresh reassurances from digital platform providers around brand safety after allegations that YouTube’s recommended algorithm has been used by paedophiles to connect and promote their activities.  

The latest reports allege YouTube has facilitated paedophiles’ ability to connect with each other, trade contact info, link to child porn in comments, and monetise their efforts.

The claims came after blogger, MattsWhatItIs, alleged he can ‘consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than 10 minutes, in sometimes less than five clicks. Additionally, I have video evidence that these videos are being monetised. CP [child porn] is being traded as well as social media and WhatsApp addresses.’

The allegations saw numerous big Australian brands immediately pull advertising from YouTube, including Coles, Commonwealth Bank, Coca-Cola and Optus, as well as many others internationally, such as Nestle, Disney and Epic Games.

Coca-Cola, Coles, and the Commonwealth Bank confirmed to CMO they were not running advertising on the Google subsidiary, with the latter stating it had suspended advertising on YouTube “until this matter is fully investigated and resolved”. Woolworths also confirmed its ads are paused and that it is continuing to monitor the situation closely with agency partners and Google.

An Optus spokesperson said the telco group immediately paused use of YouTube marketing when it learned of the situation last week. "Optus takes the integrity of our brand very seriously,” the spokesperson stated.

“We have since been in discussions with YouTube on their management of this issue, and believe they are taking appropriate measures for an issue of this serious nature. Optus brand safety features on all campaigns are all set at the highest level. We continue to monitor the situation closely.”

A YouTube spokesperson told CMO it has taken immediate action, deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors, a step that goes well beyond normal protection measures.

“Any content - including comments - that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube,” the spokesperson said. “There's more to be done, and we continue to work to improve and catch abuse more quickly.”

In the past week, YouTube said it has also reviewed and removed thousands of inappropriate comments that appeared against videos with young people in them, and terminated over 400 channels for the comments they left on videos, reporting illegal comments to the National Centre for Missing and Exploited Children (NCMEC) so they can work with the proper authorities. It’s also removed dozens of videos which were posted with innocent intentions but clearly put young people at risk, and taken action against autocompletes that could have increased the discoverability of this content and were against policies.

However, as CEO of NewsMediaWorks, Peter Miller, pointed out, people are tired of the lack of responsibility taken by digital platforms before these issues occur.

"The big platforms are very technologically driven, and yet they don’t take responsibility for what they run," he told CMO. “When something dreadful happens they say they dealt with it after it happened. What’s wrong with that is the ‘after’. With all this technology, how hard is it to get ahead of the problem? Because of the scale of their audiences, there is no such thing as a small mistake.

“This is why big advertising companies are rightly hitting the brakes on these platforms. They need more than assurances that it will be dealt with after the fact; we want assurances they are ahead of the curve.”

Miller urged digital platform players to try harder. “No doubt they are penning vigorous protestations that the ACCC doesn’t need to intervene, but I’m bored with these assurances. Advertisers have thus far been modest in their reactions; this is the strongest reaction yet, but we’ll see more of it because there are other places to advertise,” he said.

“Trust is an ingredient in intent, and people are more likely to align with brands based on trust.

“There is a big element of people power in this as well. Consumers are getting more vocal and they are reporting these incidents, because it is users who are impacted by this content.

"The traditional media hold themselves to account, and are held to high standards, and the new competitors are not. We need more than just promises. The next six months will be telling, as the world is watching what the ACCC finds and recommends," he said.

Senior consultant at Fiftyfive5, Estelle Gohil, described the latest YouTube advertising horror as an interesting case study illustrating how, “after a limping start, digital players are being forced to put ethics into practice”.

“Google’s swift response is on point and it’s encouraging to see Australian brands using their clout immediately when it matters," she added.

Fiftyfive5 senior consultant, Hannah Krijnen, told CMO YouTube, like Facebook, Twitter and other online sites that connect people and facilitate conversations, is facing a big shift in terms of the onus of responsibility.

“Where once there was a move to put the onus on the individual user, it would seem, regardless of the legal requirements, advertisers are coming down on the side of these sites needing to take some level of responsibility for the content and conversations that happen,” she commented.

“It’s a great example of advertising dollars potentially having the power to shift corporate behaviour – brand safety is extremely important to advertisers and if channels can’t provide reassurance around this, we can expect to see many more shifts in where and how those advertising dollars are spent.”

Such debate also taps into the role of corporations in cultural conversations, Krijnen continued, and the question of when we are OK with such organisations not taking responsibility in a societal context.

“While there are some cases that are clear cut, such as paedophile networks, there will be many others that may challenge our views of free speech and censorship,” she said.

Meanwhile, GroupM Australia and New Zealand CEO, Mark Lollback, said the media agency takes brand safety incredibly seriously and has strict brand safety guidelines in place across all channels.

“There is always careful assessment of client spend on any social media channel or where there is exposure to user generated content and we will work with clients at an individual level to determine the best course of action in light of this issue,” he explained.  

“We understand there are brands that have been affected in Australia. And while the affected spend is incredibly small, and the impressions incredibly low, because of the highly sensitive subject, even one impression is too many. This is a serious issue and we will continue to monitor the situation and are working incredibly closely with Google on this, locally and globally, over next steps and updates.”

This is just the latest of Google’s woes. The Australian Competition and Consumer Commission (ACCC) handed down its interim digital platforms inquiry report in December 2018, calling for more regulatory authority to investigate, monitor and report on how Facebook and Google are dominating news content and advertising services.

The ACCC noted Google and Facebook have become the dominant gateways between news media businesses and audiences, which has led to a growing inability of media businesses to monetise their content through advertising, decimated journalism locally, and reduced brand value and recognition of media businesses.

To curb their power, the ACCC recommended a regulatory authority be given the task of investigating, monitoring and reporting on how large digital platforms rank and display advertisements and news content.

In response, however, Google has said it does not believe an algorithm regulator would lead to higher quality search results or promote journalism.

In January, France’s data protection authority fined Google a hefty 50 million Euros (AUD$79.4 million) for breaches of the General Data Protection Regulation (GDPR), the first fine to be levied since the act came into effect in May 2018. Google has appealed.
