Privacy expert weighs in on data consent as Google appeals its historic GDPR fine
- 04 February, 2019 10:03
As Google appeals its €50 million GDPR breach fine, a privacy expert warns Australians dealing with data that a pre-ticked box for general consent will no longer cut it.
Chief privacy officer and lawyer at Trūata, Aoife Sexton, said the fine imposed by the French National Data Protection Commission (CNIL) was important because the actual violation that attracted the fine was not some obscure point of law. Trūata, a Dublin-based company whose founding partners are IBM and MasterCard, was set up last year to provide data anonymisation services for General Data Protection Regulation (GDPR) compliance.
“This decision centres on some of the core principles of the GDPR, which form the foundations of the GDPR, the interpretation and application of which concern all companies who process personal data,” Sexton told CMO.
“We can get insight from this decision on the approach that regulators are going to take on enforcement and also on their strict interpretation of the GDPR core principles. We can also learn where they will set the thresholds for compliance with these core principles.”
Sexton told CMO the French regulator’s decision centres around a violation of the GDPR obligations in a number of areas, including transparency of information, and consent not being validly obtained.
“Information should not be spread across lots of documents and not easily accessible by users. Companies should not cause a user to have to take five or six steps to figure out the full picture about the use of their data,” she explained. “Companies should not describe what they do with users’ data in a generic and vague manner. They also need to be very clear on which legal basis they are opting to rely on for each act of processing.
“Consent was not being validly obtained for ads personalisation, for two reasons: The first reason was a user was not sufficiently informed so that they could give proper consent. The information given to users was spread across several documents. This made it difficult for users to get a clear picture of the extent and scale of the processing and which of the multiple Google services, websites and applications would be in play. What this meant was that users would struggle to understand the sheer amount of data that Google would process and combine."
The second reason was the consent from the user was neither 'unambiguous' nor 'specific'.
"In the case of unambiguous, the GDPR specifies that to satisfy the threshold for “unambiguous” consent, a clear affirmative action is needed from the user, such as ticking a box that is not pre-ticked; no pre-ticked boxes are allowed. In the case of Google, CNIL found that a pre-ticked box was used," Sexton continued.
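Expressed in code, the "unambiguous" and "specific" thresholds Sexton describes amount to a simple check. The sketch below is purely illustrative; the names `ConsentRecord` and `isValidConsent` are hypothetical and not drawn from any real consent-management library.

```typescript
// Illustrative sketch of the GDPR consent thresholds described above.
// All names here (ConsentRecord, isValidConsent) are hypothetical.

interface ConsentRecord {
  purpose: string;            // "specific": one record per processing purpose
  affirmativeAction: boolean; // "unambiguous": the user actively ticked the box
  preTicked: boolean;         // a box checked by default invalidates consent
}

function isValidConsent(record: ConsentRecord): boolean {
  // A clear affirmative act is required, and a pre-ticked box never counts,
  // even if the user left it ticked.
  return record.affirmativeAction && !record.preTicked;
}

// A pre-ticked box the user never touched fails the test:
isValidConsent({ purpose: "ads personalisation", affirmativeAction: false, preTicked: true }); // false
```

The point of the sketch is that validity is decided by how consent was collected, not merely whether a box ended up ticked.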
Sexton added that Google’s fine is the first big fine under the GDPR, and shows regulators are willing to use fines as one of their methods of enforcement.
“This decision actually cuts across all industries and all sectors not just ad tech or marketing. It goes to the heart of some of the core GDPR principles and where the thresholds for compliance have been set," she said.
“It shows regulators are serious about enforcing these thresholds and they are willing to levy significant fines against those who fall short in meeting them. They are also not going to give companies a pass because companies self-declare they are compliant, or say they received some form of user consent.”
The GDPR does present acute challenges for those operating in the global adtech space because the ecosystem is complex and the scale of the data sharing in that ecosystem makes it difficult to clearly explain to users what happens to their data. Similarly, it makes obtaining valid user consent challenging.
For Sexton, the heart of the ruling against Google is how companies treat users, how transparently they explain what they are doing with their users’ data and, where they rely on consent, whether they have in fact obtained valid consent.
"The decision is less about what data companies should or should not use,” she said. “For most companies doing data analytics, the CNIL decision is a wake-up call for them to take stock and meaningfully challenge themselves about how they currently provide information and obtain consent from users, and whether their practices will fall short of the thresholds.
“Such companies should take this opportunity to ask themselves some hard questions, such as ‘Is there a better way for us to perform analytics?’, ‘What if we did not conduct analytics on any personal data, but instead conducted analytics on non-personal data?’ and ‘What if we could avoid sacrificing utility in doing so?’”