New evidence emerged on Monday (January 28) outlining the extent of Google’s collection of ‘intimate’ user data, after a human rights organisation submitted a complaint to data protection authorities in the UK, Ireland, and Poland.
Evidence filed by Panoptykon Foundation, a Warsaw-based digital rights organisation, reveals how ad companies, such as Google, illegally profile internet users’ religious beliefs, ethnicity, disabilities and sexual orientation.
Dr. Johnny Ryan of the web browser Brave, Jim Killock of the Open Rights Group and Michael Veale of University College London filed a complaint in September last year, claiming the use of such data is unlawful and violates the EU’s GDPR, which came into force in May 2018.
Personal data categories
The data, which is obtained from users’ browsing histories, is routinely attached to individual web browsers so that advertisers can better target customers. Once these connections have been made, the data is shared with third-party companies via real-time ad auctions.
For example, one category on the IAB’s listing is “IAB7-28 Incest/Abuse Support”. This could enable ad auction companies to target an internet user as an incest or abuse victim.
The letters IAB refer to the Interactive Advertising Bureau, the body that develops industry standards, conducts research, and provides legal support for the online advertising industry.
“Ad auction systems are obscure by design,” said Katarzyna Szymielewicz, president of Panoptykon Foundation, which filed the complaint in Poland.
“Lack of transparency makes it impossible for users to exercise their rights under GDPR. There is no way to verify, correct or delete marketing categories that have been assigned to us, even though we are talking about our personal data.”
The complainants also published Google’s category list for online ad auctions, which includes “eating disorder”, “left-wing politics” and “Scientology”. They said there were hundreds of sensitive categories across the IAB’s and Google’s lists.
The complainants said ad auction companies broadcast intimate profiles of UK internet users around 164 times per day, according to an estimate by The New Economics Foundation.
“These profiles are received by thousands of companies, and there is no control over what happens to them,” they said.
Dr. Johnny Ryan, chief policy & industry relations officer of Brave, said: “Ad auction companies can fix this by simply excluding personal data, including their tracking IDs, from bid requests.
“If the industry makes some minor changes then ad auctions can safely operate outside the scope of the GDPR. This would protect privacy, but would also protect marketers and publishers from very significant risk.”
The complainants say tracking IDs and other personally specific information “are not actually necessary for ad targeting, but allow you to be re-identified and profiled every day”.
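The fix described above can be illustrated with a short sketch. In OpenRTB-style bid requests, tracking identifiers typically sit in the `user` and `device` objects alongside the contextual fields (such as the page being viewed) that are genuinely needed for ad targeting. The field names and the `strip_personal_data` helper below are illustrative assumptions following that convention, not Brave’s actual proposal or any vendor’s real implementation.

```python
import copy

def strip_personal_data(bid_request: dict) -> dict:
    """Hypothetical sketch: return a copy of an OpenRTB-style bid request
    with tracking IDs and user profile segments removed, leaving only
    contextual fields. Field names are illustrative assumptions."""
    cleaned = copy.deepcopy(bid_request)
    user = cleaned.get("user", {})
    # Remove identifiers and segments that allow a person to be
    # re-identified and profiled across sites.
    for key in ("id", "buyeruid", "data"):
        user.pop(key, None)
    cleaned["user"] = user
    # Device-level advertising IDs can also be used for tracking.
    cleaned.get("device", {}).pop("ifa", None)
    return cleaned

request = {
    "id": "auction-123",
    "site": {"page": "https://example.com/article"},
    "user": {"id": "tracking-id-456", "data": [{"segment": "IAB7-28"}]},
    "device": {"ifa": "ad-id-789", "ua": "Mozilla/5.0"},
}
cleaned = strip_personal_data(request)
# Contextual fields (the page) survive; the tracking ID, the sensitive
# segment and the device advertising ID do not.
```

The point of the sketch is that contextual targeting (the content of the page) can proceed without the identifiers that make re-identification possible.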
The complainants are putting pressure on regulators to take action and require ad auction companies to exclude intimate user data from their systems.
Michael Veale, a technology policy researcher at University College London, said: “Actors in this ecosystem are keen for the public to think they are dealing in anonymous, or at the very least non-sensitive, data, but this simply isn’t the case.”
“Hugely detailed and invasive profiles are routinely and casually built and traded as part of today’s real-time bidding system, and this practice is treated as though it’s a simple fact of life online. It isn’t: and it both needs to and can stop.”
In a statement to WIRED, a spokesperson for Google said: “We have strict policies that prohibit advertisers on our platforms from targeting individuals on the basis of sensitive categories such as race, sexual orientation, health conditions, pregnancy status, etc.
“If we found ads on any of our platforms that were violating our policies and attempting to use sensitive interest categories to target ads to users, we would take immediate action.”
The news comes a week after the French data protection watchdog, CNIL, fined Google €50 million (£44 million) for breaching data protection laws.