On October 27, 2020, the UK Information Commissioner’s Office (“ICO”) published a report following its investigation into data protection compliance in the direct marketing data broking sector, alongside its enforcement action against Experian. During the investigation, the ICO conducted audits of the direct marketing data broking businesses of the UK’s three largest credit reference agencies (“CRAs”) – Experian, Equifax and TransUnion – and found “significant data protection failures at each” that were “deeply embedded” within the businesses.
Data broking involves collecting and combining personal data from various sources and selling or providing access to this aggregated data to other organizations. The data may take the form of lists of individuals’ names and contact details, or more detailed profiles about individuals, including information such as their preferences and habits. The ICO’s investigation focused specifically on “offline” marketing services, such as postal, telephone and SMS marketing.
As part of the ICO’s work prior to the implementation of the EU General Data Protection Regulation (“GDPR”), it created a map of the trade of personal data within the UK, identifying several “hubs” through which large volumes of personal data flowed. Three of those hubs were the CRAs audited in this investigation.
The ICO found that the data of almost every adult in the UK was processed for direct marketing services by at least one of the CRAs, including data provided for statutorily required credit referencing purposes. In addition, that data was, at times, used to generate new information about individuals, which the ICO stated can be privacy invasive. This data was used by commercial organizations, political parties and charities, and the relevant data subjects were generally unaware of this processing.
The ICO commented, “The data broking sector provides a valuable service to support organisations across the UK. Products designed for marketing purposes can have a utility beyond merely sending people promotional material, and are sometimes used to help organisations including charities, health bodies and police forces to target resource to a particular area. But the sector does this by processing large amounts of people’s data, often to profile them, and with typically no direct relationship with those people whose information it relies on.”
In particular, the ICO’s report focused on the transparency of the processing, the appropriate lawful basis for the processing, and the use of credit reference data for direct marketing. The ICO’s key findings were that:
- The CRAs’ privacy information did not clearly explain the processing that was taking place, resulting in a lack of transparency. The information was not sufficiently prominent and did not clearly explain how the data was collected, from where it was sourced, how it was processed, or how it was sold.
- With regard to their direct marketing services, the CRAs did not provide the information required under the GDPR’s Article 14 to data subjects, incorrectly relying on the exemption permitted by Article 14 where the individual already has the relevant information or where providing it would involve disproportionate effort. The CRAs relied on the privacy notices of the third parties who supplied personal data to them, which did not clearly draw attention to the processing carried out by the CRAs for direct marketing purposes. This resulted in “invisible processing,” which the ICO considered likely not fair and not within an individual’s reasonable expectations. The absence of the Article 14 information also made it difficult for individuals to exercise their rights under the GDPR. The ICO commented that this lack of awareness also prevented it from relying on its usual indicators of public opinion, such as the number of complaints made. In response to the CRAs’ contention that providing a privacy notice would involve disproportionate effort (given the large volume of personal data held and the costs that would be incurred), the ICO noted that, “very large numbers of individuals cannot be the deciding factor against it being proportional to notify people about the processing,” since this would provide a “perverse incentive” for organizations to collect as much data as possible.
- Personal data collected for credit referencing purposes was being used for limited direct marketing purposes, without informed consent from individuals. The ICO commented that, “the CRAs’ pivotal role in the financial sector puts them in a position of trust and this brings responsibilities,” meaning that the CRAs were held to a high standard of accountability, transparency and fairness. Using personal data that was collected for statutory credit referencing purposes for secondary direct marketing purposes was not considered fair or appropriate.
- The consents relied on by Equifax, generally obtained by third parties on Equifax’s behalf, were not valid under the GDPR. These consents were neither informed nor specific.
- With respect to direct marketing services, the legitimate interest assessments that were carried out had not been correctly weighted. They gave little weight to the fact that large amounts of personal data were processed in highly targeted ways, that individuals were being profiled and that the processing lacked transparency.
- In some instances, personal data obtained on the basis of consent was then processed in reliance on the legitimate interest legal basis. The ICO confirmed that where data is collected or shared for the purposes of direct marketing on the basis of consent, the appropriate lawful basis for subsequent processing for direct marketing purposes will also be consent. It also commented that the degree of control and the nature of the relationship with the individual would be misrepresented and the right to withdraw consent would be undermined where a change in legal basis took place. This, in turn, would inevitably tip the balance of the legitimate interest balancing test against the CRA. The ICO required Experian to delete any data supplied to it on the basis of consent that it subsequently processed on the basis of legitimate interests.
Equifax and TransUnion have since changed their practices at the ICO’s request, including by withdrawing certain products and services from the market, although they did not accept that their practices were in breach of data protection legislation. Experian was judged to have improved its compliance, but the ICO still considered its processing of personal data in the context of marketing services to be non-compliant. The ICO issued an Enforcement Notice as an “effective and proportionate” way to ensure Experian brings its remaining practices into compliance. Experian has stated that it will appeal the Enforcement Notice.
Separately, the ICO is investigating the data processing activities of participants in the online advertising (adtech) industry and continuing its investigations into three other large data brokers. In addition, the ICO is conducting a criminal investigation into the trade of personal data obtained unlawfully from the motor accident repair sector and sold to claims management companies, and is considering potential offences under the Data Protection Act 1998 and the Computer Misuse Act 1990, as well as conspiracy to commit both offences. The ICO is also updating two codes of practice relevant to data broking under the Data Protection Act 2018: the data sharing code and the direct marketing code. Neither code has yet been submitted to the Secretary of State.