On May 23, 2024, the U.S. House Committee on Energy and Commerce Subcommittee on Innovation, Data, and Commerce approved a revised draft of the American Privacy Rights Act (“APRA”), which includes significant changes from the initial discussion draft.
On November 25, 2022, Ireland’s Data Protection Commission (“DPC”) released a decision fining Meta Platforms, Inc. (“Meta”) €265 million for a 2019 data leak involving the personal information of approximately 533 million Facebook users worldwide.
On November 21, 2022, Meta Platforms, Inc. (“Meta”) announced updated practices designed to protect the privacy of young people on Facebook and Instagram, including default privacy settings for new accounts, measures to limit unwanted interactions with adult users, and a tool to limit the spread of teens’ intimate images online.
On September 23, 2022, New York State Senator Andrew Gounardes introduced S9563, also known as the “New York Child Data Privacy and Protection Act.” The bill, which resembles the recently passed California Age-Appropriate Design Code Act, bans certain data collection and targeted advertising and requires data controllers to, among other obligations, assess the impact of their products on children.
On October 4, 2022, the White House Office of Science and Technology Policy (“OSTP”) unveiled its Blueprint for an AI Bill of Rights, a non-binding set of guidelines for the design, development, and deployment of artificial intelligence (“AI”) systems.
On February 2, 2022, the Litigation Chamber of the Belgian Data Protection Authority (the “Belgian DPA”) imposed a €250,000 fine against the Interactive Advertising Bureau Europe (“IAB Europe”) for several alleged infringements of the EU General Data Protection Regulation (the “GDPR”), following an investigation into IAB Europe’s Transparency and Consent Framework (“TCF”).
On July 27, 2021, the Spanish Data Protection Authority (the “AEPD”) imposed a €2,520,000 fine on Spanish supermarket chain Mercadona, S.A. for unlawful use of a facial recognition system.
On April 23, 2021, the Centre for Information Policy Leadership (“CIPL”) at Hunton Andrews Kurth submitted its response to the European Data Protection Board (“EDPB”) consultation on draft guidelines on virtual voice assistants (the “Guidelines”). The Guidelines were adopted on March 12, 2021 for public consultation.
On March 25, 2021, the Centre for Information Policy Leadership at Hunton Andrews Kurth organized an expert roundtable on the EU Approach to Regulating AI–How Can Experimentation Help Bridge Innovation and Regulation? (the “Roundtable”). The Roundtable was hosted by Dragoș Tudorache, Member of the European Parliament and Chair of the Artificial Intelligence in the Digital Age (“AIDA”) Committee of the European Parliament. The Roundtable gathered industry representatives and data protection authorities (“DPAs”), as well as Axel Voss, Rapporteur of the AIDA Committee.
The concept of regulatory sandboxes has gained traction in the data protection community. Since the UK Information Commissioner’s Office (the “ICO”) completed its pilot program of regulatory sandboxes in September 2020, two European Data Protection Authorities (“DPAs”) have created their own sandbox initiatives following the ICO’s framework.
We previously posted about the Tapplock, Inc. (“Tapplock”) settlement with the Federal Trade Commission (“FTC”) over allegations that the company violated Section 5 of the FTC Act by falsely claiming that its “smart locks” were secure. Earlier this month, the FTC voted 5-0 to approve the settlement.
On May 18, 2020, the European Data Protection Board (“EDPB”) released its Annual Report (the “Report”) providing details of the EDPB’s work in 2019. This included publication of guidelines, binding decisions and general guidance on the interpretation of EU data protection law.
Elizabeth Denham, the UK Information Commissioner, has released an opinion in response to the joint effort announced by Apple Inc. (“Apple”) and Google LLC (“Google”) to enable the use of Bluetooth technology to help governments and health agencies reduce the spread of COVID-19 by building contact-tracing technology into iOS and Android smartphones. In the opinion, the Information Commissioner concludes that the “Contact Tracing Framework” (“CTF”) being developed supports data protection principles.
A Canadian maker of Internet-connected padlocks, Tapplock, Inc. (“Tapplock”), settled Federal Trade Commission (“FTC”) allegations that the company violated Section 5 of the FTC Act by falsely claiming that its “smart locks” were secure. The FTC alleged that Tapplock “did not take reasonable measures to secure its locks, or take reasonable precautions or follow industry best practices for protecting consumers’ personal information.” The FTC further alleged that Tapplock did not have a security program in place prior to security researchers discovering vulnerabilities in the design and function of the smart locks.
On March 12, 2020, the Centre for Information Policy Leadership (“CIPL”) at Hunton Andrews Kurth LLP submitted formal comments to the Office of the Privacy Commissioner of Canada (“OPC”) in response to its proposals for ensuring appropriate regulation of artificial intelligence (“AI”).
On February 1, 2020, the Italian Data Protection Authority (Garante per la protezione dei dati personali, the “Garante”) announced that it had levied a fine of €27,802,946 on TIM S.p.A. (“TIM”), a telecommunications company, for several unlawful marketing data processing practices. Between 2017 and 2019, the Garante received numerous complaints from individuals (including from individuals who were not existing customers of TIM) claiming that they had received unwanted marketing calls, without having provided their consent or despite having registered on an opt-out list. The Garante indicated that the violations impacted several million individuals.
On November 9, 2018, Serbia’s National Assembly enacted a new data protection law. The Personal Data Protection Law, which becomes effective on August 21, 2019, is modeled after the EU General Data Protection Regulation (“GDPR”).
On October 23, 2018, the 40th International Conference of Data Protection and Privacy Commissioners (the “Conference”) released a Declaration on Ethics and Data Protection in Artificial Intelligence (the “Declaration”). In it, the Conference endorsed several guiding principles as “core values” to protect human rights as the development of artificial intelligence (“AI”) continues apace. Key principles include:
On March 20, 2018, the Centre for Information Policy Leadership (“CIPL”) at Hunton Andrews Kurth LLP issued a factsheet outlining relevant GDPR provisions for negotiations surrounding the proposed ePrivacy Regulation (the “Factsheet”).
On March 26, 2018, the Centre for Information Policy Leadership at Hunton & Williams LLP and AvePoint released its second Global GDPR Readiness Report (the “Report”), detailing the results of a joint global survey launched in July 2017 concerning organizational preparedness for implementing the EU General Data Protection Regulation (“GDPR”). The Report tracks the GDPR implementation efforts of over 235 multinational organizations, and builds on the findings of the first Global GDPR Readiness Report by providing insights on key changes in readiness levels from 2016 to 2017.
Last week, at the 39th International Conference of Data Protection and Privacy Commissioners in Hong Kong, data protection authorities from around the world issued non-binding guidance on the processing of personal data collected by connected cars (the “Guidance”). Noting the ubiquity of connected cars and the rapidity of the industry’s evolution, the officials voiced their collective concern about potential risks to consumers’ data privacy and security. The Guidance identifies as its main concern the lack of available information, user choice, data control and valid consent mechanisms for consumers to control the access to and use of their vehicle and driving-related data. Building on existing international guidelines and resolutions, the Guidance urges the automobile industry to follow privacy by design principles “at every stage of the creation and development of new devices or services.”
As previously published on the Data Privacy Laws blog, Pablo A. Palazzi, partner at Buenos Aires law firm Allende & Brea, provides the following report.
Earlier this month, the Argentine Data Protection Agency (“DPA”) posted the first draft of a new data protection bill (the “Draft Bill”) on its website. Argentina’s current data protection law was enacted in December 2000. Argentina was the first Latin American country to be recognized as an adequate country by the European Union.
On November 9, 2016, the Centre for Information Policy Leadership (“CIPL”) at Hunton & Williams LLP and AvePoint released the results of a joint global survey launched in May 2016 concerning organizational preparedness for implementing the EU General Data Protection Regulation (“GDPR”). The GDPR replaces Directive 95/46/EC and will become applicable in May 2018.
On October 3, 2016, at the Paris Motor Show, the French Data Protection Authority (“CNIL”) reported on the progress of a new compliance pack on connected vehicles. The work was launched on March 23, 2016, and should be finalized in Spring 2017.
On September 27, 2016, the French Data Protection Authority (“CNIL”) announced the adoption of two new decisions, Single Authorizations AU-052 and AU-053, that will now cover all biometric access control systems in the workplace. These two new decisions repeal and replace the previous biometric decisions adopted by the CNIL and lay down the CNIL’s new position on biometric systems used to control access to the premises, software applications and/or devices in the workplace.
On September 16, 2016, the Belgian Data Protection Authority (the “Privacy Commission”) published a 13-step guidance document (in French and Dutch) to help organizations prepare for the EU General Data Protection Regulation (“GDPR”).
The 13 steps recommended by the Privacy Commission are summarized below.
On July 5, 2016, the European Commission announced the launch of a new public-private partnership (the “Partnership”) on cybersecurity, as part of its Digital Single Market and EU Cybersecurity strategies. In this context, the European Commission released several documents, including a Commission Decision establishing a contractual arrangement of the new Partnership for cybersecurity industrial research, and a Staff Working Document on the preparation activities for the Partnership.
On May 30, 2016, the European Data Protection Supervisor (“EDPS”) released its Opinion (the “Opinion”) on the EU-U.S. Privacy Shield (the “Privacy Shield”) draft adequacy decision. The Privacy Shield was created to replace the previous Safe Harbor framework invalidated by the Court of Justice of the European Union (“CJEU”) in the Schrems decision.
On March 16, 2016, the Centre for Information Policy Leadership (“CIPL”) at Hunton & Williams LLP co-hosted a one-day workshop in Amsterdam, Netherlands, together with the Dutch Ministry of Security and Justice, to kick off CIPL’s new long-term project on the implementation of the EU General Data Protection Regulation (“GDPR”).
With the recent adoption of the EU General Data Protection Regulation (“GDPR”) and the significant changes it will require from organizations, AvePoint has joined forces with the Centre for Information Policy Leadership (“CIPL”), a global privacy policy think tank at Hunton & Williams LLP, to launch the first global survey to benchmark organizations’ readiness for the GDPR.
After much debate, the final version of the EU General Data Protection Regulation (“GDPR”) is expected to be adopted by the European Parliament this week and to take effect in mid-2018. The GDPR will significantly change EU data protection law in several areas, affecting all businesses in the energy, financial, health care, real estate, manufacturing, retail, technology and transportation industries, among others. To assist in-house lawyers and privacy professionals with understanding the new GDPR and planning ahead for implementation, Hunton & Williams’ Privacy and Cybersecurity practice lawyers have released The EU General Data Protection Regulation, a Guide for In-House Lawyers covering these strategic areas:
On March 14, 2016, the UK Information Commissioner’s Office (“ICO”) published a guide, Preparing for the General Data Protection Regulation (GDPR) – 12 Steps to Take Now. The guide, which is a high-level checklist with accompanying commentary, sets out a number of points that should inform organizations’ data privacy and governance programs ahead of the anticipated mid-2018 entry into force of the GDPR.
On March 16, 2016, the Centre for Information Policy Leadership (“CIPL”) at Hunton & Williams LLP will co-host a one-day workshop in Amsterdam, Netherlands, together with the Dutch Ministry of Security and Justice, to kick off a new long-term CIPL project on the implementation of the EU General Data Protection Regulation (“GDPR”).
On June 16, 2015, the Article 29 Working Party (the “Working Party”) adopted an Opinion on Privacy and Data Protection Issues relating to the Utilization of Drones (“Opinion”). In the Opinion, the Working Party provides guidance on the application of data protection rules in the context of Remotely Piloted Aircraft Systems, commonly known as “drones.”
On April 10, 2015, the UK Information Commissioner’s Office (“ICO”) published a summary of the feedback received from its July 28, 2014 report on Big Data and Data Protection (the “Report”). The ICO plans to revise its Report in light of the feedback received on three key questions and re-issue the Report in the summer of 2015. Below are key highlights set forth in the summary, entitled Summary of feedback on Big Data and data protection and ICO response (“Summary of Feedback”).
On February 5, 2015, the Article 29 Working Party (the “Working Party”) published a letter that responds to a request of the European Commission to clarify the scope of the definition of health data in connection with lifestyle and wellbeing apps. In the annex to this letter, the Working Party identifies criteria to determine when personal data qualifies as “health data,” a special category of data receiving enhanced protection under the EU Data Protection Directive 95/46/EC (the “Directive”). The Working Party further discusses the current legal regime for the processing of such health data and provides its view on the requirements for further processing of health data for historical, statistical and scientific research under the Directive. The letter also includes the Working Party’s recommendations for the regime that should be provided in the proposed EU General Data Protection Regulation (the “Proposed Regulation”).
On January 12, 2015, the European Union Agency for Network and Information Security (“ENISA”) published a report on Privacy and Data Protection by Design - from policy to engineering (the “Report”). The “privacy by design” principle emphasizes the development of privacy protections at the early stages of the product or service development process, rather than at later stages. Although the principle has found its way into some proposed legislation (e.g., the proposed EU General Data Protection Regulation), its concrete implementation remains presently unclear. Hence, the Report aims to promote a discussion on how the principle can be implemented concretely and effectively with the help of engineering methods.
Join us at the International Association of Privacy Professionals (“IAPP”) Data Protection Congress in Brussels, November 18-20, 2014. Hunton & Williams privacy professionals will be featured speakers in the following sessions:
On October 9, 2014, the 88th Conference of the German Data Protection Commissioners concluded in Hamburg. This biannual conference provides a private forum for all German state data protection authorities (“DPAs”) and the Federal Commissioner for Data Protection and Freedom of Information to share their views on current data protection issues, discuss relevant cases and adopt resolutions aimed at harmonizing how data protection law is applied across Germany. During the conference, several resolutions concerning privacy were adopted.
On July 30, 2014, the European Commission announced two new EU standards to help users of Radio Frequency Identification (“RFID”) smart chips and systems comply with both EU data protection requirements and the European Commission’s 2009 Recommendation on RFID. Among other suggestions, the Recommendation discussed the development of a common European symbol or logo to indicate whether a product uses a smart chip. One of the new standards will provide companies with a framework for the design and display of such a logo. The logo will inform consumers of the presence of RFID chips (for example, when using electronic travel passes or purchasing items with RFID tags). The Commission reiterated that such smart chips should be deactivated by default, immediately and free of charge, at the point of sale.
In response to increasing interest in a “risk-based” approach among privacy experts, including policymakers working on the proposed EU General Data Protection Regulation, the Article 29 Working Party (the “Working Party”) published a statement on the role of a risk-based approach in data protection legal frameworks (the “Statement”).
On April 16, 2014, the Article 29 Working Party (the “Working Party”) sent a letter (the “Letter”) to Lilian Mitrou, Chair of the Working Group on Information Exchange and Data Protection (“DAPIX”) of the Council of the European Union, to support a compromise position on the one-stop-shop mechanism within the proposed EU General Data Protection Regulation (the “Proposed Regulation”).
On February 25, 2014, the UK Information Commissioner’s Office (“ICO”) published an updated code of practice on conducting privacy impact assessments (“PIAs”) (the “Code”). The updated Code takes into account the ICO’s consultation and research project on the conduct of PIAs, and reflects the increased use of PIAs in practice.
In December 2013, the UK Information Commissioner’s Office (“ICO”) issued non-binding guidance aimed at app developers (the “Guidance”). The Guidance applies to all types of mobile devices, including smart TVs and video game consoles.
On December 2, 2013, the Federal Trade Commission announced that it will host a series of seminars to examine the privacy implications of three new areas of technology used to track, market to and analyze consumers: mobile device tracking, predictive scoring and consumer-generated health data. The seminars will address (1) businesses tracking consumers using signals from the consumers’ mobile devices, (2) the use of predictive scoring to determine consumers’ access to products and offers, and (3) consumer-generated information provided to non-HIPAA covered websites and apps. The FTC stated that the intention of the seminars is to bring attention to new trends in big data and their impact on consumer privacy.
On February 12, 2013, the UK Information Commissioner’s Office (“ICO”) published a further analysis of the European Commission’s proposed General Data Protection Regulation (the “Proposed Regulation”). This latest analysis supplements the initial analysis paper on the Proposed Regulation published on February 27, 2012. Although the general views expressed in its initial paper stand, the ICO has now provided greater detail regarding its views of the substantive provisions of the Proposed Regulation.
On March 8, 2013, the Federal Trade Commission issued a staff report entitled Paper, Plastic… or Mobile? An FTC Workshop on Mobile Payments (the “Report”). The Report is based on a workshop held by the FTC in April 2012 and highlights key consumer and privacy issues resulting from the increasingly widespread use of mobile payments.
Although the FTC recognizes the benefits of mobile payments, such as ease and convenience for consumers and potentially lower transaction costs for merchants, the Report notes three areas of concern with the mobile payments system: (1) dispute resolution, (2) data security and (3) privacy.
On December 10, 2012, the Federal Trade Commission issued a new report, Mobile Apps for Kids: Disclosures Still Not Making the Grade, which follows up on the FTC’s February 2012 report, Mobile Apps for Kids: Current Privacy Disclosures are Disappointing. The FTC conducted a follow-up survey regarding pre-download mobile app privacy disclosures, and whether those disclosures accurately describe what occurs during use of the apps.
On October 22, 2012, the Federal Trade Commission released a report entitled “Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies.” The report focuses on privacy concerns associated with facial recognition technology, which is becoming increasingly ubiquitous across a variety of commercial applications ranging from search engines to video games to password authentication.
On September 5, 2012, the Federal Trade Commission issued guidelines for mobile app developers entitled “Marketing Your Mobile App: Get It Right from the Start.” The guidelines are largely a distillation of the FTC’s previously expressed views on a range of topics that have relevance to the mobile app space. They are summarized below:
On May 30, 2012, the Federal Trade Commission hosted a public workshop addressing the need for new guidance on advertising and privacy disclosures online and in mobile environments. During the workshop, the FTC announced that it hopes to release an updated version of its online advertising disclosure guidance this fall that would incorporate input from businesses and consumer advocates. Topics explored at the workshop included:
- best practices for privacy disclosures on mobile platforms and how they can be short, effective and accessible to consumers;
- how to put disclosures in proximity to offers on mobile platforms;
- social media disclosures; and
- the placement of material information on webpages.
On May 3, 2012, Viviane Reding, Justice Commissioner and European Commission Vice-President, delivered a speech during the European data protection authorities’ (“DPAs’”) Spring Conference, which was held in closed sessions in Luxembourg. In her speech, Commissioner Reding discussed how the proposed EU Data Protection Regulation aimed to empower the DPAs and addressed some of the DPAs’ primary concerns with the reform.
On March 23, 2012, the Article 29 Working Party (the “Working Party”) adopted an Opinion on the European Commission’s data protection law reform proposals, including the draft Regulation that is of particular importance for businesses. The Working Party’s Opinion serves as the national data protection authorities’ contribution to the legislative process before the European Parliament and the Council of the European Union.
On March 26, 2012, the Federal Trade Commission issued a new privacy report entitled “Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers.” The report charts a path forward for companies to act in the interest of protecting consumer privacy.
In his introductory remarks, FTC Chairman Jon Leibowitz indicated his support for Do Not Track stating, “Simply put, your computer is your property; no one has the right to put anything on it that you don’t want.” In later comments he predicted that if effective Do Not Track mechanisms are not available by the end of this year, the new Congress likely would introduce a legislative solution.
On January 25, 2012, the UK Information Commissioner’s Office (“ICO”) published an initial statement welcoming the European Commission’s proposed new General Data Protection Regulation (the “Proposed Regulation”), and commended the Commission’s efforts to strengthen the rights of individuals, recognize important privacy concepts such as privacy by design and privacy impact assessments, and include accountability requirements.
On January 25, 2012, the European Commission released a data protection law reform package, including its proposed General Data Protection Regulation (the “Proposed Regulation”). The UK Information Commissioner’s Office (“ICO”) has reacted positively to the Proposed Regulation, in particular commending efforts to strengthen the rights of individuals, the recognition of important privacy concepts such as privacy by design and privacy impact assessments, and new accountability requirements to ensure organizations properly demonstrate and document their data protection safeguards and procedures.
On November 2, 2011, following welcome comments by Federal Institute for Access to Information and Data Protection (“IFAI”) Commissioner Jacqueline Peschard, the 33rd International Conference of Data Protection and Privacy Commissioners opened in Mexico City with an examination of the phenomenon of “Big Data” as a definer of a new economic era. In a wide-ranging presentation, Kenneth Neil Cukier of The Economist drew into clear relief the possibilities and problems associated with combining vast stores of data and powerful analytics. He highlighted the growing ability to correlate seemingly unrelated data sets to predict behavior, reveal trends, enhance product performance and safety and derive meaning. In his remarks, Cukier noted that, in an era of Big Data, much of the decision-making about data collection and use goes beyond traditional notions of privacy, touching on ethics and free will. Noting that the printing press led to the development of free speech laws, he left open the question of how Big Data may change the legal landscape.
On September 19, 2011, Privacy Piracy host Mari Frank interviewed Lisa J. Sotto, partner and head of the Global Privacy and Data Security practice at Hunton & Williams LLP, on KUCI 88.9 FM radio in Irvine, California. In the interview, Ms. Sotto discussed critical current privacy and data security issues, including lessons learned from the recent data breaches, the regulatory framework in the U.S. and EU, and expected legislative changes in the privacy arena globally.
Listen to the Privacy Piracy interview.
On June 28, 2011, the Federal Communications Commission and the Federal Trade Commission convened a public education forum entitled “Helping Consumers Harness the Potential of Location-Based Services.” Representatives of telecommunications carriers, technology companies and consumer advocacy organizations discussed technological developments and how best to realize the benefits of location-based services without compromising privacy.
On June 15, 2011, European Data Protection Supervisor (“EDPS”) Peter Hustinx gave a press conference to present his annual report for 2010. The annual report provides an overview of the EDPS’ main activities in 2010 and sets forth key priorities and challenges for the future.
In his speech, Hustinx focused primarily on the review of the EU data protection framework and the Data Retention Directive. He referenced his recent Opinion in which he concluded that the Data Retention Directive does not meet general EU data protection requirements and that the European Commission should explore the possibility of replacing it with alternative measures such as data preservation through a “quick freeze” procedure. Hustinx also stated his intention to keep a close eye on any developments with respect to RFID technology, cloud computing and online enforcement of intellectual property rights.
On April 4, 2011, the Article 29 Working Party (the “Working Party”) issued an Opinion to clarify the legal framework applicable to smart metering technology in the energy sector (the “Opinion”).
Smart meters are digital meters that record energy consumption and enable two-way remote communication with the wider network for purposes such as monitoring and billing, and to forecast energy demand. Smart meters are intended to allow the industry to better regulate energy supply, and to help individuals reduce consumption. According to the Working Party, however, the analysis and exchange of smart metering information has the potential to be privacy-invasive.
On April 12, 2011, U.S. Senators John Kerry (D-MA) and John McCain (R-AZ) introduced the Commercial Privacy Bill of Rights Act of 2011 (the “Act”) to “establish a regulatory framework for the comprehensive protection of personal data for individuals under the aegis of the Federal Trade Commission.” The bill applies broadly to entities that collect, use, transfer or store the “covered information” of more than 5,000 individuals over a consecutive 12-month period. Certain provisions of the bill would direct the FTC to initiate rulemaking proceedings within specified timeframes, but the bill also imposes requirements directly on covered entities.
On March 30, 2011, the Federal Trade Commission announced that Google agreed to settle charges that it used deceptive tactics and violated its own privacy promises to consumers when it launched its social network, Google Buzz, in 2010. According to the FTC’s complaint (main document, exhibits), Google led Gmail users to believe that they could choose whether or not they wanted to join Google Buzz. The options for declining or leaving Google Buzz, however, were ineffective. For those who joined Google Buzz, the controls for limiting the sharing of their personal information were difficult to locate and confusing. Furthermore, the FTC charged that Google violated its privacy policies by using information provided for Gmail for another purpose – social networking – without obtaining consumers’ permission in advance. Finally, the FTC alleged that Google misrepresented that it was treating personal information from the European Union in accordance with the U.S.-EU Safe Harbor framework because it failed to give consumers notice and choice before using their information for a different purpose from that for which it was collected.
The Council of the European Union (the “Council”) released its conclusions following meetings held on February 24 and 25, 2011, regarding the European Commission’s November 4, 2010 Communication proposing “a comprehensive approach on personal data protection in the European Union” which we reported on last November.
On January 17, 2011, the Centre for Information Policy Leadership at Hunton & Williams LLP (the “Centre”) released a response to the European Commission’s consultation paper, “A comprehensive approach on personal data protection in the European Union.” In its response, prepared by Richard Thomas, former UK Information Commissioner and Global Strategy Advisor of the Centre, the Centre calls for a modernized European framework for data protection that addresses the realities of the digital age.
Earlier this month, the Belgian Privacy Commission (the “Belgian DPA”) published its December 15, 2010 Recommendation on Mobile Mapping (Recommandation d’initiative en matière de Mobile Mapping, or “the Recommendation”). The Recommendation defines Mobile Mapping as “technology by which a vehicle equipped with a camera and/or a scanner can digitally record all data on a specific road, including by taking 360° photos.” The scope of the Recommendation covers not only applications such as Google Street View, but also other types of Mobile Mapping such as mapping by public authorities, mapping for tourism, real estate applications and GPS navigation mapping.
On December 1, 2010, the European Parliament hosted a Privacy Platform on the European Commission’s recent Communication proposing “a comprehensive approach on personal data protection in the European Union,” which is aimed at modernizing the current EU data protection framework.
The panel, hosted by European Parliament Member Sophie in ‘t Veld, included:
- The Head of Cabinet of the European Commission’s Commissioner for Justice, Fundamental Rights and Citizenship, Martin Selmayr (in Commissioner Viviane Reding’s absence);
- The Chairman of the Article 29 Working Party, Jacob Kohnstamm; and
- The European Data Protection Supervisor, Peter Hustinx.
The Platform was very well attended, bringing together a wide range of stakeholders from both the public and private sectors.
On December 1, 2010, the Federal Trade Commission released its long-awaited report on online privacy entitled “Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers.” Observers expected the report to address the concept of privacy by design, the burdens placed on consumers to read and understand privacy notices and make privacy choices, the provision of individual access to personal data and the rights of consumers with respect to Internet tracking. The FTC report introduces a privacy framework to “establish certain common assumptions and bedrock protections on which both consumers and businesses can rely as they engage in commerce.” It includes the following elements:
David Vladeck, Director of the FTC’s Bureau of Consumer Protection, this morning previewed the long-awaited FTC report that sums up months of discussion regarding the future of privacy regulation in the United States and examines the viability of a Do Not Track mechanism. Vladeck indicated at the Consumer Watchdog Policy Conference that the existing privacy framework in the U.S. is not keeping pace with new technologies. In addition, he stated that the pace of industry self-regulation, while constructive, has been too slow. According to Vladeck, the report will address several major themes, including the following:
On November 4, 2010, the European Commission (the “Commission”) released a draft version of its Communication proposing “a comprehensive approach on personal data protection in the European Union” (the “Communication”) with a view to modernizing the EU legal system for the protection of personal data. The Communication is the result of the Commission’s review of the current legal framework (i.e., Directive 95/46/EC), which started with a high-level conference in Brussels in May 2009, followed by a public consultation and additional targeted stakeholder consultations throughout 2010. Although the Commission considers the core principles of the Directive still valid, the Communication also acknowledges that the existing legal framework for data protection in the European Union is no longer able to meet the challenges of rapid technological developments and globalization.
On October 7, 2010, the French Data Protection Authority (the “CNIL”) released its first comprehensive handbook on the security of personal data (the “Guidance”). The Guidance follows the CNIL’s “10 tips for the security of your information system” issued on October 12, 2009, which were based on the CNIL’s July 21, 1981 recommendations regarding security measures applicable to information systems.
The Guidance reiterates that data controllers have an obligation under French law to take “useful precautions,” given the nature of the data and the risks associated with processing the data, to ensure data security and, in particular, to prevent any alteration or damage, or access by non-authorized third parties (Article 34 of the French Data Protection Act). Failure to comply with this requirement is punishable by up to five years’ imprisonment or a fine of €300,000.
David Vladeck, the head of the Bureau of Consumer Protection at the Federal Trade Commission, shared his vision for consumer privacy protection with an audience at the IAPP’s Privacy Academy on September 30, 2010. Mr. Vladeck began by reminding the audience that the FTC is aggressively pursuing enforcement in privacy and data security matters, having brought 29 cases to date. Where possible, the FTC joins forces with other federal regulators, such as the Department of Health and Human Services, to seek broad relief that the FTC could not otherwise obtain on its own. Mr. Vladeck indicated that the FTC also works closely with the states, citing a recent case in which the FTC filed concurrent settlements with 36 state attorneys general. Mr. Vladeck stated that the FTC plans to continue to bring cases to ensure that companies “reasonably” safeguard information.
Mr. Vladeck noted three key areas for future enforcement. The FTC will (1) bring more cases involving “pure” privacy, i.e., cases involving practices that attempt to circumvent consumers’ understanding of a company’s information practices and consumer choices; (2) focus enforcement efforts on new technologies (Mr. Vladeck noted that, to assist staff attorneys in bringing these sorts of cases, the FTC has hired technologists and has also created mobile labs to respond to the proliferation of smartphones and mobile apps); and (3) increase international cooperation on privacy issues (Mr. Vladeck cited the FTC’s recently announced participation in the Global Privacy Enforcement Network).
On April 19, 2010, the Privacy Commissioner of Canada, Jennifer Stoddart, and the heads of nine other international data protection authorities took part in an unprecedented collaboration by issuing a strongly worded letter of reproach to Google’s Chief Executive Officer, Eric Schmidt. The joint letter, which was also signed by data protection officials from France, Germany, Ireland, Israel, Italy, the Netherlands, New Zealand, Spain and the United Kingdom, highlighted growing international concern that “the privacy rights of the world’s citizens are being forgotten as Google rolls out new technological applications.”
In 1980, the Organization for Economic Cooperation and Development (“OECD”) first published privacy guidelines that included an accountability principle. Since that time, little work has been done to define accountability or to describe what it means for organizations to be accountable for the responsible use and protection of data. In an effort to fill that gap, The Centre for Information Policy Leadership has authored “Data Protection Accountability: The Essential Elements” which articulates the conditions organizations would have to meet to be accountable.