Article 29 Working Party Releases Guidelines on Automated Individual Decision-Making and Profiling

On October 17, 2017, the Article 29 Working Party (“Working Party”) issued Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (the “Guidelines”). The Guidelines aim to clarify the EU General Data Protection Regulation’s (“GDPR’s”) provisions that address the risks arising from profiling and automated decision-making.

The Guidelines are divided into five sections, outlined below, followed by best practice recommendations intended to assist controllers in meeting the GDPR’s requirements on profiling and automated decision-making:

  1. Definitions of profiling and automated decision-making, and the GDPR’s approach to these concepts;
  2. Specific provisions on automated decision-making as defined in Article 22 of the GDPR;
  3. General provisions on profiling and automated decision-making;
  4. Children and profiling; and
  5. Data protection impact assessments.

Key takeaways from the Guidelines include:

  • Profiling means gathering information about an individual (or a group of individuals) and analyzing their characteristics or behavior patterns to place them into a certain category or group, and/or to make predictions or assessments (e.g., about their ability to perform a task, interests or likely behavior).
  • There is a prohibition on fully automated individual decision-making, including profiling that has a legal or similarly significant effect, but there are exceptions to the rule. There should be measures in place to safeguard the data subject’s rights, freedoms and legitimate interests.
  • When engaging in automated decision-making under the Article 22(2)(a) exception (necessary for the performance of a contract), necessity should be interpreted narrowly. The controller must be able to show the profiling is necessary, taking into account whether a less privacy-intrusive method could be adopted.
  • The Working Party clarifies that with respect to providing meaningful information about the logic involved in automated decision-making, the controller should find simple ways to tell the data subject about the rationale behind, or the criteria relied on, in reaching the decision without necessarily always attempting a complex explanation of the algorithms used or disclosure of the full algorithm. The information provided should, however, be meaningful to the data subject.
  • Providing data subjects with information about the significance and envisaged consequences of automated decision-making means that information must be provided about intended or future processing, and about how the automated decision-making might affect the data subject. For example, in the context of credit scoring, data subjects are entitled to know the logic underpinning the processing of their data that results in a yes or no decision, not simply information on the decision itself.
  • “Legal effect” means a processing activity that has an impact on someone’s legal rights, or affects a person’s legal status or their rights under a contract.
  • Regarding the meaning of the phrase “similarly significantly affects him or her,” the threshold for significance must be similar to a legal effect, whether or not the decision has a legal effect. The effects of processing must be more than trivial and must be sufficiently great or important to be worthy of attention.
  • To qualify as human intervention, the controller must ensure that any oversight of the decision is meaningful, rather than just a token gesture. It should be carried out by someone who has the authority and competence to change the decision. The reviewer should undertake a thorough assessment of all the relevant data, including any additional information provided by the data subject.
  • The Working Party does not consider Recital 71 to be an absolute prohibition on solely automated decision-making relating to children, but notes that it should only be carried out in certain circumstances (e.g., to protect a child’s welfare).
  • The requirement to carry out a Data Protection Impact Assessment in the case of a systematic and extensive evaluation of personal aspects based on automated processing, including profiling, on which decisions are based that produce legal or similarly significant effects, is not limited to “solely” automated processing/decisions.

The Working Party will accept comments on the Guidelines until November 28, 2017.

