On February 6, 2018, the Article 29 Working Party (WP29) adopted the Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (wp251rev.01).
Advances in the capabilities of big data analytics, together with the widespread availability of personal data on the internet and from Internet of Things (IoT) devices, allow aspects of an individual's interests to be analyzed and predicted. Regulation 2016/679, the so-called General Data Protection Regulation (GDPR), addresses the risks of profiling and automated individual decision-making in order to protect individuals' rights and freedoms.
The document covers:
- definitions of profiling and automated decision-making and the GDPR approach to these in general – Chapter II;
- general provisions on profiling and automated decision-making, including the applicable data protection principles: Article 5(1)(a) – lawfulness, fairness and transparency; Article 5(1)(b) – purpose limitation and further processing; Article 5(1)(c) – data minimization; Article 5(1)(d) – accuracy; Article 5(1)(e) – storage limitation – Chapter III;
- specific provisions on solely automated decision-making defined in Article 22 – Chapter IV;
- children and profiling – Chapter V; and
- data protection impact assessments and data protection officers – Chapter VI.
Companies will have to implement the principles of privacy by design and privacy by default and minimize the use of data.
It is worth noting that, as a general rule, the GDPR prohibits solely automated individual decision-making, including profiling, that has a legal or similarly significant effect, subject to the exceptions set out in Article 22(2).
For example, if someone routinely applies automatically generated profiles to individuals without any meaningful human influence on the result, this would be a decision based solely on automated processing and would therefore fall under the prohibition.
The GDPR does not define ‘legal’ or ‘similarly significant’. However, the Guidelines explain that a decision – based on solely automated processing – that affects someone’s legal rights, such as the freedom to associate with others, vote in an election, or take legal action falls within the category. A legal effect may also be something that affects a person’s legal status or their rights under a contract.
Even if a decision-making process does not affect people's legal rights, it could still similarly significantly affect an individual and fall within the scope of Article 22, GDPR. It is difficult to be precise about what would be considered "sufficiently significant" to meet the standards of Article 22, GDPR. However, "even where there is no change in their legal rights or obligations, the data subject could still be impacted sufficiently to require the protections under this provision." Typical examples of this type of consequence are the "automatic refusal of an online credit application" and "e-recruiting practices without any human intervention" (Recital 71, GDPR).
In the context of automated decision-making, controllers and processors will have to clearly inform users about profiling and give them the right to know what data and which categories of personal data have been used. The processing shall be lawful, fair and transparent (Article 5(1)(a), GDPR).
Providing data subjects with information means informing them about intended or future processing and about how the automated decision-making might affect them. For example, in the context of credit scoring, the data subject should be entitled to know the logic underpinning the processing of her data that results in a yes or no decision, and not simply receive information on the decision itself.
As usual, WP29 provides good practical tips by including in the annexes a list of good practice recommendations to assist data controllers in meeting the requirements on profiling and automated decision-making.
The second Annex to the Guidelines lists all GDPR provisions that reference automated decision-making as defined in Article 22, GDPR.
The Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (wp251rev.01) are available at http://ec.europa.eu
Originally published at Technethics in March 2018.