
Article 20 – Measures based on profiling

Commission Proposal


1. Every natural person shall have the right not to be subject to a measure which produces legal effects concerning this natural person or significantly affects this natural person, and which is based solely on automated processing intended to evaluate certain personal aspects relating to this natural person or to analyse or predict in particular the natural person’s performance at work, economic situation, location, health, personal preferences, reliability or behaviour.

2. Subject to the other provisions of this Regulation, a person may be subjected to a measure of the kind referred to in paragraph 1 only if the processing:

(a) is carried out in the course of the entering into, or performance of, a contract, where the request for the entering into or the performance of the contract, lodged by the data subject, has been satisfied or where suitable measures to safeguard the data subject’s legitimate interests have been adduced, such as the right to obtain human intervention; or

(b) is expressly authorized by a Union or Member State law which also lays down suitable measures to safeguard the data subject’s legitimate interests; or

(c) is based on the data subject’s consent, subject to the conditions laid down in Article 7 and to suitable safeguards.

3. Automated processing of personal data intended to evaluate certain personal aspects relating to a natural person shall not be based solely on the special categories of personal data referred to in Article 9.

4. In the cases referred to in paragraph 2, the information to be provided by the controller under Article 14 shall include information as to the existence of processing for a measure of the kind referred to in paragraph 1 and the envisaged effects of such processing on the data subject.

5. The Commission shall be empowered to adopt delegated acts in accordance with Article 86 for the purpose of further specifying the criteria and conditions for suitable measures to safeguard the data subject’s legitimate interests referred to in paragraph 2.

See also: related Recital 21 and Recital 58.

 

EDRi’s proposed amendment:

Article 20 – Measures based on profiling

1. Every natural person shall have the right, both off-line and online, not to be subject to a measure which produces legal effects concerning this natural person or significantly affects this natural person, and which is based solely on automated processing intended to evaluate certain personal aspects relating to this natural person or to analyse or predict in particular the natural person’s performance at work, economic situation, location, health, personal preferences, reliability or behaviour.

2. Subject to the other provisions of this Regulation, including paragraphs (3) and (4), a person may be subjected to a measure of the kind referred to in paragraph 1 only if the processing:

(a) is necessary for the entering into, or performance of, a contract, where the request for the entering into or the performance of the contract, lodged by the data subject, has been satisfied or where suitable measures to safeguard the data subject’s legitimate interests have been adduced, including the right to be provided with meaningful information about the logic used in the profiling, and the right to obtain human intervention, including an explanation of the decision reached after such intervention; or

(b) is expressly authorized by a Union or Member State law which also lays down suitable measures to safeguard the data subject’s legitimate interests, and which protects the data subjects against possible discrimination resulting from measures described in paragraph 1; or

(c) is based on the data subject’s consent, subject to the conditions laid down in Article 7 and to suitable safeguards, including effective protection against possible discrimination resulting from measures described in paragraph 1.

3. Automated processing of personal data intended to evaluate certain personal aspects relating to a natural person shall not include or generate any data that fall under the special categories of personal data referred to in Article 9, except when falling under the exceptions listed in Article 9(2).

3a. Profiling that (whether intentionally or otherwise) has the effect of discriminating against individuals on the basis of race or ethnic origin, political opinions, religion or beliefs, trade union membership, or sexual orientation, or that (whether intentionally or otherwise) results in measures which have such effect, shall be prohibited.

3b. Automated processing of personal data intended to evaluate certain personal aspects relating to a natural person shall not be used to identify or individualize children.

4. In the cases referred to in paragraph 2, the information to be provided by the controller under Articles 14 and 15 shall include information as to the existence of processing for a measure of the kind referred to in paragraph 1 and the envisaged effects of such processing on the data subject, as well as access to the logic underpinning the data undergoing processing.

5. Within six months of the coming into force of this Regulation, the Commission shall be empowered to adopt delegated acts in accordance with Article 86 for the purpose of further specifying the criteria and conditions for suitable measures to safeguard the data subjects’ legitimate interests referred to in paragraph 2. The Commission shall consult representatives of data subjects and the Data Protection Board on its proposals before issuing them.

Justification

It should be specified that the general prohibition applies to all kinds of profiling, both online and offline. The online environment clearly allows for the creation of profiles of data subjects based on their behaviour, through cookies, device fingerprinting or other means of gathering user data.

While profiling is in some circles seen as a panacea for many problems, a significant body of research addresses its limitations. Notably, profiling tends to be useless for very rare characteristics, because the risk of false positives then dominates. Profiles can also be hard or impossible to verify: they are based on complex and dynamic algorithms that evolve constantly and are difficult to explain to data subjects, and these algorithms often qualify as commercial secrets that will not readily be disclosed to data subjects.

However, when natural persons are subject to profiling, they should be entitled to information about the logic used in the measure, as well as an explanation of the final decision if human intervention has been obtained. This reduces opacity, which could otherwise undermine trust in data processing and may lead to a loss of trust in online services in particular.

There is also a serious risk of unreliable and (in effect) discriminatory profiles being widely used in matters of real importance to individuals and groups. This motivates several of the suggested changes in this Article, which aim to improve the protection of data subjects against discrimination. In this connection, the use of sensitive data in generating profiles should also be restricted.
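The base-rate problem behind the "rare characteristics" point above can be illustrated with a short calculation. The figures used here (a trait held by 1 in 1,000 people, a profile that is 99% accurate in both directions) are illustrative assumptions, not values taken from the text:

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """Probability that a person flagged by a profile actually has the trait."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# A hypothetical profile with 99% sensitivity and 99% specificity,
# applied to a trait only 1 in 1,000 people actually have:
ppv = positive_predictive_value(prevalence=0.001, sensitivity=0.99, specificity=0.99)
print(f"Share of flagged people who actually have the trait: {ppv:.1%}")
```

Even with seemingly high accuracy, roughly nine out of ten people flagged by such a profile would be false positives, which is the statistical reason profiling performs so poorly for very rare characteristics.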

The launch and upkeep (until December 31, 2013) of this website received financial support from the EU's Fundamental Rights and Citizenship Programme.