Automated decision-making and profiling

Here are some key highlights/takeaways from the Article 29 Working Party (now the European Data Protection Board (EDPB)) guidance on automated decision-making and profiling:

  1. Profiling and automated decision-making (whether or not this includes profiling) must not be used in a way that has “an unjustified impact on individuals’ rights”, and certain protections have been built into the GDPR “for example (i) specific transparency and fairness requirements (ii) greater accountability obligations (iii) specified legal bases for the processing (iv) rights for individuals to oppose profiling and specifically profiling for marketing and (v) if certain conditions are met, a need to carry out a data protection impact assessment”.
  2. Profiling and automated decision-making (whether or not this includes profiling) can (i) be opaque; (ii) perpetuate existing stereotypes and social segregation; (iii) undermine a person’s freedom by locking them into a specific category and restricting their preferences; and (iv) in some cases lead to inaccurate predictions, denial of services/goods and unjustified discrimination. The EU GDPR aims to address these risks, but at the same time permits domestic laws to be introduced to restrict individuals’ rights and controllers’ obligations regarding profiling and automated decision-making (provided such restrictions respect the essence of an individual’s fundamental rights and freedoms, and are a necessary and proportionate measure in a democratic society).
  3. What is profiling? Profiling is defined in the EU GDPR as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.
  4. Profiling is about gathering information about one or more individuals, analysing their characteristics or behaviour patterns, placing them into certain categories or groups and/or making predictions or assessments about e.g. their ability to perform a task, interests or likely behaviour. It is composed of three elements (i) it must be an automated form of processing; (ii) it must be carried out on personal data; and (iii) its objective must be to evaluate personal aspects about an individual.
  5. What is automated decision-making? Under the EU GDPR, an individual has the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her (Article 22(1)). There are exceptions to this right, and measures must be employed to safeguard an individual’s rights and freedoms and legitimate interests. For example, safeguards include the requirement to inform individuals, in particular by providing meaningful information about the logic involved and the significance and envisaged consequences for the individual (Articles 13 and 14), as well as the right to obtain human intervention and the right to challenge the decision (Article 22(3)).
  6. Article 22 operates as a right not to be subject to a decision based solely on automated processing unless an exception applies, as opposed to a right for the individual concerned to object.
  7. You cannot avoid Article 22 by “fabricating human involvement” and “to qualify as human intervention, the controller must ensure that any oversight of the decision is meaningful, rather than just a token gesture. It should be carried out by someone who has the authority and competence to change the decision”.
  8. Automated decision-making has a different scope and may partially overlap with profiling. Solely automated decision-making is the ability to make decisions by technological means without human involvement. Automated decisions can be made with or without profiling; profiling can take place without making automated decisions. However, profiling and automated decision-making are not necessarily separate activities. Something that starts off as a simple automated decision-making process could become one based on profiling, depending upon how the data is used. Decisions that are not wholly automated might also include profiling (e.g., before granting a mortgage, a bank may consider the credit score of the borrower, with additional meaningful intervention carried out by humans before any decision is applied to an individual).
  9. So, potentially, profiling may be used in the following ways (i) general profiling; (ii) decision-making based on profiling (e.g. a human decides whether to agree the loan based on a profile produced by purely automated means); and (iii) solely automated decision-making, including profiling (Article 22) (e.g. an algorithm decides whether the loan is agreed, and the decision is automatically delivered to the individual, without any meaningful human input).
  10. What does “legal effects” mean? The EU GDPR doesn’t define this term. The EDPB states that a processing activity will have a “legal effect” if it has an impact on “someone’s legal rights, such as the freedom to associate with others, vote in an election, or take legal action. A legal effect may also be something that affects a person’s legal status or their rights under a contract …”. For example, automated decisions that mean someone’s contract is cancelled, or they are refused entry to a country or entitlement to a particular social benefit.
  11. What does “similarly significant effects” mean? The EU GDPR also doesn’t define this term. An automated processing activity that doesn’t have an impact on an individual’s legal rights could still fall within the scope of Article 22 if it produces an effect that is equivalent or similarly significant in its impact. The EDPB highlights that “for data processing to significantly affect someone the effects of the processing must be sufficiently great or important to be worthy of attention. In other words, the decision must have the potential to significantly affect the circumstances, behaviour or choices of the individuals concerned, have a prolonged or permanent impact on the data subject or at its most extreme, lead to the exclusion or discrimination of individuals”. The EDPB notes that it is difficult to be precise about what would be considered sufficiently significant to meet the threshold.
  12. In relation to online advertising and its reliance on automated tools/automated decision-making, the EDPB notes that most typical targeted advertising campaigns don’t have a significant effect on individuals. “However, it is possible that it may do, depending upon the particular characteristics of the case, including the intrusiveness of the profiling process; the expectations and wishes of the individuals concerned; the way the advert is delivered; or using knowledge of the vulnerabilities of the data subjects targeted. Processing that might have little impact on individuals generally may in fact have a significant effect on certain groups of society, such as minority groups or vulnerable adults. For example, someone in financial difficulties who is regularly targeted with adverts for high interest loans may sign up for these offers and potentially incur further debt”. 
  13. Even if the advertising activity is not a significant automated decision, controllers must still comply with the general rules on profiling under EU GDPR (see below).
  14. When can solely automated decision-making be used? It can only be used if one of the following exceptions applies: (i) it is necessary for the performance of, or entering into, a contract; (ii) it is authorised by EU or UK law (to which the controller is subject), provided the law lays down suitable measures to safeguard an individual’s rights and freedoms and legitimate interests; or (iii) the individual has given explicit consent.
  15. Automated decision-making which involves special category data (such as data relating to health or religion) can only be used with explicit consent or where “processing [is] necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and interests of the data subject”.
  16. Performance of a contract. Businesses may use automated decision-making for a number of reasons including because it “potentially allow[s] for greater consistency or fairness in the decision-making process (e.g. by reducing the potential for human error, discrimination and abuse of power); reduce[s] the risk of customers failing to meet payments for goods or services (for example by using credit referencing); or enable[s] them to deliver decisions within a shorter time frame and improves efficiency. Routine human involvement can sometimes be impractical or impossible due to the sheer quantity of data being processed”. The EDPB makes it clear that these reasons alone aren’t sufficient to show that automated decision processing is necessary for entering into, or the performance of, a contract. Necessity must be interpreted narrowly and if there is a less privacy intrusive measure that can be used to achieve the same goal, “then the profiling would not be ‘necessary’”.
  17. Authorised by EU or UK law. This could include using automated decision-making for monitoring and preventing fraud and tax-evasion or to ensure the security and reliability of a service provided by the controller.
  18. All other general principles apply to profiling and automated decision-making including the controller must (i) provide concise, transparent, intelligible and easily accessible information about the processing; (ii) bring details of the right to object to profiling explicitly to the individual’s attention, and present it clearly and separately from other information; (iii) identify its lawful basis for profiling/automated decision-making; (iv) stop profiling if an individual objects to it, even when there is no automated decision-making; (v) not use personal data to then profile an individual unless such profiling is compatible with the purposes for which the personal data was originally collected; (vi) only collect personal data that it actually needs and must not retain it for longer than it actually needs to; and (vii) ensure that the personal data is accurate otherwise any resulting decision will be flawed.
  19. Children. Significant automated decision-making (including profiling) should not concern children but this is only a restriction in the recitals of the EU GDPR, as opposed to the main text. The EDPB doesn’t therefore consider this an absolute prohibition “however, in the light of this recital” it recommends that “as a rule, controllers should not rely upon the exceptions in 22(2) to justify it”.

If you would like any further information or advice on automated decision making and/or profiling, or the UK GDPR generally, please contact us.

Disclaimer: This article is provided for information purposes only and does not constitute legal advice. Professional legal advice should be obtained before taking or refraining from taking any action as a result of the contents of this article.