Key GDPR Guidance on Behavioral Advertising, Profiling and Automated Decision-Making

Spoiler Alert: Behavioral advertising companies will find some bad news in the guidance.

The Article 29 Working Party (WP29) advisory group, which will soon become the more transparently-named (and very powerful) European Data Protection Board, is busy drafting and issuing guidance documents to help organizations understand how European data protection authorities will interpret various requirements of the General Data Protection Regulation (GDPR). WP29 recently issued draft guidance on automated decision-making and profiling that will be critical for all organizations that conduct those activities. The draft guidance is open for comments until Nov. 28, 2017. This post recaps some of the particularly interesting aspects of the draft guidance, which can be found in full here (scroll down to the items just above the “Adopted Guidelines” section).

But first, what counts as automated decision-making under the GDPR?  And what is “profiling”?

Automated decision-making is not expressly defined in the GDPR, but as the WP29 guidance confirms, it’s pretty much what you would expect: a decision is made about an individual using technological means without substantive input from a human decision-maker.  For example, a credit card company might run credit card applications through a software application that applies an algorithm and generates a yes or no decision.  Automated decision-making could be based on a profile of the person, but it doesn’t have to be.

The GDPR defines profiling as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements” (GDPR Art. 4(4)). Profiling is typically associated with behavioral advertising, which is a very common form of advertising used on websites and in apps (it’s what’s going on when you have been searching online for, say, back pain remedies and suddenly ads for ergonomic desk chairs pop up on virtually every website you visit, as well as in your favorite social media news feed and your weather and traffic apps). Profiling necessarily involves some automated decision-making.

It's important to appreciate that the GDPR does not impose a blanket ban on automated decision-making or profiling. Instead, the GDPR gives people “the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her,” subject to a few exceptions (GDPR Art. 22(1)). So only automated decision-making that affects the person’s legal status or rights – or that is “similarly significant” in its impacts – is prohibited. (That said, there’s a separate right to object to profiling for purposes of direct marketing, discussed further below.)

Some companies have interpreted the “legal effects” qualification as granting broad latitude to companies that are part of the profiling/advertising ecosystem that, financially speaking, powers the Internet. However, the WP29 draft guidance throws some cold water on that idea. WP29 first notes that:

For data processing to significantly affect someone the effects of the processing must be more than trivial and must be sufficiently great or important to be worthy of attention. In other words, the decision must have the potential to significantly influence the circumstances, behaviour or choices of the individuals concerned. At its most extreme, the decision may lead to the exclusion or discrimination of individuals. 17/EN WP 251, p. 10, emphasis added.

So the key question is whether the decision-making (including profiling) has the potential to significantly influence the circumstances, behavior or choices of the individuals concerned.  And this is where the draft guidance gets very interesting for the behavioral advertising industry:

In many typical cases targeted advertising does not have a significant effect on individuals, for example an advertisement for a mainstream online fashion outlet based on a simple demographic profile: ‘women in the Brussels region’.

However it is possible that it may do, depending upon the particular characteristics of the case, including:

- the intrusiveness of the profiling process;

- the expectations and wishes of the individuals concerned;

- the way the advert is delivered; or

- the particular vulnerabilities of the data subjects targeted. 17/EN WP 251, p. 11, emphasis added.

In other words, subjective factors can move otherwise unrestricted profiling activity into the restricted category. WP29 seems to be implying that overly close profiling – that sense of being “stalked” by advertisements on the web that most of us have experienced – would trigger the full restrictions on profiling and automated decision-making. If that is in fact WP29’s intention, it would be better to state the case more clearly in the final version of the guidance.

Furthermore, to elaborate on the last bullet point, identifiable characteristics of the person who is being targeted could bring the profiling into the restricted category:

Processing that might have little impact on individuals generally may in fact have a significant effect on certain groups of society, such as minority groups or vulnerable adults. For example, someone in financial difficulties who is regularly shown adverts for on-line gambling may sign up for these offers and potentially incur further debt.  17/EN WP 251, p. 11, emphasis added.

So what does all of this mean?  If behavioral advertising triggers the “similarly significantly affects” prong of Art. 22(1), then the advertising (and related profiling) can only take place if the profiling and automated decision-making is:

  • necessary for entering into, or performance of, a contract between the data subject and a data controller;
  • authorized by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or
  • based on the data subject's explicit consent. GDPR Art. 22(2).

None of these conditions will be easy to satisfy in the context of behavioral advertising. Such advertising typically is used to sell (not to perform a contract), is unlikely to gain special authorization under national laws, and fundamentally relies on being a “behind the scenes” process – which makes obtaining explicit consent a tough sell for the companies responsible for creating profiles and deploying ads.

It will be interesting to see whether the online advertising industry provides comments to WP29, and whether the guidance changes as a result.  In the meantime, the WP29 draft guidance is a must-read for any company that uses profiling or other forms of automated decision-making.

Susan L. Foster, PhD is a commercial attorney based in the UK with extensive experience advising clients on EU privacy regulations and transactions in life sciences and technology. Sue is qualified as a solicitor in England & Wales and is a member of the California bar. She is also a Certified Information Privacy Professional-Europe (CIPP-E).