Data Advocacy for European Union

EU data protection legislation has faced huge change. Data protection laws are built on fundamental rights enshrined in the Charter of Fundamental Rights of the European Union which are the core building blocks of the EU’s legal regime. Privacy issues arising from an exponential growth in consumer and mobile technologies, an increasingly connected planet and mass cross border data flows have pushed the EU to entirely rethink its data protection legislation to ensure that these fundamental rights are fully protected in today’s digital economy.

GDPR applies to processing of personal data “in the context of the activities of an establishment” (Article 3(1)) of any organization within the EU. For these purposes “establishment” implies the “effective and real exercise of activity through stable arrangements” (Recital 22) and “the legal form of such arrangements…is not the determining factor” (Recital 22), so there is a wide spectrum of what might be caught: from fully functioning subsidiary undertakings on the one hand to, depending on the circumstances, potentially a single individual sales representative on the other.

Europe’s highest court, the Court of Justice of the European Union (the CJEU), has been developing jurisprudence on this concept, finding in Google Spain SL, Google Inc. v AEPD, Mario Costeja González (C-131/12) that Google Inc., with EU-based sales and advertising operations (in that particular case, a Spanish subsidiary), was established within the EU. More recently, the same court concluded in Weltimmo v NAIH (C-230/14) that a property website company registered in Slovakia was also established in Hungary and therefore subject to Hungarian data protection laws.

Data protection laws in many cases substantively very different among Member States

European data protection laws used to be in many cases substantively very different among Member States. This was partly due to the ambiguities in the former Directive being interpreted and implemented differently, and partly due to the former Directive permitting Member States to implement different or additional rules in some areas. As GDPR became law without the need for any secondary implementing laws, there is a greater degree of harmonization relative to the previous regime. However, GDPR preserves the right for Member States to introduce different laws in many important areas and as a result we continue to see a patchwork of different data protection laws among Member States, for certain types of processing.

Each Member State is permitted to restrict the rights of individuals and transparency obligations (Article 23) by legislation when the restriction “respects the essence of fundamental rights and freedoms and is a necessary and proportionate measure in a democratic society” to safeguard one of the following:

(a) national security
(b) defense
(c) public security
(d) the prevention, investigation, detection or prosecution of criminal offences or of breaches of ethics for regulated professions, or the execution of criminal penalties
(e) other important objectives of general public interest of the EU or a Member State, in particular economic or financial interests
(f) the protection of judicial independence and judicial proceedings
(g) a monitoring, inspection or regulatory function connected with national security, defense, public security, crime prevention, other public interest or breach of ethics
(h) the protection of the data subject or the rights and freedoms of others
(i) the enforcement of civil law claims

Under GDPR, processors are also directly required to comply with a number of specific obligations, including to maintain adequate documentation (Article 30), implement appropriate security standards (Article 32), assist controllers with data protection impact assessments (Article 35), appoint a data protection officer where required (Article 37), comply with rules on international data transfers (Chapter V) and cooperate with national supervisory authorities (Article 31). These are in addition to the requirement for controllers to ensure that, when appointing a processor, a written data processing agreement is put in place meeting the requirements of GDPR (Article 28). Again, these requirements have been enhanced and gold-plated compared to the previously applicable requirements in the Directive.

Processors are directly liable to sanctions (Article 83) if they fail to meet these criteria and may also face private claims by individuals for compensation (Article 82).

  1. GDPR completely changes the risk profile for suppliers processing personal data on behalf of their customers. Suppliers face the threat of revenue based fines and private claims by individuals for failing to comply with GDPR. Telling an investigating supervisory authority that you are just a processor won’t work; they can fine you too. Suppliers need to take responsibility for compliance and assess their own compliance with GDPR. In many cases this requires the review and overhaul of current contracting arrangements to ensure better compliance. The increased compliance burden and risk will require a careful review of business cases.
  2. Suppliers will need to decide for each type of processing undertaken whether they are acting solely as a processor or if their processing crosses the line and renders them a data controller or joint controller, attracting the full burden of GDPR.
  3. Customers (as controllers) face similar challenges. Supply chains need to be reviewed and assessed to determine current compliance with GDPR. Privacy impact assessments will need to be carried out. Supervisory authorities may need to be consulted. In many cases contracts are likely to need to be overhauled to meet the requirements of GDPR. These negotiations will not be straightforward given the increased risk and compliance burden for suppliers.
  4. There are opportunities for suppliers to offer GDPR “compliance as a service” solutions, such as secure cloud solutions, though customers need to review these carefully to ensure they dovetail to their own compliance strategy.

Processing shall be lawful only if and to the extent that at least one of the following applies (Article 6(1)):

    1. the data subject has given consent to the processing of his or her personal data for one or more specific purposes;
    2. processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
    3. processing is necessary for compliance with a legal obligation to which the controller is subject;
    4. processing is necessary in order to protect the vital interests of the data subject or of another natural person;
    5. processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
    6. processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.
Article 7 sets out further conditions for consent:

  1. Where processing is based on consent, the controller shall be able to demonstrate that the data subject has consented to processing of his or her personal data.
  2. If the data subject’s consent is given in the context of a written declaration which also concerns other matters, the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. Any part of such a declaration which constitutes an infringement of this Regulation shall not be binding.
  3. The data subject shall have the right to withdraw his or her consent at any time. The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal. Prior to giving consent, the data subject shall be informed thereof. It shall be as easy to withdraw as to give consent.
  4. When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.
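As an illustration only, the Article 6 lawful bases and the Article 7 consent conditions above can be modelled in a short Python sketch. All class and function names here are invented for this example; they are not part of any GDPR tooling, and real compliance assessments involve far more nuance than a boolean check:

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum, auto
from typing import Optional


class LawfulBasis(Enum):
    """The six lawful bases of Article 6(1) GDPR."""
    CONSENT = auto()
    CONTRACT = auto()
    LEGAL_OBLIGATION = auto()
    VITAL_INTERESTS = auto()
    PUBLIC_TASK = auto()
    LEGITIMATE_INTERESTS = auto()


@dataclass
class ConsentRecord:
    """Evidence of consent (Article 7(1): the controller must be able
    to *demonstrate* that the data subject consented)."""
    data_subject_id: str
    purpose: str
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def withdraw(self, when: datetime) -> None:
        # Article 7(3): consent may be withdrawn at any time; withdrawal
        # does not affect the lawfulness of processing before it.
        self.withdrawn_at = when

    def valid_at(self, when: datetime) -> bool:
        return self.given_at <= when and (
            self.withdrawn_at is None or when < self.withdrawn_at
        )


def processing_is_lawful(bases: set[LawfulBasis],
                         consent: Optional[ConsentRecord],
                         when: datetime) -> bool:
    """Article 6(1): at least one basis must apply; if consent is the
    basis relied on, that consent must still be valid at the time."""
    if LawfulBasis.CONSENT in bases:
        if consent is not None and consent.valid_at(when):
            return True
        bases = bases - {LawfulBasis.CONSENT}
    return len(bases) > 0
```

Note how the sketch reflects Article 7(3): after `withdraw()` is called, processing relying solely on consent stops being lawful going forward, while earlier processing remains unaffected.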

Personal data is defined as “any information relating to an identified or identifiable natural person” (Article 4). A low bar is set for “identifiable”: if anyone can identify a natural person using “all means reasonably likely to be used” (Recital 26), the information is personal data, so data may be personal data even if the organization holding the data cannot itself immediately identify a natural person. A name is not necessary either; any identifier will do, such as an identification number, location data, an online identifier or other factors which may identify that natural person.

In other words, personal data is any information that is clearly about a particular person. In certain circumstances, this could cover anything from someone’s name to their physical appearance.

In its most basic definition, sensitive data is a specific set of “special categories” that must be treated with extra security. These categories are:

  • Racial or ethnic origin;
  • Political opinions;
  • Religious or philosophical beliefs;
  • Trade union membership;
  • Genetic data;
  • Biometric data (where processed to uniquely identify someone);
  • Health data; and
  • Data concerning a person’s sex life or sexual orientation.

The rules on subject access requests broadly follow the existing regime set out in the Directive, though some additional information must be disclosed and there is no longer a general right for controllers to charge a fee, with some narrow exceptions. Information requested by data subjects must be provided within one month by default, with a limited right for the controller to extend this period by up to a further two months.


Data subjects continue to enjoy a right to require inaccurate or incomplete personal data to be corrected or completed without undue delay.


The forerunner of this right made headlines in 2014 when Europe’s highest court ruled against Google (Judgment of the CJEU in Case C-131/12), in effect requiring Google to remove search results relating to historical proceedings against a Spanish national for an unpaid debt, on the basis that Google, as a data controller of the search results, had no legal basis to process that information.

The right to be forgotten now has its own Article in GDPR. However, the right is not absolute; it only arises in quite a narrow set of circumstances notably where the controller has no legal ground for processing the information. As demonstrated in the Google Spain decision itself, requiring a search engine to remove search results does not mean the underlying content controlled by third party websites will necessarily be removed. In many cases the controllers of those third party websites may have entirely legitimate grounds to continue to process that information, albeit that the information is less likely to be found if links are removed from search engine results.

The practical impact of this decision has been a huge number of requests to search engines for search results to be removed, raising concerns that the right is being used to remove information that it is in the public interest to keep accessible.


Data subjects enjoy a right to restrict processing of their personal data in defined circumstances. These include where the accuracy of the data is contested; where the processing is unlawful; where the data is no longer needed save for legal claims of the data subject; or where the data subject has objected to processing and it has yet to be verified whether the controller’s legitimate grounds override those of the data subject.


The right to data portability is an entirely new right in GDPR and has no equivalent in the previous Directive. Where the processing of personal data is justified on the basis that the data subject has given their consent or that processing is necessary for the performance of a contract, and the processing is carried out by automated means, the data subject has the right to receive, or have transmitted to another controller, all personal data concerning them in a structured, commonly used and machine-readable format.

The right is a good example of the regulatory downsides of relying on consent or the performance of a contract to justify processing: these legal bases carry additional obligations under GDPR relative to other justifications for processing.

Where the right is likely to arise, controllers need to have procedures in place to facilitate the collection and transfer of personal data when requested to do so by data subjects.


The previous Directive’s right to object to the processing of personal data for direct marketing purposes at any time was retained.

In addition, data subjects have the right to object to processing which is legitimized on the grounds either of the legitimate interests of the data controller or where processing is in the public interest. Controllers will then have to suspend processing of the data until such time as they demonstrate “compelling legitimate grounds” for processing which override the rights of the data subject or that the processing is for the establishment, exercise or defense of legal claims.


This right expands the right not to be subject to automated decision-making that already existed under the Directive. GDPR expressly refers to profiling as an example of automated decision-making. Automated decision-making and profiling “which produces legal effects concerning [the data subject] … or similarly significantly affects him or her” are only permitted where:

(a) necessary for entering into or performing a contract
(b) authorized by EU or Member State law, or
(c) the data subject has given their explicit (ie opt-in) consent.

The scope of this right is potentially extremely broad and may throw into question legitimate profiling for example, to detect fraud and cybercrime. It also presents challenges for the online advertising industry and website operators, who will need to revisit consenting mechanics to justify online profiling for behavioral advertising. This is an area where further guidance is needed on how Article 22 will be applied to specific types of profiling.

The following personal data is considered ‘sensitive’ and is subject to specific processing conditions:

  • personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs;
  • trade-union membership;
  • genetic data, biometric data processed solely to identify a human being;
  • health-related data;
  • data concerning a person’s sex life or sexual orientation.

Your company/organisation can only process sensitive data if one of the following conditions is met:

  • the explicit consent of the individual was obtained (a law may rule out this option in certain cases);
  • an EU or national law or a collective agreement requires your company/organisation to process the data to comply with its obligations and rights, and those of the individuals, in the fields of employment, social security and social protection law;
  • the vital interests of the person, or of a person physically or legally incapable of giving consent, are at stake;
  • you are a foundation, association or other not-for-profit body with a political, philosophical, religious or trade union aim, processing data about its members or about people in regular contact with the organisation;
  • the personal data was manifestly made public by the individual;
  • the data is required for the establishment, exercise or defence of legal claims;
  • the data is processed for reasons of substantial public interest on the basis of EU or national law;
  • the data is processed for the purposes of preventive or occupational medicine, assessment of the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment, or the management of health or social care systems and services on the basis of EU or national law, or on the basis of a contract with a health professional;
  • the data is processed for reasons of public interest in the field of public health on the basis of EU or national law;
  • the data is processed for archiving, scientific or historical research purposes or statistical purposes on the basis of EU or national law.

International transfers, particularly those to the US, have regularly made front-page headline news over the years, notably with the successful torpedoing of the EU/US Safe Harbor regime by Europe’s highest court. Organizations will be relieved to hear that for the most part GDPR did not make any material changes to the previous rules for cross-border transfers of personal data, largely reflecting the previous regime under the Directive. That said, in contrast to the previous regime, where sanctions for breaching transfer restrictions were limited, failure to comply with GDPR’s transfer requirements attracts the highest category of fines of up to 20 million Euros or, in the case of undertakings, up to 4% of annual worldwide turnover.

Transfers of personal data to third countries outside the EU are only permitted where the conditions laid down in GDPR are met (Article 44).

Transfers to third countries, territories or specified sectors, or to an international organization, which the Commission has decided ensure an adequate level of protection do not require any specific authorization (Article 45(1)). The adequacy decisions made under the previous Directive remain in force under GDPR until amended or repealed (Article 45(9)); so for the time being transfers to any of the following countries are permitted: Andorra, Argentina, Canada (with some exceptions), Switzerland, the Faroe Islands, Guernsey, Israel, the Isle of Man, Jersey, the Eastern Republic of Uruguay and New Zealand.

One of the most profound changes introduced by GDPR is a European wide requirement to notify data breaches to supervisory authorities and affected individuals.

In the US, data breach notification laws are now in force in 47 States and the hefty penalties for failing to notify have fundamentally changed the way US organizations investigate and respond to data incidents. Not notifying has become a high risk option.

GDPR requires “the controller without undue delay, and where feasible, not later than 72 hours after having become aware of it, [to] notify the … breach to the supervisory authority” (Article 33(1)). When the personal data breach is likely to result in a high risk to the rights and freedoms of individuals the controller is also required to notify the affected individuals “without undue delay” (Article 34). Processors are required to notify the controller without undue delay having become aware of the breach (Article 33(2)).
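To make the 72-hour rule concrete, the deadline arithmetic can be sketched as follows. The function names are illustrative only, not taken from any official tooling, and the sketch deliberately ignores the separate “without undue delay” requirement, which can bite well before the 72 hours expire:

```python
from datetime import datetime, timedelta

# Article 33(1): notify the supervisory authority without undue delay
# and, where feasible, not later than 72 hours after becoming aware.
NOTIFICATION_WINDOW = timedelta(hours=72)


def authority_notification_deadline(became_aware: datetime) -> datetime:
    """Latest time to notify the supervisory authority."""
    return became_aware + NOTIFICATION_WINDOW


def notification_overdue(became_aware: datetime, now: datetime) -> bool:
    """True once the 72-hour window has elapsed; a notification made
    after that point must be accompanied by reasons for the delay
    (Article 33(1))."""
    return now > authority_notification_deadline(became_aware)
```

For example, a controller that becomes aware of a breach on a Friday morning must, where feasible, notify by Monday morning: the clock runs in calendar hours, not business days.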

Individuals should not be subject to a decision that is based solely on automated processing (such as algorithms) and that is legally binding or which significantly affects them.

A decision may be considered as producing legal effects when the individual’s legal rights or legal status are impacted (such as their right to vote, for example). In addition, processing can significantly affect an individual if it influences their personal circumstances, their behaviour or their choices (for example, automated processing may lead to the refusal of an online credit application).

The use of automated processing for decision-making is authorised only in the following cases:

  • the decision based on the algorithm is necessary (i.e. there must be no other way to achieve the same goal) to enter into or to perform a contract with the individual whose data your company/organisation processed via the algorithm (for example, an online loan application);
  • a particular EU or national law allows the use of algorithms and provides for suitable safeguards to protect the individual’s rights, freedoms and legitimate interests (for example, anti-tax-evasion regulations);
  • the individual has explicitly given their consent to a decision based on the algorithm.

However, such decision-making must protect the individual’s rights, freedoms and legitimate interests by implementing suitable safeguards. Except where the decision-making is based on a law, the individual must at least be informed of (i) the logic involved in the decision-making process, (ii) their right to obtain human intervention, (iii) the potential consequences of the processing and (iv) their right to contest the decision. Your company/organisation must therefore make the required procedural arrangements to allow the individual to express their point of view and to contest the decision.

Finally, particular attention should be given if the algorithm uses special categories of personal data: automated decision-making is only allowed in the following circumstances:

  • the individual has given their explicit consent; or
  • the processing is necessary for reasons of substantial public interest under EU or national law.

Furthermore, if the individual is a child, decisions made solely on automated processing that produce legal effects or effects which are of similar significance for the child should be avoided because children represent a more vulnerable group of society.
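The conditions set out above can be summarised in a purely illustrative sketch. All names here are invented, the logic is a simplification of Article 22 rather than legal advice, and it treats the guidance on children as a hard bar for the sake of the example:

```python
from dataclasses import dataclass


@dataclass
class AutomatedDecision:
    """Facts about a solely automated decision (illustrative model)."""
    significant_effect: bool           # legal or similarly significant effect
    necessary_for_contract: bool       # Article 22(2)(a)
    authorised_by_law: bool            # Article 22(2)(b)
    explicit_consent: bool             # Article 22(2)(c)
    uses_special_categories: bool      # e.g. health or biometric data
    substantial_public_interest: bool  # basis under EU or national law
    subject_is_child: bool


def decision_permitted(d: AutomatedDecision) -> bool:
    """Rough reading of the rules above: decisions with legal or
    similarly significant effects need one of three grounds; special
    categories of data narrow those grounds to explicit consent or
    substantial public interest; decisions about children are avoided."""
    if not d.significant_effect:
        return True  # the Article 22 restriction is not engaged
    if d.subject_is_child:
        return False  # children: such decisions should be avoided
    if d.uses_special_categories:
        return d.explicit_consent or d.substantial_public_interest
    return (d.necessary_for_contract or d.authorised_by_law
            or d.explicit_consent)
```

The sketch makes the narrowing visible: a decision that would be fine under the contract ground alone becomes impermissible once special-category data enters the algorithm, unless explicit consent or a substantial-public-interest basis exists.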

Newsletter mailings and e-mail marketing are a fixed part of the online marketing universe. The general principle that processing is prohibited unless authorised also applies to the personal data used to send e-mails. Processing is only allowed under the General Data Protection Regulation (GDPR) if either the data subject has consented or there is another legal basis, such as the legitimate interest of the controller in sending e-mail marketing. Recital 47 GDPR expressly states that the processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest of the controller.

Regardless of whether a company ultimately bases its marketing measures on its legitimate interest or on consent, the controller must respect the data subject’s right to be informed. The content of that information depends on which justification is relied upon. Please be aware that there may be additional national laws (e.g. competition law) which are slightly stricter or which impose additional restrictions.

Fines must be effective, proportionate and dissuasive in each individual case. In deciding whether to impose a penalty, and at what level, the authorities must consider a statutory catalogue of criteria.

Among other things, intentional infringement, a failure to take measures to mitigate the damage which occurred, or lack of collaboration with authorities can increase the penalties. For especially severe violations, listed in Art. 83(5) GDPR, the fine framework can be up to 20 million euros or, in the case of an undertaking, up to 4% of its total global turnover of the preceding fiscal year, whichever is higher. But even the catalogue of less severe violations in Art. 83(4) GDPR sets forth fines of up to 10 million euros or, in the case of an undertaking, up to 2% of its entire global turnover of the preceding fiscal year, whichever is higher.
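The two fine ceilings reduce to simple "whichever is higher" arithmetic. As a sketch (the function name is invented for this example), the upper bound for an undertaking can be computed as:

```python
def max_fine_eur(annual_worldwide_turnover_eur: float,
                 severe: bool) -> float:
    """Upper bound of an Article 83 fine for an undertaking.

    Art. 83(5) (severe violations): up to EUR 20m or 4% of total
    worldwide annual turnover of the preceding fiscal year,
    whichever is higher.
    Art. 83(4) (less severe violations): up to EUR 10m or 2%,
    whichever is higher.
    """
    if severe:
        return max(20_000_000.0, 0.04 * annual_worldwide_turnover_eur)
    return max(10_000_000.0, 0.02 * annual_worldwide_turnover_eur)
```

So for an undertaking with EUR 1bn turnover, the severe-violation ceiling is EUR 40m (4% exceeds the EUR 20m floor), whereas for one with EUR 100m turnover the EUR 20m floor governs.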

Especially important here is that the term “undertaking” is equivalent to that used in Art. 101 and 102 of the Treaty on the Functioning of the European Union (TFEU). According to the case law of the European Court of Justice, “the concept of an undertaking encompasses every entity engaged in an economic activity, regardless of the legal status of the entity or the way in which it is financed”. An undertaking can therefore consist not only of one individual company in the sense of a legal person, but also of several natural persons or corporate entities. Thus, a whole group can be treated as one undertaking and its total worldwide annual turnover can be used to calculate the fine for a GDPR infringement by one of its companies.

In addition, each Member State shall lay down rules on other penalties for infringements of the Regulation which are not already covered by Art. 83. These are most likely criminal penalties for certain violations of the GDPR, or penalties for infringements of national rules adopted under the flexibility clauses of the GDPR. The national penalties must also be effective, proportionate, and act as a deterrent.