Not Just Watching the Dog

In Decision 21/2025, the Hellenic Data Protection Authority (HDPA) revisits a recurring misconception: that the General Data Protection Regulation (GDPR) does not apply to private households. The case involved a couple who operated a restaurant and lived on the same premises, which they monitored using a set of security cameras. A neighbouring property owner filed a complaint after discovering that at least one of the cameras recorded not only the couple’s own premises but also his adjoining land and a portion of a public street.

The HDPA reviewed the footage and found that the cameras included a rotating surveillance device with fields of view extending beyond the private domain. Despite the couple’s claim that one camera merely recorded their stable, the evidence suggested otherwise.

The Authority ruled that this type of surveillance falls outside the GDPR’s limited household exemption. Whenever monitoring captures public space or third-party property, it triggers full compliance obligations: a lawful basis under Article 6, transparency under Article 12, data minimisation, and above all, respect for the rights of data subjects under Articles 15 et seq. GDPR.

In the operative part of the Decision, each of the two individuals was fined a total of €3,000, comprising €2,000 for infringing the principles of lawfulness, purpose limitation, and accountability under Article 5 GDPR, and €1,000 for failing to comply with the data subject’s right of access under Article 15 GDPR.

But what about domestic staff? Although the facts of the case centred on a neighbour, the ruling serves as a strong reminder for private individuals who use surveillance tools to monitor babysitters, cleaners, gardeners, or other domestic workers in their household. Even in one’s home, recording another person, particularly in the context of a work relationship, constitutes data processing.

This means that any surveillance carried out within a household must have a clearly documented legal basis, such as freely given consent or a legitimate interest that can be properly justified. The monitoring must be proportionate to its purpose, limited in scope, and objectively necessary. The person being monitored must be informed in a transparent way, and their rights, including access and objection, must be fully respected. Any recordings must be securely stored, with access strictly controlled.

Crucially, when private individuals monitor third parties with whom they are contractually related, they are considered data controllers under Article 4 par. 7 GDPR. Simply being a private household does not exempt one from compliance.

If a nanny can be dismissed for breaching trust, then the same standard should apply to employers who secretly monitor them without a valid legal basis and without informing them, as the law requires.

Unraveling Automated Decision-Making: Schufa’s Impact and Implications

On 7 December 2023, the Court of Justice of the European Union (CJEU) delivered its judgment in the Schufa case, concerning Schufa AG, Germany’s leading credit rating agency, which holds data on nearly 70 million individuals.

Schufa provides credit scores that are relied upon by financial service providers, retailers, telecom companies, and utility firms. In the case at hand, a German resident had their loan application rejected by a bank on the basis of a credit score assigned by Schufa.

The individual contested this decision, seeking information about Schufa’s automated decision-making processes under Article 15(1)(h) GDPR, which grants the right of access to such information.

Schufa argued that it was not responsible for the decision itself, asserting its role was limited to producing an automated score, leaving the actual decision to the third-party bank.

However, the court disagreed with Schufa’s stance. It held that the creation of the credit score itself constitutes a relevant automated decision under Article 22 GDPR, challenging the belief that only the ultimate decision-maker, i.e. the bank, engages in automated decision-making. In reaching this conclusion, the court relied on the score’s “determining role” in the credit decision, adopting a broad interpretation of the term ‘decision.’
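To see why a score can play a “determining role”, consider a minimal, purely hypothetical sketch: the scoring formula, threshold, and field names below are invented, not Schufa’s actual model. When the bank merely applies a fixed cut-off to the agency’s output, the score effectively settles the outcome.

```python
# Hypothetical illustration of a "determining" score: the bank's step
# adds no human judgement, it merely thresholds the agency's output.

def compute_score(income: float, debts: float, missed_payments: int) -> int:
    """Toy scoring model standing in for the agency's proprietary one."""
    score = 600
    score += min(int(income / 1_000), 200)   # reward income, capped
    score -= min(int(debts / 1_000), 150)    # penalise outstanding debt
    score -= 50 * missed_payments            # penalise payment history
    return max(score, 0)

def bank_decides(score: int, cutoff: int = 620) -> str:
    """The nominal 'decision-maker' applies a fixed cut-off: no discretion left."""
    return "approve" if score >= cutoff else "reject"

applicant_score = compute_score(income=32_000, debts=18_000, missed_payments=2)
print(applicant_score, bank_decides(applicant_score))  # -> 514 reject
```

On the Court’s reading, the scoring step itself, not just the final thresholding, is the relevant Article 22 decision point.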

Companies employing algorithms to produce risk scores or similar outputs, such as identity verification and fraud detection, may be concerned about the potential impact of this judgment. Many such businesses assume that their customers bear the regulatory risks associated with decisions based on those outputs. Careful analysis is necessary, however, to distinguish their business models from the one at issue in Schufa.

For example, companies should assess the extent to which customers rely on the provided output when making decisions. If the output is one of many factors considered, and especially if it holds moderate significance, exceptions to Article 22 GDPR (explicit consent or contractual necessity) should be explored.

Companies must further evaluate whether the ultimate decision has a legal or similarly significant effect. Where the decision’s impact is limited, Article 22 GDPR may not be triggered at all.

The Schufa judgment coincides with the conclusion of the trilogue process around the EU AI Act, making it especially relevant for businesses developing AI-enabled solutions in high-risk areas, such as credit decisions. The ruling is poised to influence practices in the evolving landscape of automated decision-making within 2024, as this remains uncharted territory for the national and EU legislators.

 

Unlocking GDPR’s Synergy with AI: Insights from CNIL’s Guidance

The intersection of artificial intelligence (AI) and the General Data Protection Regulation (GDPR) has long been a subject of debate and concern. On one hand, AI presents remarkable advancements and transformative potential in various industries. On the other hand, GDPR places stringent demands on how personal data is collected, processed, and protected.

The question that arose early on is whether AI innovation and GDPR compliance can coexist harmoniously. In response to these complexities, the French data protection authority, CNIL, took a significant step by releasing official guidance addressing the intricate relationship between AI development and GDPR compliance. This guidance responds to concerns raised by AI stakeholders during a call for contributions initiated on 28 July 2023.

CNIL’s primary aim is to reassure the industry by releasing a set of guidelines that emphasize the compatibility of AI system development with privacy considerations. In their own words, “[t]he development of AI systems is compatible with the challenges of privacy protection. Moreover, considering this imperative will lead to the emergence of devices, tools, and applications that are ethical and aligned with European values. It is under these conditions that citizens will place their trust in these technologies”.

The guidance comprises seven “how-to” sheets providing valuable insights into applying core GDPR principles during the development phase of AI systems. Here are some key takeaways:

– Purpose Limitation: AI systems using personal data must be developed and used for specific, legitimate purposes. This means careful consideration of the AI system’s purpose before collecting or using personal data and avoiding overly generic descriptions. In cases where the purpose cannot be precisely determined at the development stage, a clear description of the type of system and its main possible functionalities is required.

– Data Minimization: Only personal data essential for the AI system’s purpose should be collected and used. Avoid unnecessary data collection and implement measures to purge unneeded personal data, even for large databases (a minimal sketch follows this list).

– Data Retention: Extended data retention for training databases is allowed when justified by the legitimate purpose of AI systems. This provides flexibility to data controllers.

– Data Reuse: Reuse of databases, including publicly available data, is permissible for AI training, provided the data was collected lawfully and the purpose of reuse aligns with the initial purpose of data collection.
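By way of illustration of the minimisation sheet, here is a minimal sketch, assuming a tabular training set in pandas; the field names and the hypothetical purpose (an “age-group trend model”) are invented:

```python
# Minimal data-minimisation sketch: keep only the fields the stated
# purpose actually requires and drop everything else before training.
import pandas as pd

raw = pd.DataFrame({
    "user_id":   [1, 2, 3],
    "age":       [34, 51, 27],
    "zip_code":  ["10115", "80331", "50667"],
    "full_name": ["A. Meier", "B. Duval", "C. Papas"],   # not needed
    "email":     ["a@x.eu", "b@y.eu", "c@z.eu"],          # not needed
})

# Fields justified by the (hypothetical) purpose "age-group trend model"
NEEDED_FOR_PURPOSE = ["age", "zip_code"]

training_set = raw[NEEDED_FOR_PURPOSE].copy()  # everything else is purged
# Coarsen quasi-identifiers where full precision is unnecessary
training_set["zip_code"] = training_set["zip_code"].str[:2]

print(training_set)
```

The design point is that the allowlist is derived from the documented purpose, so data not justified there never enters the training set at all.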

Additionally, CNIL’s guidance covers various other topics, including purpose definition, data protection impact assessments (DPIAs), controllership determination, choice of legal basis, and privacy by design.

This guidance serves as a valuable resource for businesses and organizations involved in AI systems, not only in France but also in any jurisdiction under the GDPR. It emphasizes that AI development and privacy can coexist under robust governance and constant oversight.

Given that CNIL has announced two more guidance sets, AI stakeholders should stay vigilant for forthcoming directives to address evolving challenges in the AI landscape, particularly regarding personal data minimization and retention.

Additionally, as businesses navigate the dynamic landscape of AI and GDPR compliance, insights from other national data protection authorities are eagerly awaited. The ongoing dialogue revolves around striking the right equilibrium between innovation and data protection, a balancing act that holds the potential to benefit both progress and individual liberties.

Hellenic Data Protection Authority’s Take on Law 4624/2019

Under the threat of hefty financial sanctions, Greece hastily enacted Law 4624/2019 (“Greek GDPR Law”) last summer, in order to align the domestic data protection framework with the GDPR. The Greek GDPR Law also provided specific rules on certain topics based on the GDPR’s broad opening clauses, which permit EU member states such as Greece to enact national legislation.

Following a period of uncertainty, the Hellenic Data Protection Authority (“HDPA”) published Opinion 1/2020, whereby they reviewed certain key or contested aspects of the Greek GDPR Law and provided much needed clarity on their compatibility with the Regulation.

In fact, reiterating the Commission’s guidance of 24.01.2018 on the direct application of the GDPR, the HDPA stressed that, when adapting their national legislation, Member States must take into account the fact that any national measures which would create an obstacle to the direct applicability of the GDPR, and in this way jeopardise its simultaneous and uniform application throughout the EU, are contrary to Union law.

Repeating the text of regulations in national law, opined the HDPA, is likewise prohibited, unless such repetitions are strictly necessary for the sake of coherence and in order to make national laws comprehensible to those to whom they apply. In fact, reproducing the GDPR text word for word in national specification law should be exceptional and justified, and cannot be used to add conditions or interpretations to the text of the Regulation. This was not the case, however, with the Greek GDPR Law, where several GDPR provisions were repeated verbatim and exceptions were introduced without any particular justification.

More particularly, the HDPA pointed out that the interpretation of the Regulation should be left to the European courts (meaning the national courts and, ultimately, the European Court of Justice) and not to the Member States’ legislators. The national legislator can therefore neither copy the GDPR text where this is not necessary in the light of the criteria provided by the case law, nor interpret it or add conditions to the rules directly applicable under the GDPR, said the Authority. If it did so, commercial entities throughout the Union would again be faced with fragmentation and would not know which rules they have to obey.

In view of the above, the HDPA noted that it shall not apply Greek GDPR Law provisions which: (a) are deemed not in line with the GDPR, and/or (b) are not based on opening clauses, which make it possible for Member States to lay down specific national arrangements.

As regards personal data of employees, in particular, the HDPA clarified that the national legislator is not allowed to introduce new grounds for lawful processing other than those already set out in Art. 6 GDPR. In fact, processing under the GDPR framework can be lawful only on the basis of one of six specified conditions set out in Article 6(1)(a) to (f). Identifying the appropriate legal basis is of essential importance and controllers must take into account the impact on data subjects’ rights when identifying the appropriate lawful basis so as to fully respect the principle of fairness.

In this context, the Authority stressed that Art. 6 par. 1 (b) GDPR, which the Greek legislator has chosen as the main legal ground for processing, may in fact be unfit for the employment environment. Activities such as the processing of biometric data, geolocation, monitoring of electronic media, whistleblowing policies etc. should instead be based on Art. 6 par. 1 (e) GDPR (processing necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller) or Art. 6 par. 1 (f) (processing necessary for the purposes of a legitimate interest). This way, employees are able to challenge separate processing activities and exercise their rights under the GDPR, without the terms of their employment contract being called into question.

The matters addressed in Opinion 1/2020 were not exhaustive, which is why the HDPA explicitly reserved judgment on the compatibility of all other Greek GDPR Law provisions that have not yet come under the spotlight.

In any event, it remains to be seen how the Greek GDPR Law provisions will be interpreted by Greek courts, once challenged by stakeholders, namely all those affected by the new rules (the business community and other organisations processing data, the public sector, and citizens). The dust has not settled yet; the winds of data regulation keep blowing strongly.

Air (Hera orders Aeolus to release the winds) (Aeneid I) by Charles Dupuis (1718)

Real-Time Bidding under the Sword of Damocles

On 20 May 2019, complaints were filed with the competent Data Protection Authorities in Spain, the Netherlands, Belgium, and Luxembourg, in connection with one of the latest digital marketing practices, called Real-Time Bidding (“RTB”). The complainants consider RTB a “vast scale personal data leakage by Google and other major companies” in the behavioural advertising industry.

A typical RTB transaction begins when a user visits a website. This triggers a bid request that includes various pieces of data, such as the user’s demographic information, browsing history, location, and the page being loaded. The request goes from the publisher to an ad exchange, which submits it, along with the accompanying data, to a bid manager. Advertisers automatically submit their bids in real time in order to place their ads, and the advertising space goes to the highest bidder, whose winning ad is displayed on the website. A real-time bidding transaction typically completes within 100 milliseconds of the ad exchange receiving the request, including serving the ad.
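By way of illustration only, here is a compressed sketch of one such round in Python; the field names loosely echo OpenRTB conventions, and the bidders, prices, and URL are invented:

```python
# Simplified real-time-bidding round: a page visit produces a bid
# request, demand-side bidders respond with prices, and the highest
# bid wins the impression. Field names are OpenRTB-inspired, not exact.
import time

def build_bid_request(user: dict, page_url: str) -> dict:
    return {
        "id": "req-001",
        "site": {"page": page_url},
        "user": user,                 # demographics, location, history...
        "tmax": 100,                  # bidders must answer within 100 ms
    }

def run_auction(request: dict, bidders: dict) -> tuple[str, float]:
    start = time.monotonic()
    bids = {name: strategy(request) for name, strategy in bidders.items()}
    winner = max(bids, key=bids.get)  # impression goes to the highest bid
    elapsed_ms = (time.monotonic() - start) * 1000
    assert elapsed_ms < request["tmax"], "auction exceeded time budget"
    return winner, bids[winner]

user = {"yob": 1990, "geo": {"country": "GR"}, "interests": ["travel"]}
request = build_bid_request(user, "https://example-news.eu/article")

bidders = {
    "dsp_alpha": lambda req: 0.40 if "travel" in req["user"]["interests"] else 0.10,
    "dsp_beta":  lambda req: 0.25,
}
print(run_auction(request, bidders))  # -> ('dsp_alpha', 0.4)
```

The point the complainants press is that the `user` object in a real request, unlike this toy version, can carry highly detailed profiles and is broadcast to every participating bidder, winner or not.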

The criteria for bidding on particular types of consumers can be very complex. The complainants, nevertheless, point out that there is no control over what happens to the data afterwards, a situation similar to the Facebook data leakage that enabled Cambridge Analytica to profile people, except far greater in scale.

For example, Google relies on self-regulatory guidelines that depend on the companies receiving its broadcasts to inform it if they are breaking its rules. Google claims that over 2,000 companies are certified in this way. Google’s DoubleClick / Authorized Buyers system, however, sends intimate personal information about virtually every online person to these companies, billions of times a day.

It is worth recalling that, under the applicable GDPR provisions, a company is not permitted to use personal data unless it tightly controls what happens to that data. In fact, Art. 5 (1)(f) GDPR requires that personal data be “processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss”.

Administrators of Facebook Fan Pages, Beware!

Have you set up a fan page on Facebook for your enterprise or to boost your fan base? The European Court of Justice has some news for you.

As you may know, administrators of Facebook fan pages can obtain anonymous statistical data on visitors to their fan pages via a function called “Facebook Insights”, which Facebook makes available to them free of charge under non-negotiable conditions of use. The data is collected by means of cookies, which are active for two years and are stored by Facebook on the hard disk of the computer or on another device of visitors to the fan page. The user code, which can be matched with the connection data of users registered on Facebook, is collected and processed every time the fan pages are opened.
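Mechanically, such a long-lived identifier is simply a cookie with a two-year lifetime. A generic sketch follows; the cookie name and structure are invented and this is not Facebook’s actual implementation:

```python
# Generic sketch of issuing a persistent identifier cookie with a
# two-year lifetime, as described above. The "visitor_id" name and
# value format are invented for illustration.
import uuid

TWO_YEARS_IN_SECONDS = 2 * 365 * 24 * 60 * 60  # 63,072,000

def build_set_cookie_header() -> str:
    user_code = uuid.uuid4().hex  # device-bound identifier
    return (
        f"Set-Cookie: visitor_id={user_code}; "
        f"Max-Age={TWO_YEARS_IN_SECONDS}; Path=/; Secure; HttpOnly"
    )

# Every later page view sends visitor_id back, allowing the server to
# match it against the connection data of logged-in users.
print(build_set_cookie_header())
```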

By decision of 3 November 2011, the Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein, a local German supervisory authority, ordered Wirtschaftsakademie, a fan page administrator, to deactivate its fan page.

According to the supervisory authority, neither Wirtschaftsakademie nor Facebook informed visitors to the fan page that Facebook, by means of cookies, collected personal data concerning them and then processed that data.

Wirtschaftsakademie brought an action against that decision before the German administrative courts, arguing that the processing of personal data by Facebook could not be attributed to it, and that it had not commissioned Facebook to process data that it controlled or was able to influence. Wirtschaftsakademie concluded that the Unabhängiges Landeszentrum should have acted directly against Facebook instead of against it.

It is in that context that the European Court of Justice was asked to interpret Directive 95/46 on data protection. The Court delivered Judgment in Case C-210/16, whereby it observed that an administrator such as Wirtschaftsakademie must be regarded as a controller jointly responsible, within the EU, with Facebook Ireland for the processing of that data.

Such an administrator takes part, by its definition of parameters (depending in particular on its target audience and the objectives of managing or promoting its own activities), in the determination of the purposes and means of processing the personal data of the visitors to its fan page. In particular, the Court noted that the administrator of the fan page can ask for demographic data (in anonymised form) concerning its target audience, and thereby request the processing of that data: trends in terms of age, sex, relationships and occupations; information on the lifestyles and centres of interest of the target audience, including the purchases and online purchasing habits of visitors to its page and the categories of goods or services that appeal the most; and geographical data, telling the fan page administrator where to make special offers and organise events, and more generally enabling it to best target the information it offers.

According to the Court, the fact that an administrator of a fan page uses the platform provided by Facebook in order to benefit from the associated services cannot exempt it from compliance with its obligations concerning the protection of personal data.

The Court further stated that, where the supervisory authority of a Member State (in this case, the German supervisor) intends to exercise the powers of intervention provided for in Directive 95/46 with respect to an entity established in the territory of that Member State (in this case, Wirtschaftsakademie), on the ground of infringements of the rules on the protection of personal data committed by a third party responsible for the processing of that data whose seat is in another Member State (in this case, Facebook Ireland), that supervisory authority is competent to assess, independently of the supervisory authority of the other Member State (Ireland), the lawfulness of such data processing. It may exercise its powers of intervention with respect to the entity established in its territory without first calling on the supervisory authority of the other Member State to intervene.

The above judgment reiterates that there must be no gaps in responsibility under data protection law. This means, specifically, that all administrators of Facebook fan pages have to ensure that both they and Facebook conform to their respective obligations under data protection law.

Such joint responsibility is particularly important with regard to a controller’s information obligations: transparency is required for the processing of data concerning all users, whether or not they are members of Facebook.

 

Bracing up for GDPR

With the new EU General Data Protection Regulation ante portas, companies handling personal customer data throughout the EU are set to face a considerable operational challenge. In fact, the GDPR extends compliance requirements to both data controllers and processors and is oriented towards establishing a modern and uniform data protection framework across the EU, reinforcing individuals’ rights and introducing a number of “data governance” concepts, especially in the area of data security.

Companies eager to move proactively and prepare well ahead of May 2018 need to proceed with a series of steps, such as mapping their data, conducting a due diligence review and implementing a robust response plan in case of data breach. Most importantly, they should redraft their privacy policies and nurture an inclusive personal data corporate culture, since conformity with the GDPR is expected to be an ongoing obligation, demanding constant adjustments and a more hands-on approach.

A controller’s responsibilities summarily comprise:

  • Carrying out data protection impact assessments (DPIAs) when the type of processing is “likely to result in a high risk to the rights and freedoms of natural persons”.
  • Assuring the effective protection of individuals’ rights, such as erasure, complying with reporting and notice requirements, and maintaining records of processing activities.
  • Duties towards the competent regulatory authority, such as consultation prior to processing and data breach notification. To this purpose, the appointment of a Data Protection Officer, who would mediate as a contact point – or a breakwater – between the company and the authority could play a vital role in ensuring maximum compliance.

Finally, a processor of personal data is not exempt from the regulatory ambit of the GDPR and is therefore burdened with a number of responsibilities, including:

  • The pseudonymisation and encryption of the personal data processed on behalf of the controller (a minimal sketch follows this list),
  • the ability to ensure ongoing confidentiality, integrity and resilience of its processing systems and services,
  • the ability to restore access to personal data in a timely manner in the event of a physical or technical incident, such as a malicious attack, and
  • a process for regularly testing, assessing and evaluating the effectiveness of its technical and organizational systems, thus demonstrating that it puts every reasonable effort into safeguarding the security of processing.
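As a purely illustrative sketch of the first item above, assuming the widely used cryptography package and invented record fields (real deployments also need key management, rotation, and documented procedures):

```python
# Minimal sketch of pseudonymisation + encryption of records processed
# on a controller's behalf. Record fields are invented for illustration;
# production systems need proper key management, rotation and audits.
import hashlib
import hmac
from cryptography.fernet import Fernet  # pip install cryptography

PSEUDONYM_KEY = b"keep-this-secret-and-separate"
fernet = Fernet(Fernet.generate_key())  # store the key outside the dataset

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed token: the same input
    always maps to the same token, but the token cannot be reversed
    without the separately held key."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def protect(record: dict) -> dict:
    return {
        "subject": pseudonymise(record["email"]),             # pseudonymised
        "payload": fernet.encrypt(record["notes"].encode()),  # encrypted
    }

stored = protect({"email": "worker@example.eu", "notes": "shift notes"})
print(stored["subject"], fernet.decrypt(stored["payload"]).decode())
```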