European Union Reins in Big Tech

On Tuesday, 5 July 2022, the European Parliament held the final vote on the new Digital Services Act (DSA) and Digital Markets Act (DMA), two bills that aim to address the societal and economic effects of the tech industry by setting clear standards for how digital companies operate and provide services in the EU, in line with the EU’s fundamental rights and values.

What is illegal offline should be illegal online

The Digital Services Act (DSA) sets clear obligations for digital service providers, such as social media or marketplaces, to tackle the spread of illegal content, online disinformation and other societal risks. These requirements are proportionate to platforms’ size and to the risks they pose to society.

The new obligations include:

    • New measures to counter illegal content online and obligations for platforms to react quickly, while respecting fundamental rights, including the freedom of expression and data protection;
    • Strengthened traceability and checks on traders in online marketplaces to ensure products and services are safe, including efforts to perform random checks on whether illegal content resurfaces;
    • Increased transparency and accountability of platforms, for example by providing clear information on content moderation or the use of algorithms for recommending content (so-called recommender systems); users will be able to challenge content moderation decisions;
    • Bans on misleading practices and certain types of targeted advertising, such as those targeting children and ads based on sensitive data. The so-called “dark patterns” and misleading practices aimed at manipulating users’ choices will also be prohibited.

Very large online platforms and search engines (with 45 million or more monthly users), which present the highest risk, will have to comply with stricter obligations, enforced by the Commission. These include preventing systemic risks (such as the dissemination of illegal content, adverse effects on fundamental rights and electoral processes, and harms related to gender-based violence or mental health) and being subject to independent audits. These platforms will also have to provide users with the choice not to receive recommendations based on profiling. They will also have to facilitate access to their data and algorithms for authorities and vetted researchers.

A list of “do’s” and “don’ts” for Gatekeepers

The Digital Markets Act (DMA) sets obligations for large online platforms acting as “gatekeepers” (platforms whose dominant online position makes them hard for consumers to avoid) on the digital market, to ensure a fairer business environment and more services for consumers.

To prevent unfair business practices, those designated as gatekeepers will have to:

    • allow third parties to interoperate with their own services, meaning that smaller platforms will be able to request that dominant messaging platforms enable their users to exchange messages, send voice messages or files across messaging apps. This will give users greater choice and avoid the so-called “lock-in” effect where they are restricted to one app or platform;
    • allow business users to access the data they generate in the gatekeeper’s platform, to promote their own offers and conclude contracts with their customers outside the gatekeeper’s platforms.

Gatekeepers can no longer:

    • Rank their own services or products more favourably (self-preferencing) than those of third parties on their platforms;
    • Prevent users from easily un-installing any pre-loaded software or apps, or using third-party applications and app stores;
    • Process users’ personal data for targeted advertising, unless consent is explicitly granted.

Sanctions

To ensure that the new DMA rules are properly implemented and keep pace with the dynamic digital sector, the Commission can carry out market investigations. If a gatekeeper does not comply with the rules, the Commission can impose fines of up to 10% of its total worldwide turnover in the preceding financial year, or up to 20% in case of repeated non-compliance.

Next Steps

Once formally adopted by the Council in July (DMA) and September (DSA), both acts will be published in the EU Official Journal and enter into force twenty days after publication.

The DSA will be directly applicable across the EU and will apply fifteen months after its entry into force or from 1 January 2024, whichever comes later. As regards the obligations for very large online platforms and very large online search engines, the DSA will apply earlier – four months after they have been designated as such by the Commission.

The DMA will start to apply six months following its entry into force. The gatekeepers will have a maximum of six months after they have been designated to comply with the new obligations.

Source: European Parliament

A New Deal for Consumers

On 8 November 2019, the European Parliament and the Council adopted a directive on the better enforcement and modernisation of EU consumer protection rules. The directive is part of the so-called “New Deal for Consumers” legislative package proposed by the European Commission in April 2018. The directive, which the Member States will have 24 months to transpose into their national legislation, is bound to bring about many significant changes, especially for businesses trading online. The most notable updates are briefly set out below.

Online Marketplaces

In today’s online intermediation services (marketplaces), the identity and contact details of the actual seller are not always clear to the end-consumer. This has been identified as an issue, since consumer protection rules do not apply to C2C (consumer-to-consumer) relationships, and a consumer could unknowingly purchase products from another private individual through a marketplace. The new legislation introduces transparency as regards whom the consumer is entering into an agreement with.

That is, when buying from an online marketplace, consumers will have to be clearly informed whether they are buying goods or services from a trader or from a private person, so they know what protection they will benefit from if something goes wrong. Moreover, when searching online, consumers must be clearly informed whether a search result is paid for by a third-party trader. They will also be informed about the main parameters determining the ranking of search results and whom they can turn to when something goes wrong.

Personalised Pricing

Transparency will be further required with respect to personalised pricing. The new legislation mandates that consumers be clearly informed when the price presented to them is personalised on the basis of automated decision-making. It should be noted here that the GDPR restricts the use of automated decision-making, which may also affect the use of personalised pricing.

Consumer Protection for “Free” Services

There is no denying that data may often replace monetary payment when using online services such as social media, cloud services, and email services. To bolster consumer protection for such “free” services, the directive extends the fourteen (14) day withdrawal right, already applicable to paid digital services, to such “free” services as well.

Clear Information on Price Reductions

In order to address misleading price information, the new directive dictates that any announcement of a price reduction must indicate the prior price applied by the trader. The prior price means the lowest price applied by the trader during a period of time not shorter than 30 days prior to the application of the price reduction.
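As a rough illustration of the 30-day rule above (a sketch with made-up data, not legal advice; the function name and data shape are assumptions, not anything defined in the directive):

```python
from datetime import date, timedelta

def prior_price(price_history: dict, reduction_date: date) -> float:
    """Return the reference 'prior price' a discount must be stated against:
    the lowest price the trader applied in the 30 days before the reduction.

    `price_history` maps each day to the price charged on that day
    (an illustrative data shape, not a prescribed one)."""
    window_start = reduction_date - timedelta(days=30)
    in_window = [price for day, price in price_history.items()
                 if window_start <= day < reduction_date]
    if not in_window:
        raise ValueError("no prices recorded in the 30-day window")
    return min(in_window)
```

So if an item usually sold at €100 but briefly dipped to €80 two weeks before a sale, an advertised reduction would have to be measured against €80, not €100.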

New penalties for Violations

Aiming to reinforce consumer protection, the new directive grants national legislators the right to impose fines of up to 4% of the trader’s turnover for violations that are widespread and affect consumers in several Member States. This follows the same pattern as personal data protection, where the GDPR introduced similar fines for violations. That pattern has proved successful, as many enterprises have made substantial investments to enhance data protection. It is therefore expected that businesses will now need to turn their attention to further enhancing their compliance with consumer protection legislation.

The directive is only one of the two directives making up the New Deal for Consumers legislative package. The second directive on representative actions for the protection of the collective interests of consumers would empower certain qualified entities, such as consumer organisations, to launch representative actions seeking injunctions and collective redress (e.g. compensation, replacement, or repair) on behalf of a group of consumers. This directive is still making its way through the legislative process.

Cookies should come with a consent

On October 1, 2019, the Court of Justice of the European Union (CJEU) ruled that storing cookies on an Internet user’s computer requires active consent. Consent cannot be implied or assumed, and a pre-ticked checkbox is therefore insufficient.

The CJEU ruling stems from a 2013 case, in which the German Federation of Consumer Organizations (GFCO) took legal action against online lottery company Planet49. Planet49’s website required customers to consent to the storage of cookies in order to participate in a promotional lottery. As part of entering the lottery, participants were presented with two separate checkboxes: the first was an unticked marketing checkbox for users who wished to receive third-party advertising; the second was a pre-ticked box allowing Planet49 to set cookies to track the user’s behaviour online. The GFCO argued that this practice was illegal, since the authorisation to set cookies did not involve explicit consent from the user.

The CJEU agreed with the GFCO, finding that Planet49 is required to obtain active consent from its users, and that such consent cannot take the form of a pre-selected checkbox. This active consent, the Court ruled, is required without any further differentiation, in particular, between strictly necessary cookies, reach-measurement cookies or tracking cookies; the CJEU thus adopted the view that the cookie consent requirement applies regardless of whether the information accessed through the cookie is personal data within the meaning of the GDPR.

Furthermore, according to the CJEU it would “appear impossible” to objectively ascertain whether a user has provided informed consent by not deselecting a pre-ticked checkbox, as the user may simply not have noticed the checkbox, or may not have read its accompanying information, before continuing with his or her activity on the website. Further to that, the CJEU held that active consent is expressly set out in the GDPR, whose recital 32 expressly precludes “silence, pre-ticked boxes or inactivity” from constituting consent.

In view of the above reasoning, it seems that consent obtained for placing cookies by means of pre-ticked boxes, or through inaction or action without intent to give consent, even prior to the GDPR entering into force, has been unlawfully obtained. It now remains to be seen whether supervisory authorities will take action to tackle data collection practices that rely on unlawfully obtained consent.

In any case, following years of disparate approaches by national transposition laws and supervisory authorities, the ruling in Planet49 has introduced much-needed clarity on how the “cookie banner” and “cookie consent” provisions in the ePrivacy Directive should be applied.

In this regard, the Planet49 case is likely to have an impact on the ongoing negotiations on the ePrivacy Regulation, which is set to regulate cookie usage in the not-so-distant future. Until then, website owners wishing to avoid any “kitchen accidents” would be well advised to request cookie consent for all cookies other than those technically required for the proper operation of their website. That is, marketing, tracking, and analytics cookies may only be used with explicit, clear, informed and prior consent, provided by means of a consent management tool.
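The consent rule the ruling settles on can be sketched as a simple gate (a minimal illustration under stated assumptions; the category names and function are hypothetical, not any real consent-management API):

```python
# Cookie categories are illustrative. Only the "necessary" bucket may be
# set without consent under the ruling described above.
NECESSARY = {"necessary"}
OPT_IN_ONLY = {"analytics", "marketing", "tracking"}

def may_set_cookie(category: str, opted_in: set) -> bool:
    """Return True if a cookie of this category may be set for this user.

    `opted_in` contains only categories the user *actively* selected:
    a pre-ticked box or mere inactivity never populates this set."""
    if category in NECESSARY:
        return True                 # technically required: always allowed
    return category in opted_in     # everything else: explicit opt-in only
```

The key design point, per the ruling, is that the default is always “off”: a user who does nothing ends up with an empty `opted_in` set, so only strictly necessary cookies are placed.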

Real-Time Bidding under the Sword of Damocles

On 20 May 2019, complaints were filed with the competent Data Protection Authorities in Spain, the Netherlands, Belgium, and Luxembourg, in connection with one of the latest digital marketing practices, called Real-Time Bidding (“RTB”). The complainants consider RTB a “vast scale personal data leakage by Google and other major companies” in the behavioural advertising industry.

A typical RTB transaction begins when a user visits a website. This triggers a bid request that includes various pieces of data, such as the user’s demographic information, browsing history, location, and the page being loaded. The request goes from the publisher to an ad exchange, which submits it, along with the accompanying data, to a bid manager. Advertisers automatically submit their bids in real time in order to place their ads, and the advertising space goes to the highest bidder, who displays the winning ad on the website. Real-time bidding transactions typically happen within 100 milliseconds (including receiving the bid request and serving the ad) from the moment the ad exchange receives the request.
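The transaction described above can be sketched roughly as follows (a deliberately simplified, single-process illustration; real exchanges run this concurrently over protocols such as OpenRTB, and all names and request fields here are assumptions):

```python
import time

def run_auction(bid_request, bidders, deadline_ms=100):
    """Offer the bid request to each bidder and return the highest
    (price, ad) bid received before the deadline, or None if none arrive."""
    start = time.monotonic()
    best = None
    for bidder in bidders:
        if (time.monotonic() - start) * 1000 > deadline_ms:
            break                      # past ~100 ms: late bids are dropped
        bid = bidder(bid_request)      # a bidder returns (price, ad) or None
        if bid is not None and (best is None or bid[0] > best[0]):
            best = bid
    return best

# The bid request carries the kind of data described above.
request = {"page": "news.example/article", "geo": "ES", "interests": ["sports"]}
winner = run_auction(request, [lambda r: (0.40, "ad-A"), lambda r: (0.55, "ad-B")])
# The 0.55 bid wins, so "ad-B" is served.
```

The privacy concern raised by the complainants is visible even in this toy version: every bidder receives the full `bid_request`, whether or not it wins the auction.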

The criteria for bidding on particular types of consumers can be very complex. The complainants nevertheless point out that there is no control over what happens to the data, a situation similar to the Facebook data leak that enabled Cambridge Analytica to profile people, except that it is far greater in scale.

For example, Google relies on self-regulatory guidelines under which the companies that receive its broadcasts are expected to inform it if they are breaking its rules. Google claims that over 2,000 companies are certified in this way. Google DoubleClick / Authorized Buyers nevertheless sends intimate personal information about virtually every single online person to these companies, billions of times a day.

It is worth recalling that, in accordance with the applicable GDPR provisions, a company is not permitted to use personal data unless it tightly controls what happens to that data. Indeed, Art. 5(1)(f) GDPR requires that personal data be “processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss”.