EU Policy Update - October 2019

In a nutshell: The European Parliament is still working on its position on the e-Evidence package, while the incoming Commissioner for Justice promises to prioritise EU internal discussions over the Transatlantic deal on the same matter. Europol issued its annual Internet Organised Crime Threat Assessment report. The European Commission found inconsistencies in how Member States identify the “operators of essential services” under the NIS Directive, including diverging methodologies within the TLD sector. The European Parliament adopted a resolution on electoral interference and disinformation. Member States submitted their comments on the challenges of implementing the GDPR. The European Data Protection Board issued guidelines on Article 6 of the GDPR in the context of online services. The European Court of Justice ruled that Google does not have to de-reference search results worldwide to comply with the ‘right to be forgotten’, and that Facebook can be ordered to monitor and remove identical and ‘equivalent’ illegal content worldwide in order to comply with a national court order.

e-Evidence

Commissioner-designate for Justice promises to prioritise e-Evidence in the EU over the Transatlantic deal

On 2 October, the Commissioner-designate for Justice, Didier Reynders (Belgium), was questioned by Members of the European Parliament on his suitability for the post of Commissioner, as well as on the specific competences of the portfolio allocated to him by the President-elect of the European Commission, Ursula von der Leyen. According to the mission letter issued to Reynders by von der Leyen, the Commissioner for Justice will be responsible for, inter alia, "strengthening consumer protection for cross-border and online transactions, facilitating and improving judicial cooperation, as well as implementing new digital technologies for justice systems." This portfolio therefore also covers the ongoing legislative reform on cross-border access to electronic evidence (e-Evidence). In his opening remarks at the hearing, Reynders did not mention e-Evidence. However, in response to a specific question by MEP Marina Kaljurand (S&D) on how to ensure that European data protection standards are not undermined in international negotiations (e.g. the EU-US deal on e-Evidence), Reynders specified that he does not intend to conclude negotiations with the US before the internal approach within the EU is agreed upon. At the same time, the extraterritorial impact of the US CLOUD Act also needs to be taken into consideration.

Cybersecurity

Europol issues its annual Internet Organised Crime Threat Assessment report

According to Europol's annual Internet Organised Crime Threat Assessment (IOCTA) report, ransomware remains the top threat this year. Other priority threats include Business Email Compromise (BEC) scams that trick employees into making corporate transfers to the perpetrators, and the exploitation of online service providers by terrorist groups to distribute their message. The IOCTA 2019 also continues to stress the need for law enforcement (LE) to access electronic data in order to conduct investigations, underlining the "legislative barriers" that, for example, the GDPR and the overturning of the Data Retention Directive in 2014 have created for LE access to WHOIS data. IOCTA also references the warning issued by ICANN in early 2019 about an "ongoing and significant risk to key parts of the Domain Name System (DNS) infrastructure", relating to attacks with the potential to intercept data in transit, redirect traffic or allow attackers to spoof specific websites. IOCTA predicts that, as a result, more attacks on the DNS will either come to light or new incidents will occur. Additionally, IOCTA quotes the results of a survey conducted by ICANN in 2018 that measured the impact of the unavailability of WHOIS data post-GDPR: only 33% of respondents indicated that WHOIS met their investigative needs, compared to 98% pre-GDPR.

European Commission publishes its report on identifying "operators of essential services" across Member States

On 28 October the European Commission published its long-awaited report assessing the consistency of the approaches taken by Member States in identifying operators of essential services (OES) under the NIS Directive. The report concludes that "there are diverging interpretations by Member States as to what constitutes an essential service under the NIS Directive". In addition, "the scope of the Directive risks being fragmented, with some operators being exposed to additional regulation[...]". Some countries have also identified essential services in sectors beyond those covered by the Directive. This, according to the report, highlights that there are other sectors potentially vulnerable to cyber-incidents beyond what is considered by the NIS Directive. The report is based on data submitted by 23 Member States, and the Commission urges the remaining countries to complete the identification process as soon as possible. Member States are also encouraged to engage with each other more actively to eliminate inconsistencies in the thresholds applied to OES. For OES in the digital infrastructure sector, the report gives a breakdown of the thresholds used to identify DNS providers and top-level domain registries as OES. According to the report, some Member States have identified only ccTLDs as OES (EE, FI, HR, IE), while others base identification on the number of domains (AT, DK, PL, SE) or the number of queries per day (UK, MT).

Disinformation

European Parliament adopts a resolution on electoral interference and disinformation

On 10 October the European Parliament adopted a resolution on foreign electoral interference and disinformation in national and European democratic processes, with 469 votes for, 143 against and 47 abstentions. In its resolution, the Parliament acknowledged the "positive impact of the voluntary actions taken by service providers and platforms to counter disinformation" and included a reminder that platforms, the Commission and the Member States have a joint responsibility "when it comes to the fight against disinformation". The resolution also expressed concern about "EU dependence on foreign technologies and hardware" and underlined the need for the EU "to strive to increase its own capabilities". Furthermore, the European Parliament called on the Commission "to classify electoral equipment as critical infrastructure" under the NIS Directive, and called on "the next Vice-President of the Commission[...] to make the fight against disinformation a central foreign policy objective". Lastly, the Parliament called on the Commission "to evaluate possible legislative and non-legislative actions" to instruct social media platforms to systematically label content shared by bots and to close down accounts of "persons engaging in illegal activities aimed at the disruption of democratic processes or at instigating hate speech".

UN Special Rapporteur on the freedom of expression releases a report on governing online hate speech

On 21 October, David Kaye, the UN Special Rapporteur on the promotion and protection of the freedom of opinion and expression, presented his report on governing online hate speech. In the report, Kaye first analyses the balance between the prohibition of the "advocacy of hatred that constitutes incitement that will likely result in discrimination, hostility or violence" (i.e. 'hate speech') and legitimate speech that might offend or disturb, which is protected under freedom of expression. Kaye also reiterates that the prohibition of hate speech does not mean that states are obliged to criminalise such behaviour under international human rights law. While any offensive speech that attacks or uses pejorative or discriminatory language aimed at a person or a group should be combatted by states and companies with, for example, education and condemnation, specific legal restrictions derived from the prohibition of hate speech need to meet the "strict standards of international human rights law". When it comes to online hate speech, Kaye states that "States should not use internet companies as tools to limit expression that they themselves would be precluded from limiting". According to Kaye, "legislative efforts to incentivise the removal of online hate and impose liability on internet companies for a failure to do so must meet the necessity and proportionality standards". In other words, when states outsource decision-making over hateful content to private companies, these decisions cannot be taken on the basis of overly broad and blanket definitions that will eventually result in over-blocking or the take-down of lawful, albeit offensive, content. That does not mean that companies have no human rights responsibilities. According to Kaye, companies should regularly review their policies, be transparent about their implementation, develop a comprehensive understanding of abuse on their platforms through community and expert engagement, and "firmly counter such incitement". Furthermore, if companies are serious about protecting human rights on their platforms, they "must require human evaluation" when relying on AI.

Data protection and privacy

Member States submit their comments on the GDPR implementation

In preparation for the Council of the EU’s position on the application of the GDPR, several Member States have submitted their comments following the call made by the Finnish presidency. The submitted feedback identifies several areas where Member States request more clarity and guidance from the European Commission and the European Data Protection Board (EDPB). Germany, for example, highlighted the need to improve the effectiveness and operability of Article 25 and the principle of 'privacy by design and by default'. According to Germany, this can be achieved by "providing practical guides for interpreting the technical safeguards for data security and for privacy by default". Furthermore, specific design requirements for data processing within AI and on platforms should be considered, according to Germany. France highlighted the fact that several private archives operating in the public interest might fall outside the scope of the exceptions reserved for data processing for archiving purposes (Article 89). The Netherlands and Poland raised questions on possible tensions between the GDPR and new technologies, such as blockchain. Both countries stressed the need to evaluate how the GDPR can support the use and further development of blockchain, while still allowing data subjects to exercise their rights.

European Data Protection Board issues guidelines on Article 6 of the GDPR in the context of online services

The European Data Protection Board (EDPB) published its guidelines on the lawful processing of personal data “for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract" (Article 6(1)(b) of the GDPR) in the context of "online services". The "online services" referred to in the EDPB guidelines are meant to cover "information society services" under EU law. The EDPB clarified that "the lawful basis for processing on the basis of Article 6(1)(b) needs to be considered in the context of the GDPR as a whole", and in particular in light of the principles of purpose limitation and data minimisation, which are especially relevant for online contracts. Where processing is not considered ‘necessary for the performance of a contract’, the EDPB recognised that another lawful basis may be applicable (e.g. freely given consent). Any additional processing beyond what is necessary for the performance of a contract can only take place if it relies on another appropriate legal basis, according to the EDPB. When online service providers collect detailed information on how users engage with their service and try to justify this data collection with a notion of "service improvement", such processing cannot typically be considered lawful under Article 6(1)(b), and providers should seek another legal basis to justify it. Similarly, any monitoring or profiling of users for fraud prevention purposes "is likely to go beyond what is objectively necessary for the performance of a contract with a data subject". However, data processing that "is strictly necessary to prevent fraud may constitute a legitimate interest of the data controller and could thus be considered lawful[...]", according to the EDPB.

ECJ: Google does not need to de-reference search results worldwide to comply with a 'right to be forgotten'

On 24 September the European Court of Justice (ECJ) delivered its judgment in Google v CNIL, finding that there is no obligation under EU law for a search engine operator, following an injunction from a supervisory or judicial authority of a Member State, to carry out a de-referencing on all versions of its search engine in order to comply with a 'right to be forgotten' request (i.e. to apply the 'right to be forgotten' globally). However, the mere fact that EU data protection rules are governed by a Regulation (i.e. the GDPR) suggests that a de-referencing request originating in one Member State should be carried out in respect of all Member States. At the same time, the global application of a de-referencing request is not prohibited under EU law, and the supervisory or judicial authority of a Member State remains competent to weigh up fundamental rights, such as an individual's right to privacy and the freedom of information, according to national standards, and to order the search engine to carry out a global de-referencing in a particular case.

Content moderation

ECJ: Facebook can be ordered to monitor and remove or block access to illegal content worldwide

This case concerns defamatory comments made on Facebook about an Austrian politician. The referring court in Austria asked the ECJ whether a social media platform like Facebook can be required, as part of an injunction in a concrete case, to seek and identify statements with identical wording or equivalent content across its entire platform worldwide. According to the ECJ's judgment, Facebook, as a hosting service provider under the e-Commerce Directive, can be ordered by a national court "to remove information which it stores, the content of which is identical to the content of information which was previously declared to be unlawful" or "to block access to that information, irrespective of who requested the storage of that information", meaning that Facebook can be obliged to monitor its service for identical illegal content with a view to removing it. As regards "equivalent content", provided that the message conveyed remains essentially unchanged and therefore diverges very little from the original illegal content, the ECJ ruled that a hosting service provider can be ordered to remove this type of content too, in order to "bring an end to an illegal act and to prevent it being repeated". Lastly, nothing in EU law precludes a national court from ordering a hosting service provider "to remove information covered by the injunction or to block access to that information worldwide".

Published by Polina Malaja
Polina Malaja is the Policy Director at CENTR, leading its policy work and liaising with governments, institutions and other organisations in the internet ecosystem.