Visegrad Four want to distinguish between ‘illegal’ and ‘harmful’ content in Digital Services Act

2020-11-10

According to the Visegrad Four countries, the Commission’s Digital Services Act is necessary, but Europe must avoid censorship and any other violation of the right to freedom of expression. One of the most pressing topics is the removal of illegal and harmful content from social platforms.

By Lukáš Hendrych (EURACTIV.cz), Lucia Yar (EURACTIV.sk), Patrik Szicherle (Political Capital), Michał Strzałkowski (EURACTIV.pl)

The digital revolution has brought many benefits and useful technologies. At the same time, the modern, connected world brings serious challenges: internet companies, technologies and online platforms have become powerful influencers of whole societies and nations.

That influence can be positive, but also negative. The negative side shows, for example, in the spread of illegal content such as hate speech, child pornography or terrorist content, and of harmful content such as fake news and disinformation. On top of that, internet giants profit from collecting and using their users’ personal data, which gives them a huge impact not only on the global economy but also on the political environment and elections through targeted advertising, non-transparent algorithms and psychological profiling of users.

The main problem, however, is that the current legal framework is not adapted to this reality. The 20-year-old e-Commerce Directive, which is still in force, is outdated and desperately needs reform. The European Commission recognises this need: in December it plans to propose a sweeping and complex piece of legislation called the Digital Services Act (DSA). The package will cover, among other things, content and user moderation, new competition rules for so-called gatekeeper platforms, and the fight against sellers of illegal and counterfeit goods, as well as new rules spanning online safety, liability, market dominance and online advertising.

The ambitions are clearly huge. Even though the DSA is still being drafted, discussions in this pre-legislative phase have been going on for months, and the Visegrad countries are no exception. Their message is clear – regulation is necessary, but Europe has to avoid censorship and any other violation of the right to freedom of expression.

The Czech Republic

In Czechia, politicians, experts and other stakeholders see the need to reform the legal rules for internet companies and online platforms. The most pressing topic seems to be the possible removal of illegal and harmful content from social platforms. At the same time, Czechs are quite careful when it comes to the new legal duties the future DSA may impose, especially as far as possible liability for uploaded content is concerned.

“We need to define what the obligations of digital services are, and also make sure that there is a clear legal framework for how exactly platforms should become aware of illegal content – without removing it by monitoring users’ content. This is the so-called notice and action mechanism,” Czech MEP and Vice-President of the EP Marcel Kolaja (Greens/EFA) told EURACTIV.cz.

According to him, the DSA should ensure the swift removal of illegal content; automated tools, however, have proved to have many unintended side effects.

The Czech government’s stance on the obligation to remove illegal content is quite clear.

“When it comes to liability of platforms, the Czech Republic is of the opinion that the current liability exemption for intermediary service providers should be maintained. However, we do believe that the time has come to review provisions relating to procedural aspects of the removal of illegal content from the internet. From this perspective, platforms should have an obligation to observe a set of rules when tackling hosted illegal content,” stressed Patrik Tovaryš, head of the Information Society Services Unit at the Ministry of Industry and Trade.

Stakeholders agree that it is crucial to distinguish between illegal and harmful content.

“There is an absolute need to distinguish between the notion of illegal content and legal but harmful content. This is imperative as there usually is a distinction between criminal conduct such as sharing terrorist content and sharing disinformation, which many users share in the faith that it is true,” Patrik Tovaryš pointed out.

David Nosák from the Center for Democracy and Technology shares this view, adding that “the e-Commerce Directive prohibits Member States from imposing obligations on digital services to monitor the content they transmit or store, which is an important principle that should be upheld”.

In his opinion, the legal framework could benefit from further clarification on dealing with illegal content, and from legal certainty about moderating lawful content, whether it is disinformation or hateful or harassing speech that does not amount to a violation of law.

“Intermediaries should not face liability for failing to remove illegal content by users unless the notice is supported by a court order or similarly independent adjudication. To allow otherwise would force intermediaries to determine illegality on their own, which they are not well-equipped to do and which will inevitably cause them to over-censor speech in order to avoid liability risk,” he added.
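To make the mechanism concrete, the following is a minimal, purely illustrative sketch of the decision logic Kolaja and Nosák describe – acting on specific notices rather than monitoring everything, and deferring to independent adjudication. All names and rules in it are hypothetical and are not taken from the DSA draft or from any platform’s actual system.

```python
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    REMOVE = "remove"        # take the content down
    KEEP = "keep"            # leave the content up
    ESCALATE = "escalate"    # refer the case to a court or authority

@dataclass
class Notice:
    content_id: str
    reporter: str                    # user or authorized body filing the notice
    alleged_violation: str           # e.g. "terrorist content"
    independently_adjudicated: bool  # backed by a court order or similar ruling

def handle_notice(notice: Notice) -> Action:
    """Notice-and-action, not general monitoring: the platform reacts to
    specific notices instead of scanning all uploads for illegality."""
    if notice.independently_adjudicated:
        # Illegality has already been established by an independent body,
        # so the platform can act without judging legality itself.
        return Action.REMOVE
    # Without independent adjudication, deciding legality in-house risks
    # the over-removal both Kolaja and Nosák warn about; the cautious
    # route is escalation rather than automatic deletion.
    return Action.ESCALATE

# Example: a notice backed by a court order leads to removal.
print(handle_notice(Notice("post-123", "ngo-hotline", "terrorist content", True)))
```

The point of the sketch is the last branch: a platform forced to judge legality on its own has an incentive to remove content whenever it is in doubt.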

Possible censorship – even unintentional – is also inadmissible for Czech MEP and Vice-President of the EP Dita Charanzová (RE). According to her, the danger lies mainly in upload filters.

“This is why mandatory filters are not acceptable. There must be legal wiggle room for platforms to give users the benefit of the doubt as to whether a piece of content, or even a part of a piece of content, is legal or not. A mandatory system would be so powerful that you would get far more unintentional censorship,” she believes.

Jana Břeská from the Association for Internet Progress is even more sceptical when it comes to removing content.

“If the platform had a legal duty to evaluate and delete illegal and harmful content, it might affect freedom of expression. Harmful content can be very subjective and therefore the risk of limiting freedom of speech is higher. There might be some automated measures to prevent the upload of illegal content that is defined by law; however, close cooperation with public authorities is essential,” she said.

On top of that, she also pointed out the dangers of regulating internet advertising, which should be one of the topics covered by the DSA, too.

“The use of personal data – especially for targeted advertising and content – should not be regulated by the DSA; it is already sufficiently covered by the GDPR. Commercial advertising for business purposes and profiling are clearly regulated by the GDPR and will also be covered by the ePrivacy Regulation. Further regulation is not needed and would put at risk the existence of many EU media and content providers whose main financial source is online advertising,” she warned.

Slovakia

In Slovakia, the discussion on the DSA is generally limited to a small group of experts and stakeholders and has not yet opened up into a broader public debate in the media. Although Slovakia expects the DSA to become an essential tool for Member States to enforce a response from platforms, it is more lenient towards these companies.

In general, Slovakia supports the principle that “what is illegal offline should also be illegal online”, while suggesting that the rule should apply not only to illegal content but also to goods and services offered on the internet, which should be subject to the same regulations and standards as goods and services sold in brick-and-mortar shops.

MEP Vladimír Bilčík (EPP), the only Slovak member of the European Parliament’s newly created Committee on Social Networks, thinks that “there should be an effective way to alert platforms and force the rapid deletion of content that is illegal, although in some cases, such content may be in accordance with their current rules”.

According to the Slovak government, the responsibility of online platforms for content, including the question of user uploads, is an important part of the legislation in the making. Slovakia advocates “maintaining the principle of limited liability, as well as the prohibition of the obligation of general monitoring”, says the Ministry of Investment, Regional Development and Informatization. Moreover, the government does not think that platforms should be responsible for illegal and harmful content placed there by users if they are unaware of it.

However, according to Slovakia, the DSA should define more precisely the conditions and entities to which this principle may apply, as well as clearly determine terms such as “illegal” and “harmful” content or “goods and services” sold online. According to the Ministry of Informatization, the DSA could extend the 2016 Code of Conduct on Countering Illegal Hate Speech Online to more categories of illegal content.

On the issue of illegal content, Slovakia expects the new legislation to define and put into practice a “single and effective ‘notice and action’ mechanism”, so that the Union does not have to oblige platforms to monitor their networks in general. According to Deputy Minister Marek Antál, responsible for digital matters at the Ministry, companies “must act swiftly and effectively if they are informed of illegal and harmful content by the competent authorities of the Member States”.

Company representatives agree. According to the Slovak Alliance for Innovation Economy (SAPIE), a non-profit platform with more than 100 members, if platforms become content editors, this could translate into the widespread deletion of posts by algorithms, harming users – not only the general public but also the businesses that rely on the platforms. The Alliance points out that a strictly preventive setting of deletion algorithms is the least costly, yet can create the most damage. Given the cost of services that would have to check content both quickly and at scale, SAPIE warns of the negative impact such a decision would have on small businesses that cannot afford highly sensitive tools.

MEP Bilčík stresses that both the speed of removal and the content moderation services themselves must be unified across the EU Member States.

As far as political advertising is concerned, Bratislava is aware that regulating it is a sensitive area. However, according to the Ministry of Informatization, the DSA could contribute to defining clear rules under which platforms would transparently inform users about their clients and funding, as well as the targeting of political advertising. Slovakia considers the creation of publicly available EU-level databases of political advertising to be one way to ensure such increased transparency.

MEP Bilčík also calls for clarity: “We must have detailed data on political advertising, coming not only from the official accounts of political parties,” he told EURACTIV.sk. According to him, especially in the pre-election period, data on advertising on political topics should also be accessible to the competent authorities from other accounts.

Here, companies differ somewhat from the government: they consider minimal intervention by the DSA to be the right way to go. “There are preconditions for the tendentious application of these principles based on the political preferences of those implementing such regulation or of the platform operators,” explains Michal Kardoš, executive director of SAPIE.

Hungary

Even though the consultation window for the Digital Services Act (DSA) has now closed, the upcoming legislative proposal and its declared goals have not yet become a prevalent part of Hungarian public discourse. Separately from the DSA, however, regulating social media giants has been in the focus of attention since the Hungarian cabinet launched a Digital Freedom Working Group (DFWG) to examine the operation of tech companies from the perspective of fundamental rights and the rule of law.

A few months after the DFWG was launched, the head of the National Data Protection and Information Authority (NAIH), Attila Péterfalvi, said the government should adopt a regulation on the operation of social media sites because Facebook influences countries’ “constitutional rules, court practices, constitutional culture” externally by banning “certain opinions”. Opposition parties harshly criticized his statement, fearing that the government wants to “silence” critics on social media as well. Minister of Justice Judit Varga said the government does not want to introduce censorship on Facebook; rather, it wants to know what legal guarantees Facebook applies when censoring Hungarian citizens.

According to Anna Donáth, an MEP sitting in the Renew Europe group for Hungary’s liberal Momentum party, regulation on these issues should be enacted at the European level. She says the EU must create unified criteria and a joint regulatory framework; one solution could be an independent EU institution overseeing compliance with rules on transparency, content moderation and political advertisements.

Regarding content regulation, Dr. Emese Pásztor, director of the Political Freedoms Project at the Hungarian Civil Liberties Union (TASZ), highlighted to Political Capital that “the freedom of speech online is in itself a value that needs to be protected.” She noted that it is courts that decide what counts as “illegal content”, and that if site operators were responsible for content under civil law, they would become cautious, leading to the removal of any content with even a minimal chance of breaking the law.

Gábor Polyák, head of Mérték Media Monitoring and of the Communications and Media Science Department at the University of Pécs, noted to Political Capital that processes not involving the deletion of content might have adverse effects, as Facebook’s experience shows: flagging content only leads to even more clicks. The Institute of Media Studies (IMS), the research institute of the Media Council of the National Media and Infocommunications Authority (NMHH), gave Political Capital its own opinion based on its previous research and scientific work. Its view differs somewhat from Polyák’s: it stated that the reports of the European Regulators Group for Audiovisual Media Services (ERGA) and preliminary feedback from fact-checking organizations and the platforms on content regulation suggest that “removing the given contents should rather be restricted to the clear-cut cases, the general practice might move towards flagging the questionable contents.”

The IMS noted that under current EU and Hungarian regulations, web hosting and online search service providers are not responsible for the content shared on them as long as they have no information that it might breach the law. If such a service provider gains knowledge of illegal activity on its platform, it must immediately take steps to remove the problematic content – the so-called notice-and-takedown procedure. Gábor Polyák said that notice-and-takedown procedures are adequate for removing any sort of illegal content, including false medicine advertisements. However, it is an open question whether the EU will find the method sufficient in the future; the bloc might expect algorithmic filtering, which leads down a “dangerous path”.

According to Dr. Pásztor, removing content must be a last resort based on transparent rules. Other tools, such as warning the author of the content in question, flagging content or improving media literacy skills, could prove effective in ensuring that internet users receive credible information.

MEP Anna Donáth told Political Capital that online platforms must play a role in the fight against disinformation, but they must do so transparently in terms of decision-making and algorithms, and there must be a body to turn to for legal remedies. The Institute of Media Studies is of the opinion that the relationship between freedom of speech and disinformation “poses a challenge to all legislators, both on the national and EU level.” In its view, non-factual statements are part of public discourse, and “as such they cannot be excluded from the scope of the freedom of speech simply due to their lack of factuality,” so their presence in public discourse can only be restricted based on strict criteria.

Overall, Hungarian actors generally caution against removing too wide a range of content online, favouring an approach based on multiple methods (e.g. media literacy training, flagging), transparency on the side of online platforms, and the opportunity to seek legal remedies against content moderation decisions.

Poland

Work on the DSA has been closely observed in Poland by non-governmental organizations dealing with the right to privacy in a digitalized society and with civil liberties on the internet, such as the Panoptykon Foundation and the Digital Center Foundation (Fundacja Centrum Cyfrowe).

They recommend no longer treating internet platforms as so-called hosting providers, because their business model changed long ago. They also conclude that the self-regulation model has failed.

In their opinion, it is necessary to legally define the entities on the digital market, and in particular to specify which of them are considered dominant, since those should be subject to additional obligations.

On content moderation, the NGOs argue that the problem would be partially solved by transferring control over moderation rules to the community or to public trust entities.

They propose keeping platforms responsible for infringing content, but only on the basis of actual knowledge of the infringement and a clear notification from a user or authorized body.

"We are unquestionably in favour of not requiring platforms to monitor user content. Exclusion of responsibility for content cannot, however, mean exemption from the responsibility for own actions of Internet platforms, e.g. within the framework of content moderation policy," emphasized the president of Panoptykon Foundation Katarzyna Szymielewicz.

The point is also to regulate only content prohibited by law, without giving internet platforms the ability to restrict freedom of speech. According to the NGOs, the system for notifying illegal content must be transparent and efficient, and users must have a clear appeal procedure available.

Polish MEP Róża Thun (Civic Coalition, EPP) argues that “where platforms actively manage content and promote certain controversial or polarized content, they should be more responsible than before, because this leads to certain social impacts. Nor can it be that some companies make money from inciting aggression or hate and spreading false information online.”

In turn, Magdalena Piech, chair of the European Tech Alliance (EUTA), which includes Allegro, the largest Polish online retail platform, pointed out that ordering total, top-down monitoring of content posted by users may have the opposite of the intended effect: by “falsely positive” verification of some content as illegal, it would in reality limit users’ rights while also generating high costs for the platforms.

"The principle limited liability would be better, which would allow platforms to develop and at the same time prevent violations of fundamental rights. If the platforms were to become automatically responsible for the content they host, they would likely take an overly prudent approach," said EUTA chair.

The NGOs also advocate that European law should guarantee full transparency of ad targeting and accountability for how automated decision-making works.

"Additional obligations should be introduced in relation to Internet platforms - and not only larger ones or those that are considered dominant - in the sphere of audit, human oversight of transparency and due process (i.e. guaranteeing fair procedure and the right to human intervention) when the platform uses algorithms to moderate, filter or select content displayed to users", indicated the president of the Digital Center Foundation Alek Tarkowski.

MEP Róża Thun has submitted several dozen amendments to the European Parliament’s report on the DSA, many of which relate to this issue. “I suggested that targeted advertisements should be more strictly regulated and in some cases prohibited. Behavioural advertising based on certain characteristics, i.e. exposing mental or physical problems, should not be allowed at all. In other situations, if users are to receive targeted advertising, they should be aware of this and give prior consent. The issue of artificial intelligence is very widely discussed in the report. It is very important that its use does not lead to discrimination,” said the Polish MEP.

The discussion on the DSA also raises the issue of platforms’ responsibility for the products offered by their users – above all, for whether they are counterfeit and whether they comply with applicable standards and laws.

"It is different to be responsible for the content posted by users – hate-speech, homophobia, online aggression, etc. - and it is different to be responsible for dangerous or illegal products posted by business customers. In the first case, moderation is needed and law enforcement authorities should be notified as soon as possible and in the second case, platforms must have the data of companies that sell suspicious products and make them available, if requested, to the relevant authorities and consumers,” says Róża Thun.

Magdalena Piech, on the other hand, points out that “in this context, while primary liability for illegal activity should remain with the user, we recognise that online platforms have a responsibility to address illegal content online.” “The future framework should consider the new and diverse actors that have emerged in the online space and update the definition of information society service providers and hosting service providers so that it is clear which kind of services are covered by the law. Any new rules should be technology-neutral to be able to adapt to future developments,” said the EUTA chair.