The Commission is expected to unveil a set of new rules in early December that, it hopes, will pave the way for a more harmonised digital EU market and make Europe a global leader for rules governing online content moderation. The legislative process, however, is likely to be protracted.
Over the past two decades, digitisation has completely transformed the European services sector, yet the EU legislation regulating the provision of those services has not kept pace with the fast-changing technological environment. With consensus among European policymakers that the 20-year-old e-Commerce Directive was in dire need of updating, the European Commission announced in January 2020 that it would propose a new Digital Services Act by the end of 2020. That date, expected to be December 2, is rapidly approaching.
With this brand new set of regulations governing the EU’s digital market, the Commission intends to clarify and update the liability rules for digital services and to ensure a more competitive digital market in which even small and medium-sized enterprises (SMEs) can compete with the more established players.
Policymakers in the EU, which is already home to the world’s strictest data privacy laws, believe that Europe is in a unique position to set new standards for the regulation of digital services for the whole world. The forthcoming rules represent an unprecedented strike against the seemingly limitless power of the big tech companies, which are likely to oppose the reforms.
What new rules are coming?
Although the final contours of the legislative package are not yet public knowledge, it is expected to come in the form of two legislative proposals. The first, the Digital Services Act, will likely focus on updating digital service providers’ responsibilities and liabilities, while the second, the Digital Markets Act, will likely be concerned with limiting the power of big platforms in general.
In a recent speech, Executive Vice-President of the Commission Margrethe Vestager said that digital media platforms need to be more transparent about the way they shape the digital world that we see.
“They’ll have to report on what they’ve done to take down illegal material. They’ll have to tell us how they decide what information and products to recommend to us, and which ones to hide – and give us the ability to influence those decisions, instead of simply having them made for us. And they’ll have to tell us who’s paying for the ads that we see, and why we’ve been targeted by a certain ad,” Vestager said earlier this year.
Although it is not yet clear which specific platforms will be targeted, it is widely expected that the new rules will mainly apply to social media platforms with more than 2 million users, which have until now bitterly resisted attempts to make them disclose their algorithms.
“Platforms need to ensure that their users can be protected from illegal goods and content online, by putting in place the right processes to react swiftly to illegal activities, and to cooperate with law enforcement authorities more effectively,” the Commission’s press officer for the digital economy, Charles Manoury, told BIRN in an email.
When asked about the concrete rules being considered in Brussels, Manoury said that the Commission will “aim to harmonise a clear set of obligations (responsibilities) for online platforms, including notice-and-action procedures, redress, transparency and accountability measures, and cooperation obligations.”
In a report produced by the European Parliamentary Research Service in October, EU experts came up with the following recommendations for the Commission:
- Introduce clear, standardised notice-and-action procedures to deal with illegal and harmful content;
- Enhance transparency of content curation and introduce reporting obligations for platforms;
- Provide for out-of-court dispute settlement on content management, particularly on notice-and-action procedures.
Those policy recommendations are strikingly similar to the rules already in effect in the country currently holding the Presidency of the Council of the EU – Germany.
German lessons
“The Commission in its impact assessments takes into account already existing EU laws, such as the NetzDG,” noted the Commission’s spokesman Manoury, referring to the Network Enforcement Act, which was passed by the German parliament back in 2017.
According to the website of the German Ministry of Justice and Consumer Protection, the law aims to fight hate crime, criminally punishable fake news and other unlawful content on social networks more effectively. This includes insults, malicious gossip, defamation, public incitement to crime, incitement to hatred, the dissemination of portrayals of violence and threats to commit a felony.
In practice, all social media platforms (with more than 2 million users) that are accessible in Germany are obliged to take down or block access to “manifestly unlawful content” within 24 hours of receiving a complaint. They also have to offer their users an accessible procedure for reporting criminally punishable content and take “immediate notice” of any content that might violate German criminal law.
But German lawmakers didn’t stop there. In June this year, the Bundestag decided to tighten the laws against online hate speech further by requiring social networks to report criminal content to the BKA (Federal Criminal Police Office) and to transmit some user data, such as IP addresses or port numbers, directly to the authorities.
Moreover, the new rules will oblige operators of social networks to submit biannual reports on their handling of complaints about criminally punishable content. These reports must contain information on, for example, the volume of complaints and the decision-making practices of the network, as well as the teams responsible for processing reported content, and they must be made publicly available online.
Social media platforms could be liable for fines of up to 50 million euros if they fail in their reporting duties, according to a statement from the Justice Ministry.
According to the German daily Stuttgarter Zeitung, nine social media platforms have so far published transparency reports: Facebook, Instagram, Twitter, YouTube, Reddit, TikTok, SoundCloud, Change.org and Google+. The number of complaints varies greatly. In the second half of 2019, users filed 4,274 complaints with Facebook, 843,527 with Twitter and 277,478 with YouTube. Facebook felt compelled to take action in almost a quarter of its cases, and 87 per cent of the posts it acted on (488 in total) were deleted within 24 hours. Twitter acted on 16 per cent of the complaints it received, removing 86 per cent of those posts from the network within a day, the newspaper reported.
However, the new obligations have their critics. Some express concern that legal content will end up being deleted by overzealous platforms eager to avoid hefty fines, a problem known as “over-blocking”. In 2017, when the law was first passed by the German parliament, even journalism unions in Germany protested against it, fearing a new form of censorship.
Reacting to the criticisms, German Justice Minister Christine Lambrecht recently called for the introduction of a “counter-presentation procedure”, which would give authors of deleted content the right to ask social networks for a reassessment of their decision before any fines would be imposed.
There is also criticism that some of the new rules might even conflict with the German constitution. This particularly concerns the law to combat far-right extremism and hate crime, which was passed in the summer and is intended to force operators of social networks to report criminal content, such as threats of dangerous bodily harm or the defamation of public figures (mayors or municipal councillors), to the Federal Criminal Police Office. It is because of those concerns that Germany’s president has not yet signed the law.
Long way to go
The German experience clearly shows that certain measures to combat the spread of hate speech and other forms of illegal content online are relatively easy to implement, while others, like direct reporting to the police, might take much longer to build a consensus around.
That being said, even when it comes to the seemingly simpler measures, the European Commission’s task is a far more challenging one. First, it needs to get all member states to agree on what constitutes a hate crime on the internet; then it has to create a set of rules that can be applied across all of them.
According to a source in the European Commission familiar with the legislation, the first task is the easier one. “There is actually a very broad agreement across the EU on the question of illegal content. Basically, what is illegal offline is also illegal online – it is just a question of how you monitor it and what measures to take to make sure that the rules are followed also online,” the source, who wished to remain anonymous, told BIRN.
Whatever rules the Commission ends up proposing in early December, the speed of their final implementation will largely depend on their legal form.
Generally speaking, if the rules take the form of EU regulations, final implementation might take a very long time, as regulations need unanimous agreement among all member states. If EU legislators decide to go with directives, which leave individual member states considerable space to transpose the rules into their own national laws and don’t require unanimous agreement, things could move much faster.
According to the source from the Commission, half a year is the absolute minimum the legislative process can be expected to take.
“If you have an extremely well-drafted piece of legislation that everyone agrees on, it can take half a year. I’ve never heard about anything going faster than this. It is already clear that this will not be very straightforward,” the source said.