EU Set to Take on Big Tech with New Digital Services Act

Over the past two decades, the process of digitisation has completely transformed the European services sector, though EU legislation regulating the provision of those services has not kept up with the fast-changing technological environment. With consensus among European policymakers that the 20-year-old piece of legislation, the e-Commerce Directive, was in dire need of updating, the European Commission announced in January 2020 that it would pass a new Digital Services Act by the end of 2020. That date, expected to be December 2, is rapidly approaching.

With this brand-new set of regulations governing the EU’s digital market, the Commission intends to clarify and introduce new liability rules for digital services and to ensure a more competitive digital market in which even small and medium-sized enterprises (SMEs) can compete with the more established players.

Policymakers in the EU, which is already home to the world’s strictest data privacy laws, believe that Europe is in a unique position to set new standards for the regulation of digital services for the whole world. The forthcoming rules represent an unprecedented strike against the seemingly limitless power of big tech companies, which are likely to oppose the reforms.

A close-up image shows the slogan of the ‘StopHateForProfit’ campaign on the organization’s website displayed on a smartphone screen in Cologne, Germany, 29 June 2020. EPA-EFE/SASCHA STEINBACH

What new rules are coming?  

Although the final contours of the legislative package are not yet public knowledge, it is expected that the regulation will come in two legislative proposals. The first set of proposals contained in the Digital Services Act will likely focus on updating digital services providers’ responsibilities and liabilities. The Digital Markets Act will then likely be concerned with limiting the power of big platforms in general.

In a recent speech, Executive Vice-President of the Commission Margrethe Vestager said that digital media platforms need to be more transparent about the way they shape the digital world that we see.

“They’ll have to report on what they’ve done to take down illegal material. They’ll have to tell us how they decide what information and products to recommend to us, and which ones to hide – and give us the ability to influence those decisions, instead of simply having them made for us. And they’ll have to tell us who’s paying for the ads that we see, and why we’ve been targeted by a certain ad,” Vestager said earlier this year.

Although it is not yet clear which specific platforms will be targeted, it is widely expected that the new rules will mainly apply to social media platforms with more than 2 million users, which have, until now, bitterly resisted attempts to make them disclose their algorithms.

“Platforms need to ensure that their users can be protected from illegal goods and content online, by putting in place the right processes to react swiftly to illegal activities, and to cooperate with law enforcement authorities more effectively,” the Commission’s press officer for the digital economy, Charles Manoury, told BIRN in an email.

When asked about the concrete rules being considered in Brussels, Manoury said that the Commission will “aim to harmonise a clear set of obligations (responsibilities) for online platforms, including notice-and-action procedures, redress, transparency and accountability measures, and cooperation obligations.”

In a report produced by the European Parliamentary Research Service in October, EU experts came up with the following recommendations for the Commission:

  1. Introduce clear, standardised notice-and-action procedures to deal with illegal and harmful content;
  2. Enhance transparency on content curation and introduce reporting obligations for platforms;
  3. Establish out-of-court dispute settlement on content management, particularly on notice-and-action procedures.

Those policy recommendations are strikingly similar to the rules already in effect in the country currently holding the Presidency of the Council of the EU – Germany.

A Google logo is displayed at the Google offices in Berlin, Germany, 24 June 2019. EPA-EFE/HAYOUNG JEON

German lessons

 “The Commission in its impact assessments takes into account already existing EU laws, such as the NetzDG,” noted the Commission’s spokesman Manoury, referring to the Network Enforcement Act, which was passed by the German parliament back in 2017.

According to the website of the German Ministry of Justice and Consumer Protection, the law aims to fight hate crime and to criminally punish fake news and other unlawful content on social networks more effectively. This includes insults, malicious gossip, defamation, public incitement to crime, incitement to hatred, disseminating portrayals of violence and threatening the commission of a felony.

In practice, all social media platforms (with more than 2 million users) that are accessible in Germany are obliged to take down or block access to “manifestly unlawful content” within 24 hours of receiving a complaint. They also have to offer their users an accessible procedure for reporting criminally punishable content and take “immediate notice” of any content that might violate German criminal law.

But German lawmakers didn’t stop there. In June this year, the Bundestag decided to further tighten the laws against online hate speech by requiring social networks to report to the BKA (Federal Criminal Police Office) and to transmit some user data, such as IP addresses or port numbers, directly to the authorities.

Moreover, new rules will oblige operators of social networks to submit biannual reports on their handling of complaints about criminally punishable content. These reports must contain information on, for example, the volume of complaints and the decision-making practices of the network, as well as on the teams responsible for processing reported content. They must be made publicly available on the internet.

Social media platforms could be liable for fines of up to 50 million euros if they fail in their reporting duties, according to a statement from the Justice Ministry.

According to the German daily Stuttgarter Zeitung, nine social media platforms have so far published transparency reports: Facebook, Instagram, Twitter, YouTube, Reddit, TikTok, SoundCloud, Change.org and Google+. The number of complaints varies greatly. In the second half of 2019, Facebook received 4,274 complaints from unsatisfied users, compared with 843,527 on Twitter and 277,478 on YouTube. Facebook felt compelled to take action in almost a quarter of the cases, and 87 per cent of those posts, 488 in total, were deleted within 24 hours. Twitter acted on 16 per cent of the complaints, 86 per cent of which were removed from the network within a day, according to the German newspaper.

However, the new obligations have their critics. Some express concern that legal content will end up being deleted by overzealous platforms eager to avoid paying hefty fines, the so-called problem of “over-blocking”. In 2017, when the law was first passed by the German parliament, even journalism unions in Germany protested against it, fearing a new form of censorship.

Reacting to the criticisms, German Justice Minister Christine Lambrecht recently called for the introduction of a “counter-presentation procedure”, which would give authors of deleted content the right to ask social networks for a reassessment of their decision before any fines would be imposed.

There is also criticism that some of the proposed rules might even conflict with the German constitution. This particularly concerns the law to combat far-right extremism and hate crime, passed in the summer, which would force operators of social networks to report criminal content, such as threats of dangerous bodily harm or defamation of public figures (mayors or municipal councillors), to the Federal Criminal Police Office. It is because of those concerns that the president has not yet signed the law.

Long way to go

The German experience clearly shows that certain measures to combat the spread of hate speech and other forms of illegal content online are relatively easy to implement, while others, like direct reporting to the police, might take much longer to build a consensus around.

That being said, even when it comes to the seemingly more straightforward measures, the European Commission faces a far more challenging task. First of all, it needs to get all member states to agree on what even constitutes a hate crime on the internet. Then it has to create a set of rules that would be applicable across all member states.

According to a source in the European Commission familiar with the legislation, the first task is the easier one. “There is actually a very broad agreement across the EU on the question of illegal content. Basically, what is illegal offline is also illegal online – it is just a question of how you monitor it and what measures to take to make sure that the rules are followed also online,” the source, who wished to remain anonymous, told BIRN.

Whatever the rules that the Commission ends up proposing in early December, the speed of the final implementation of those measures will largely depend on the legal form of the rules.

Generally speaking, if the rules take the form of EU regulations, the final implementation might take a very long time, as regulations need unanimous agreement by all member states. If EU legislators decide to go with directives, which leave a lot of space for individual member states to translate them into their own respective national laws and don’t require unanimous agreement, things could go much faster.

According to the source from the Commission, half a year is an absolute minimum to expect the legislative process to take.

“If you have an extremely well-drafted piece of legislation that everyone agrees on, it can take half a year. I’ve never heard about anything going faster than this. It is already clear that this will not be very straightforward,” the source said.

Brussels Greenlights Contentious Media Sale in Central Europe

The European Commission on Wednesday approved the sale of Central European Media Enterprises, CME, to the PPF Group conglomerate, whose owner, the Czech Republic’s richest businessman, Petr Kellner, has been accused of acting as a proxy for China.

CME, majority-owned by AT&T, operates 30 television channels in five Central and East European markets.

Civil society groups earlier warned that the sale could boost China’s influence on the TV sector in Central and Eastern European countries where both groups are present. Concerns have also been raised over potential market concentration.

The EU executive body dismissed these objections, however. “Based on its market investigation, the Commission found that the transaction, as notified, would not impact the companies’ position in these markets,” a statement on its website said. 

“PPF and CME are both active in the acquisition of sports broadcasting rights in Czechia and Slovakia and in the sale of advertising space in Czechia,” the statement added. “In parallel, the two companies are active at different levels of the TV value chain,” it continued.

“CME is mainly active as a wholesale supplier of TV channels in a number of Member States, while PPF offers retail audio-visual and telecommunications services in Bulgaria, Czechia and Slovakia,” it asserted.

These elements pose no real risk to fair competition, the Commission went on, as “the companies generally do not compete for the acquisition of the same sports rights and the transaction would only lead to a limited increase in PPF’s existing share of the market.

“Similarly, PPF’s activities represent a negligible share and would not add significantly to CME’s position in the market for the sale of advertising space in Czechia.”

The sale will give Kellner’s group control over leading private television stations now owned by CME in Bulgaria, Romania, Slovenia and Slovakia. CME’s main channel in Slovakia, Markiza TV, is widely considered a rare independent television station in the country.

PPF already has interests in the audiovisual and telecommunications sectors in some of these countries.

The deal was signed in October last year. A PPF representative said on Wednesday that the group expected the purchase to be finalised on October 13.

Last February, US Republican Senator Marco Rubio, a known China hawk, urged the US authorities to launch “a full review of the national security implications” of the sale.

Rubio insisted that the deal would advance “the Chinese Communist Party’s political interference” in the countries where CME operates.

If the sale to PPF went ahead, Rubio observed, Kellner’s group would gain control of channels with a massive audience of 97 million people in Romania and Bulgaria alone, where CME owns ratings market leaders Pro TV and bTV.

China in July announced retaliatory sanctions on Rubio, fellow Republican Senator Ted Cruz and other US officials for their harsh criticism of its policies.

Kellner has often been accused of serving China’s interests in Czechia, where his PPF group has its base. These services are said to include whitewashing Beijing’s record through a paid propaganda campaign in the Czech media.
