‘Who Benefits?’ Inside the EU’s Fight over Scanning for Child Sex Content

In early May 2022, days before she launched one of the most contentious legislative proposals Brussels had seen in years, the European Union’s home affairs commissioner, Ylva Johansson, sent a letter to a US organisation co-founded in 2012 by the movie stars Ashton Kutcher and Demi Moore.

The organisation, Thorn, develops artificial intelligence tools to scan for child sexual abuse images online, and Johansson’s proposed regulation is designed to fight the spread of such content on messaging apps.

“We have shared many moments on the journey to this proposal,” the Swedish politician wrote, according to a copy of the letter addressed to Thorn executive director Julie Cordua and which BIRN has seen.

Johansson urged Cordua to continue the campaign to get it passed: “Now I am looking to you to help make sure that this launch is a successful one.”

That campaign faces a major test in October when Johansson’s proposal is put to a vote in the Civil Liberties Committee of the European Parliament. It has already been the subject of heated debate.

The regulation would obligate digital platforms – from Facebook to Telegram, Signal to Snapchat, TikTok to clouds and online gaming websites – to detect and report any trace of child sexual abuse material, CSAM, on their systems and in their users’ private chats.

It would introduce a complex legal architecture reliant on AI tools – deployed via so-called ‘client-side scanning’ – to detect images, videos and speech containing sexual abuse against minors, as well as attempts to groom children.

Welcomed by some child welfare organisations, the regulation has nevertheless been met with alarm from privacy advocates and tech specialists who say it will unleash a massive new surveillance system and threaten the use of end-to-end encryption, currently the ultimate way to secure digital communications from prying eyes.

The EU’s top data protection watchdog, Wojciech Wiewiorowski, warned Johansson about the risks in 2020, when she informed him of her plans.

They amount to “crossing the Rubicon” in terms of the mass surveillance of EU citizens, he said in an interview for this story. It “would fundamentally change the internet and digital communication as we know it.”

Johansson, however, has not blinked. “The privacy advocates sound very loud,” the commissioner said in a speech in November 2021. “But someone must also speak for the children.”

Based on dozens of interviews, leaked documents and insight into the Commission’s internal deliberations, this investigation connects the dots between the key actors bankrolling and organising the advocacy campaign in favour of Johansson’s proposal and their direct links with the commissioner and her cabinet.

It’s a relationship that granted certain stakeholders – AI firms and advocacy groups with significant financial backing – a questionable level of influence over the crafting of EU policy.

The proposed regulation is excessively “influenced by companies pretending to be NGOs but acting more like tech companies”, said Arda Gerkens, former director of Europe’s oldest hotline for reporting online CSAM.

“Groups like Thorn use everything they can to put this legislation forward, not just because they feel that this is the way forward to combat child sexual abuse, but also because they have a commercial interest in doing so.”

If the regulation undermines encryption, it risks introducing new vulnerabilities, critics argue. “Who will benefit from the legislation?” Gerkens asked. “Not the children.”

Privacy assurances ‘deeply misleading’


The Action Day promoted by Brave Movement in front of the EP. Photo: Justice Initiative

Star of That ‘70s Show and a host of Hollywood hits, 45-year-old Kutcher resigned as chairman of the Thorn board in mid-September amid uproar over a letter he wrote to a judge in support of convicted rapist and fellow That ‘70s Show actor Danny Masterson, prior to his sentencing.

Up until that moment, however, Kutcher had for years been the very recognisable face of a campaign to rid the Internet of CSAM, a role that involved considerable access to the top brass in Brussels.

Thorn’s declarations to the EU transparency register list meetings with senior members of the cabinets of top Commission officials with a say in the bloc’s security or digital policy, including Johansson, antitrust czar Margrethe Vestager, Commission Vice-President Margaritis Schinas, and internal market commissioner Thierry Breton.

In November 2020, it was the turn of Commission President Ursula von der Leyen, who was part of a video conference with Kutcher and an organisation registered in the small Dutch town of Lisse – the WeProtect Global Alliance.

Though registered in the EU lobby database as a charity, Thorn sells its AI tools on the market for a profit; since 2018, the US Department of Homeland Security, for example, has purchased software licences from Thorn for a total of $4.3 million.

These tools are used by companies such as Vimeo, Flickr and OpenAI – the creator of chatbot ChatGPT and one of many beneficiaries of Kutcher’s IT investments – and by law enforcement agencies across the globe.

In November 2022, Kutcher and Johansson lined up as key speakers at a summit organised and moderated by then European Parliament Vice President Eva Kaili, who three weeks later was arrested and stripped of her post amid an investigation into the ‘Qatargate’ cash-for-lobbying scandal.

In March this year, six months before his resignation amid uproar over his letter of support for Masterson, Kutcher addressed lawmakers in Brussels, seeking to appease concerns about the possible misuse and shortcomings of the existing technology. Technology can scan for suspicious material without violating privacy, he said, a claim that the European Digital Rights association said was “deeply misleading”.

The Commission has been reluctant to detail the relationship between Thorn and Johansson’s cabinet under the EU’s freedom of information mechanism. It refused to disclose Cordua’s emailed response to Johansson’s May 2022 letter or a ‘policy one pager’ Thorn had shared with her cabinet, citing Thorn’s position that “the disclosure of the information contained therein would undermine the organisation’s commercial interest”.

After seven months of communication concerning access to documents and the intervention of the European Ombudsman, in early September the Commission finally released a series of email exchanges between Johansson’s Directorate-General for Migration and Home Affairs and Thorn.

The emails reveal a continuous and close working relationship between the two sides in the months following the roll out of the CSAM proposal, with the Commission repeatedly facilitating Thorn’s access to crucial decision-making venues attended by ministers and representatives of EU member states.

The European Ombudsman is looking into the Commission’s refusal to grant access to a host of other internal documents pertaining to Johansson’s proposal.

FGS Global, a major lobbying firm hired by Thorn and paid at least 600,000 euros in 2022 alone, said Thorn would not comment for this story. Johansson also did not respond to an interview request.

Enter ‘WeProtect Global Alliance’


Photo: Courtesy of Solomon.

Among the few traces of Thorn’s activities in the EU’s lobby transparency register is a contribution of 219,000 euros in 2021 to the WeProtect Global Alliance, the organisation that had a video conference with Kutcher and Von der Leyen in late 2020.

WeProtect is the offspring of two governmental initiatives – one co-founded by the Commission and the United States, the other by Britain.

They merged in 2016 and, in April 2020, as momentum built for legislation to tackle CSAM with client-side scanning technology, WeProtect was transformed from a British government-funded entity into a putatively independent ‘foundation’ registered at a residential address in Lisse, on the Dutch North Sea coast.

Its membership includes powerful security agencies, a host of governments, Big Tech managers, NGOs, and one of Johansson’s most senior cabinet officials, Antonio Labrador Jimenez, who heads the Commission’s team tasked with fighting CSAM.

Minutes after the proposed regulation was unveiled in May last year, Labrador Jimenez emailed his Commission colleagues: “The EU does not accept that children cannot be protected and become casualties of policies that put any other values or rights above their protection, whatever these may be.”

He said he was looking forward to “seeing many of you in Brussels during the WeProtect Global Alliance summit” the following month.

Labrador Jimenez officially joined the WeProtect Policy Board in July 2020, after the Commission decided to join and fund it as “the central organisation for coordinating and streamlining global efforts and regulatory improvements” in the fight against CSAM. WeProtect public documents, however, show Labrador Jimenez participating in WeProtect board meetings in December 2019.

Commenting on this story, the Commission said Labrador Jimenez “does not receive any kind of compensation for his participation in the WeProtect Global Alliance Management Board, and performs this function as part of his duties at the Commission”.

Labrador Jimenez’s position on the WeProtect Board, however, raises questions about how the Commission uses its participation in the organisation to promote Johansson’s proposal.

When Labrador Jimenez briefed fellow WeProtect Board members about the proposed regulation in July 2022, notes from the meeting show that “the Board discussed the media strategy of the legislation”.

Labrador Jimenez has also played a central role in drafting and promoting Johansson’s regulation, the same proposal that WeProtect is actively campaigning for with EU funding. And next to him on the board sits Thorn’s Julie Cordua, as well as government officials from the US and Britain [the latter currently pursuing its own Online Safety Bill], Interpol, and United Arab Emirates colonel, Dana Humaid Al Marzouqi, who chairs or participates in numerous international police task forces.

Between 2020 and 2023, Johansson’s Directorate-General awarded almost 1 million euros to WeProtect to organise the June 2022 summit in Brussels, which was dedicated to the fight against CSAM and activities to enhance law enforcement collaboration.

WeProtect did not reply directly to questions concerning its funding arrangements with the Commission or to what extent its advocacy strategies are shaped by the governments and stakeholders sitting on its policy board.

In a statement, it said it is led “by a multi-stakeholder Global Policy Board; members include representatives from countries, international and civil society organisations, and the technology industry.”

The financing


Photo: Courtesy of Solomon.

Another member of the WeProtect board alongside Labrador Jimenez is Douglas Griffiths, a former official of the US State Department and currently president of the Geneva-based Oak Foundation, a group of philanthropic organisations around the world providing grants “to make the world a safer, fairer, and more sustainable place to live”.

Oak Foundation has provided WeProtect with “generous support for strategic communications”, according to WeProtect financial statements from 2021.

From Oak Foundation’s annual financial reports, it is clear it has a long-term commitment to aiding NGOs tackling child abuse. It is also funding the closely linked network of civil society organisations and lobby groups promoting Johansson’s proposed regulation, many of which have helped build an umbrella entity called the European Child Sexual Abuse Legislation Advocacy Group, ECLAG.

ECLAG, which launched its website a few weeks after Johansson’s proposal was announced in May 2022, acts as a coordination platform for some of the most active organisations lobbying in favour of the CSAM legislation. Its steering committee includes Thorn and a host of well-known children’s rights organisations such as ECPAT, Eurochild, Missing Children Europe, Internet Watch Foundation, and Terre des Hommes.

Another member is Brave Movement, which came into being in April 2022, a month before Johansson’s regulation was rolled out, thanks to a $10.3 million contribution from the Oak Foundation to Together for Girls, a US-based non-profit that fights sexual violence against children.

Oak Foundation has also given to Thorn – $5 million in 2019. In 2020, it gave $250,000 to ECPAT to engage “policy makers to include children’s interests in revisions to the Digital Services Act and on the impact of end-to-end encryption” and a further $100,000 in support of efforts to end “the online child sexual abuse and exploitation of children in the digital space”. The same year it authorised a $990,000 grant to Eurochild, another NGO coalition that campaigns for children’s rights in Brussels.

In 2021, Oak Foundation gave Thorn a further $250,000 to enhance its coordinating role in Brussels with the aim of ensuring “that any legislative solutions and instruments coming from the EU build on and enhance the existing ecosystem of global actors working to protect children online”.

In 2022, the foundation granted ECPAT a three-year funding package of $2.79 million “to ensure that children’s rights are placed at the centre of digital policy processes in the European Union”. The WeProtect Global Alliance received $2.33 million, also for three years, “to bring together governments, the private sector, civil society, and international organisations to develop policies and solutions that protect children from sexual exploitation and abuse online”.

In a response for this story, Oak Foundation said it does not “advocate for proposed legislation nor work on the details of those policy recommendations”.

It did not respond directly to questions concerning the implications of Johansson’s regulation on privacy rights. A spokesperson said the foundation supports organisations that “advocate for new policies, with a specific focus in the EU, US, and UK, where opportunities exist to establish precedent for other governments”.

‘Divide and conquer’

Brave Movement’s internal advocacy documents lay out a comprehensive strategy for utilising the voices of abuse survivors to leverage support for Johansson’s proposal in European capitals and, most importantly, within the European Parliament, while targeting prominent critics.

The organisation has enjoyed considerable access to Johansson. In late April 2022, it hosted the Commissioner in an online ‘Global Survivors Action Summit’ – a rare feat in the Brussels bubble for an organisation that was launched just weeks earlier.

An internal strategy document from November 2022 leaves no doubt about the organisation’s role in rallying support for Johansson’s proposal.

“The main objective of the Brave Movement mobilisation around this proposed legislation is to see it passed and implemented throughout the EU,” it states.

“If this legislation is adopted, it will create a positive precedent for other countries… which we will invite to follow through with similar legislation.”

In April this year, the Brave Movement held an ‘Action Day’ outside the European Parliament, where a group of survivors of online child sexual abuse were gathered “to demand EU leaders be brave and act to protect millions of children at risk from the violence and trauma they faced”.

Johansson joined the photo-op.

Survivors of such abuse are key to the Brave Movement’s strategy of winning over influential MEPs.

“Once the EU Survivors taskforce is established and we are clear on the mobilised survivors, we will establish a list pairing responsible survivors with MEPs – we will ‘divide and conquer’ the MEPs by deploying in priority survivors from MEPs’ countries of origin,” its advocacy strategy reads.

According to the Brave Movement strategy, conservative Spanish MEP Javier Zarzalejos, the lead negotiator on the issue in the parliament, has called for “strong survivors’ mobilisation in key countries like Germany”.

Brave Movement’s links with the Directorate-General for Migration and Home Affairs go deeper still: its Europe campaign manager, Jessica Airey, worked on communications for the Directorate-General between October 2022 and February 2023, promoting Johansson’s regulation.

According to her LinkedIn profile, Airey worked “closely with the policy team who developed the [child sexual abuse imagery] legislation in D.4 [where Labrador Jimenez works] and partners like Thorn”.

She also “worked horizontally with MEPs, WeProtect Global Alliance, ECPAT”.

Asked about a possible conflict of interest in Airey’s work for Brave Movement on the same legislative file, the European Commission responded that Airey was appointed as a trainee and so no formal permission was required. It did say, however, that “trainees must maintain strict confidentiality regarding all knowledge acquired during training. Unauthorised disclosure of non-public documents or information is strictly prohibited, with this obligation extending beyond the training period.”

Brave Movement said it is “proud of the diverse alliances we have built and the expert team we have recruited, openly, to achieve our strategic goals”, pointing out that last year alone one online safety hotline received 32 million reports of child sexual abuse content.

Brave Movement has enlisted expert support: its advocacy strategy was drafted by UK consultancy firm Future Advocacy, while its ‘toolkit’, which aims to “build a beating drum of support for comprehensive legislation that protects children” in the EU, was drafted with the involvement of Purpose, a consultancy whose European branch is controlled by French Capgemini SE.

Purpose specialises in designing campaigns for UN agencies and global companies, using “public mobilisation and storytelling” to “shift policies and change public narratives”.

Beginning in 2022, the Oak Foundation gave Purpose grants worth $1.9 million to “help make the internet safer for children”.

Since April 2022, Purpose representatives have met regularly with ECLAG – the network of civil society groups and lobbyists – to refine a pan-European communications strategy.

Documents seen by this investigation also show they met with members of Johansson’s team.

A ‘BeBrave Europe Task Force’ meeting in January this year involved the ECLAG steering group, Purpose EU, Justice Initiative and Labrador Jimenez’s unit within the Directorate-General. In 2023 the foundation that launched the Justice Initiative, the Guido Fluri Foundation, received $416,667 from Oak Foundation.

The Commission, according to its own notes of the meeting, “recommended that when speaking with stakeholders of the negotiation, the organisations should not forget to convey a sense of urgency on the need to find an agreement on the legislation this year”.

This coordinated messaging resulted this year in a social media video featuring Johansson, Zarzalejos, and representatives of the organisations behind ECLAG promoting a petition in favour of her regulation.

Disproportionate infringement of rights

Some 200 kilometres north of Brussels, in the Dutch city of Amsterdam, a bright office on the edge of the city’s famous red light district marks the frontline of the fight to identify and remove CSAM in Europe.

‘Offlimits’, previously known as the Online Child Abuse Expertise Agency, or EOKM, is Europe’s oldest hotline for children and adults wanting to report abuse, whether happening behind closed doors or seen on video circulating online.

In 2022, its seven analysts processed 144,000 reports, 60 per cent of which concerned illegal content. The hotline sends requests to remove the content to web hosting providers and, if the material is considered particularly serious, to the police and Interpol.

Offlimits director from 2015 until September this year, Arda Gerkens is deeply knowledgeable about EU policy on the matter. Yet unlike Thorn, she had little luck accessing Johansson.

“I invited her here but she never came,” said Gerkens, a former Socialist Party MP in the Dutch parliament.

“Commissioner Johansson and her staff visited Silicon Valley and big North American companies,” she said. Companies presenting themselves as NGOs but acting more like tech companies have influenced Johansson’s regulation, Gerkens said, arguing that Thorn and groups like it “have a commercial interest”.

Gerkens said that the fight against child abuse must be deeply improved and involve an all-encompassing approach that addresses welfare, education, and the need to protect the privacy of children, along with a “multi-stakeholder approach with the internet sector”.

“Encryption,” she said, “is key to protecting kids as well: predators hack accounts searching for images”.

It’s a position reflected in concerns raised by the Netherlands in ongoing negotiations on a compromise text in the EU Council, where the Dutch have argued for a less intrusive approach that protects encrypted communication and targets only material already identified and designated as CSAM by monitoring groups and authorities.

A Dutch government official, speaking on condition of anonymity, said: “The Netherlands has serious concerns with regard to the current proposals to detect unknown CSAM and address grooming, as current technologies lead to a high number of false positives.”

“The resulting infringement of fundamental rights is not proportionate.”

Self-interest

In June 2022, shortly after the roll out of Johansson’s proposal, Thorn representatives sat down with a member of the commissioner’s cabinet, Monika Maglione. An internal report of the meeting, obtained for this investigation, notes that Thorn was keen to understand how “bottlenecks in the process that goes from risk assessment to detection order” would be dealt with.

Detection orders are a crucial component of the procedure set out within Johansson’s proposed regulation, determining the number of people to be surveilled and how often.

European Parliament sources say that in technical meetings, Zarzalejos, the rapporteur on the proposal, has argued in favour of detection orders that do not necessarily focus on individuals or groups of suspects, but are calibrated to allow scanning for suspicious content.

This, experts say, would unlock the door to the general monitoring of EU citizens, otherwise known as mass surveillance.

Asked to clarify his position, Zarzalejos’ office responded: “The file is currently being discussed closed-doors among the shadow rapporteurs and we are not making any comments so far”.

In the same meeting with Maglione, Thorn representatives expressed a “willingness to collaborate closely with COM [European Commission] and provide expertise whenever useful, in particular with respect to the creation of the database of indicators to be hosted by the EU Centre” as well as to prepare “communication material on online child sexual abuse”.

The EU Centre to Prevent and Combat Child Sexual Abuse, which would be created under Johansson’s proposal, would play a key role in helping member states and companies implement the legislation; it would also vet and approve scanning technologies, as well as purchase and offer them to small and medium companies.

As a producer of such scanning technologies, a role for Thorn in supporting the capacity building of the EU Centre database would be of significant commercial interest to the company.

Meredith Whittaker, president of the Signal Foundation, the US not-for-profit behind the Signal encrypted chat application, says that AI companies that produce scanning systems, sensing the market potential, are effectively promoting themselves as clearing houses and a liability buffer for big tech companies.

“The more they frame this as a huge problem in the public discourse and to regulators, the more they incentivise large tech companies to outsource their dealing of the problems to them,” Whittaker said in an interview for this story.

Effectively, such AI firms are offering tech companies a “get out of responsibility free card”, Whittaker said, by telling them: “You pay us (…) and we will host the hashes, we will maintain the AI system, we will do whatever it is to magically clean up this problem.”

“So it’s very clear that whatever their incorporation status is, that they are self-interested in promoting child exploitation as a problem that happens “online,” and then proposing quick (and profitable) technical solutions as a remedy to what is in reality a deep social and cultural problem. (…) I don’t think governments understand just how expensive and fallible these systems are, that we’re not looking at a one-time cost. We’re looking at hundreds of millions of dollars indefinitely due to the scale that this is being proposed at.”

Lack of scientific input


Photo by Alexas_Fotos/Pixabay

Johansson has dismissed the idea that the approach she advocates will unleash something new or extreme, telling MEPs last year that it was “totally false to say that with a new regulation there will be new possibilities for detection that don’t exist today”.

But experts question the science behind it.

Matthew Daniel Green, a cryptographer and security technologist at Johns Hopkins University, said there was an evident lack of scientific input into the crafting of her regulation.

“In the first impact assessment of the EU Commission there was almost no outside scientific input and that’s really amazing since Europe has a terrific scientific infrastructure, with the top researchers in cryptography and computer security all over the world,” Green said.

AI-driven scanning technology, he warned, risks exposing digital platforms to malicious attacks and would undermine encryption.

“If you touch upon built-in encryption models, then you introduce vulnerabilities,” he said. “The idea that we are going to be able to have encrypted conversations like ours is totally incompatible with these scanning automated systems, and that’s by design.”

In a blow to the advocates of AI-driven CSAM scanning, US tech giant Apple said in late August that it is impossible to implement CSAM-scanning while preserving the privacy and security of digital communications. The same month, UK officials privately admitted to tech companies that there is no existing technology able to scan end-to-end encrypted messages without undermining users’ privacy.

According to research by Imperial College academics Ana-Maria Cretu and Shubham Jain, published last May, AI-driven client-side scanning systems could be quietly tweaked to perform facial recognition on user devices without the user’s knowledge. They warned of more vulnerabilities that have yet to be identified.

“Once this technology is rolled out to billions of devices across the world, you can’t take it back”, they said.

Law enforcement agencies are already considering the possibilities it offers.

In July 2022, the head of Johansson’s Directorate-General, Monique Pariat, visited Europol to discuss the contribution the EU police agency could make to the fight against CSAM, in a meeting attended by Europol executive director Catherine de Bolle.

Europol officials floated the idea of using the proposed EU Centre to scan for more than just CSAM, telling the Commission, “There are other crime areas that would benefit from detection”. According to the minutes, a Commission official “signalled understanding for the additional wishes” but “flagged the need to be realistic in terms of what could be expected, given the many sensitivities around the proposal.”

Ross Anderson, professor of Security Engineering at Cambridge University, said the debate around AI-driven scanning for CSAM has overlooked the potential for manipulation by law enforcement agencies.

“The security and intelligence community have always used issues that scare lawmakers, like children and terrorism, to undermine online privacy,” he said.

“We all know how this works, and come the next terrorist attack, no lawmaker will oppose the extension of scanning from child abuse to serious violent and political crimes.”

This investigation was supported by a grant from the IJ4EU fund. It is also published by Die Zeit, Le Monde, De Groene Amsterdammer, Solomon, IRPI Media and El Diario.


