
Analysing Algorithms: Bosnian Media Complain of Facebook Guessing Game

Illustration: BIRN/Igor Vujicic

Media depend on Facebook to reach readers, but negotiating its algorithms and content moderation can be a full-time job, editors and journalists in Bosnia and Herzegovina complain.

Media outlets in Bosnia and Herzegovina have a headache, and part of the cause is Facebook.

As anywhere else around the world, the social media giant, owned by Meta, is a vital means for Bosnian media outlets to reach readers.

But editors and journalists in Bosnia tell BIRN they struggle with its inconsistency in content moderation and a lack of transparency about its algorithms. And when they call Facebook for clarification, too often they are left hanging.

“The same content is treated differently on two different Facebook profiles,” said one in an anonymous response to a BIRN survey of newsrooms, referring to the same text posted on the Facebook pages of two different media. “On one, it’s coloured orange [denoting semi-restricted content]. On another, it’s green, without any warnings or restrictions.”

“It’s confusing, and the procedure lacks transparency,” said another. “There’s no explanation; analysing algorithms comes down to experience.”

Analysing algorithms

Posting content that is deemed to violate Facebook rules can have far-reaching consequences for small media outlets, which rely on the platform’s sheer scale to reach an audience and attract advertisers. Repeatedly having content flagged as false or misleading can result in an outlet’s visibility being reduced, or in it being locked out altogether.

Meta’s website states: “Pages, groups, accounts and websites that repeatedly share misinformation will face some restrictions, including having their distribution reduced. This includes content rated False or Altered by fact-checking partners; content that is nearly identical to what fact-checkers have debunked as False or Altered…”

But even content that makes the grade must negotiate complex algorithms that push, promote or suppress visibility, determining which feeds it reaches and how often. Exactly how these algorithms work is kept under wraps, and they are constantly changing.

Experimentation is the only way to get anywhere close to figuring them out, said social media expert Haris Alisic.

“Algorithms are a business secret of every company in the IT industry,” Alisic told BIRN. “That’s their competitive advantage… And, of course, there is no significant transparency around it. They do share some stuff, but it’s very general. Based on that, it is hard to figure out how the algorithm really works.”


Social media expert Haris Alisic. Photo: Courtesy of Haris Alisic.

According to Facebook, its algorithm uses “hundreds of signals” to make a prediction as to how likely a user is to engage with a post. One of the signals is: ‘Who posted the story?’

Based on these signals, the algorithm creates a “relevancy score” – “our best guess at how meaningful you will find this story”.
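Facebook does not publish the formula behind this score, so any concrete example is necessarily a guess. The sketch below is a purely illustrative Python toy showing how weighted “signals” might be combined into a single relevancy number for one post; the signal names and weights are hypothetical and are not taken from Meta’s documentation.

```python
# Purely illustrative toy, not Facebook's actual ranking code: the real
# algorithm and its signals are proprietary. Signal names and weights here
# are hypothetical, chosen only to show the general idea of combining many
# per-post "signals" into one relevancy score.

def relevancy_score(signals: dict, weights: dict) -> float:
    """Weighted sum of a post's signals, as seen by one user."""
    return sum(weights.get(name, 0.0) * value for name, value in signals.items())

# Hypothetical signals for a single post in a single user's feed
post_signals = {
    "follows_page": 1.0,      # the user follows the page that posted it
    "past_engagement": 0.6,   # how often the user engaged with this page before
    "recency": 0.9,           # how fresh the post is
    "predicted_share": 0.2,   # a model's guess that the user would share it
}

# Hypothetical weights; in reality these would be learned and constantly tuned
weights = {"follows_page": 2.0, "past_engagement": 1.5, "recency": 1.0, "predicted_share": 3.0}

print(relevancy_score(post_signals, weights))  # higher score => shown higher in the feed
```

In this toy, a fresh post from a page the user follows and engages with often would outscore a stale post from an unfamiliar page, which is the general behaviour the “relevancy score” description implies.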

Facebook, however, is constantly tweaking its algorithm in response to a range of factors, including news events. Keeping up is a matter of trial and error.

Moderating content

In Bosnia, hate speech, harassment and incitement to violence remain major issues almost three decades after the end of the 1992-95 war. This is reflected in the media, and online.

Most cases of digital rights violations in Bosnia identified in BIRN’s Annual Digital Rights Report 2022 concerned “breaches related to reputation, endangering security, discrimination and hatred, and pressures on individuals because of publishing information on the internet, among others”.

But social media companies on the whole lack detailed understanding of each society in which they operate.

According to a 2022 report on content moderation in Bosnia, published by Article 19, “87% of Facebook’s spending on misinformation goes to English-language content, despite the fact that only 9% of its users are English speaking”.

“It has also been revealed that most resources and means in terms of content moderation are being allocated to a limited number of countries,” the report adds.

Content moderation becomes a problem for media outlets if it is inconsistent or ignorant of local context.

“In their policy explanations, Facebook names different reasons why certain content can be flagged, from hate speech to disinformation,” said another media worker surveyed. 

“But in reality, we could see this kind of content still not removed or marked in any way, so the reader is aware of it. The responsibility of flagging the content is somehow left more to the users [individuals and organisations] and less to the company [Facebook], which is inviting people to its platform to use it but yet doesn’t protect them from this kind of content.”

In order to weed out misinformation and disinformation, Facebook relies on local fact-checkers; in the case of Bosnia, one of its partners is Raskrinkavanje, a team of 14 with backgrounds in media, political sciences, international relations and human rights.

Raskrinkavanje flags content, but it falls to Facebook to limit its reach. Media outlets need to check what they’re posting, said Elma Muric, communications editor at Raskrinkavanje.

“Journalists are not only obliged to disseminate correct information but also to stick to the ethics of journalism and fairly and impartially report about happenings and phenomena in the world,” Muric told BIRN. 

“Journalists who do follow the professional and ethical standards in their reporting, for sure, will not publish anything that could be flagged as disinformation or fake news.”


Meta’s office in Paris. Photo: Courtesy of Haris Alisic.

Mixed experiences

Besides their headaches in trying to figure out algorithms, some editors and journalists say they also struggle to get answers from Meta itself on any number of issues.

“Tried twice; they were slow and inefficient,” said one. “Didn’t really fix the problem I was having.”

Another said: “Enable simpler contact with Meta and easier access to information, especially algorithmic changes and recommendations.”

“We need clearer procedures regarding algorithm rules,” the journalist added. “There is no alternative, especially since we keep talking about the importance they [Facebook] have regarding the public debate and online participation.”

Not everyone shares such frustration. One journalist who took part in the survey told BIRN about their experience resolving an attempted scam:

 “I reached out to the Meta team and in a very short time I had a phone call with one of their crew, who checked from their side, and besides assuring us that it was a scam, not a real note from them, they shared advice on what to do to protect ourselves immediately, and how to recognise in the future if something is sent from there, or if it’s again some scam.”

Alisic described his own experience dealing with Facebook as “amazing”.

“We had a very close relationship. We still do,” he said. “They were always very professional and very quick to respond. They have very hard-working people.”

Given Facebook’s sheer size, inevitably there will be issues with communication, he said.

“You have to understand that there are tens of millions of pages on Facebook. It is literally impossible for them to be able to respond to everything timely, quickly, or personally, so just like in every other company, I guess they have to prioritise.”

BIRN contacted Meta for a response to this story, but received no reply.
