‘Predictive justice’ might just be the tool to accelerate Serbia’s sluggish justice system. But experts are warning of considerable dangers and say the public must have a say.
The wheels of justice in Serbia sorely need speeding up. But when President Aleksandar Vucic told reporters last month that it would be “very important” to introduce artificial intelligence into the courts, not everyone was reassured.
Vucic’s remark about ‘predictive justice’ and the advent of “new, real and important changes” came in the context of a year-end press conference covering the full gamut of government policy, so he did not dwell on the details.
Now some digital rights activists and legal experts are sounding the alarm, insisting the issue be put to full public debate, mindful that an extensive Chinese-built network of surveillance cameras was rolled out in the capital, Belgrade, in 2019, taking residents by surprise.
From identifying likely re-offenders to catching welfare fraudsters, predictive justice is a fast-growing phenomenon, alarming rights organisations that warn that such software can encourage racial profiling and discrimination and threaten privacy and freedom of expression.
Lawyer Djordje Krivokapic, co-founder of the Belgrade-based digital rights NGO SHARE Foundation, said AI has uses in courts in terms of case-management, automation and assistance in decision-making. But its introduction needs to be properly debated, he said.
“This represents a serious change in our society and some public debate and public discussion on this issue in general should be initiated regardless of the level at which it is discussed – except perhaps at the first level when some types of predictive algorithms are used in case-management to speed up the justice system and make it more efficient,” Krivokapic told BIRN.
He warned of the potential for discrimination. “Artificial intelligence and machine-learning algorithms have a lot of specifics that can lead to increased discrimination – or new forms of discrimination – and special attention must be paid to this.”
The justice ministry did not respond to a request for comment.
Uses and abuses
Serbia is already laying the ground for the use of AI in its public sector. In December 2019, the government adopted a strategy to develop the field over the period 2020-2025 and an Action Plan to enact the strategy was passed in June 2020.
Under the plan, the government will establish an Artificial Intelligence Council in the first quarter of this year. Neither document, however, discusses in detail the use of AI in the Serbian court system, which is notoriously slow and prone to political interference.
Besides a legislative framework, Serbia has also begun to automate case-storage and institutional communication in the judiciary.
Lawyer Milena Vasic of the Lawyers’ Committee for Human Rights-YUCOM said AI was becoming “a kind of inevitability in almost all areas of life”, and criminal justice could not be an exception.
“In particular, we should keep in mind simpler cases, such as the thousands of lawsuits against banks over loan processing costs that have practically buried the judiciary, or mass lawsuits that most often arise from a mistake by the state,” Vasic told BIRN.
“Certainly, the use of artificial intelligence could make it easier to manage such cases, but since it is software, we should also talk about potential abuses or artificially raising the number of resolved cases.”
Ana Toskic Cvetinovic, executive director of the NGO Partners for Democratic Change Serbia, also warned of the potential for discrimination against marginalised groups.
“Regarding the use of AI in the judiciary, it raises a number of other issues such as the impact on access to justice and the right to a fair trial, or free judicial conviction, even when AI is used to support decision-making, and especially if it is AI that would possibly replace judges,” Toskic Cvetinovic told BIRN.
Some forms of predictive justice simply cannot yet be applied in Serbia, YUCOM’s Vasic said.
“In our law, case law is still not a formal source of law and we have a lot of problems with harmonisation of case law,” she said. “What is crucial, however, is to harmonise the position on case law at the ‘human’ level before resolving cases with new methods involving artificial intelligence.”
“Such systems can be easily imagined in countries of the common law system,” Vasic noted, but “even there they suffer serious criticism for violating the right of citizens to a fair trial and are still in the so-called test phase.”
Lack of transparency
Part of a map of smart cameras in Belgrade, showing the city centre. Screenshot: hiljade.kamera.rs
Serbia is already pursuing greater automation in other areas, such as parking enforcement in Belgrade.
In August last year, authorities went live with a system named ‘Falcon Eye’ involving 20 specialised cars equipped with cameras that can identify improperly parked cars and take photos, resulting in fines for the registered owners sent by post. Then there’s the Chinese ‘Safe City’ network of surveillance cameras with the potential for licence plate recognition and facial recognition.
There has been little or no public debate about the use of such technology, the introduction of which has been criticised as lacking transparency.
Toskic Cvetinovic warned that the “flaws” of AI would be magnified in Serbia given the country’s poor record of protecting human rights.
“In addition, the protection of citizens’ privacy has so far not been in focus when planning or implementing projects that involve mass processing of personal data,” Toskic Cvetinovic told BIRN.
“What worries me most is the fact that the most flagrant violations of this right came from institutions that have public functions, so the trust in new similar projects has been shaken, and with good reason.”
“There is no transparency in decision-making, nor any wider social discussion about whether we need such projects, what their advantages are and what the possible consequences might be. A special question is: who manages these systems? How are they protected? Can they be abused?”
Krivokapic of SHARE Foundation said Serbia does not have the proper means of monitoring how such technology is used.
“We don’t have state bodies… that do any monitoring of the success in implementing the information system in the public sector, and in general all those tools that are procured, paid for and so on. There is no monitoring,” he said.