Predictive policing tool shows that even EU lawmakers can be targeted

Predictive policing has unmasked a new group of future criminals: MEPs.

A new test has flagged five EU politicians as ‘at risk’ of committing future crimes. Luckily for them, it’s not a tool used by law enforcement, but one designed to expose the dangers of such systems.

The project is the brainchild of Fair Trials, a criminal justice watchdog. The NGO is campaigning for a ban on predictive policing, which uses data analytics to forecast when and where crimes are likely to occur and who might commit them.

Proponents argue that the approach can be more accurate, objective and effective than traditional policing. But critics warn that it entrenches historical bias, disproportionately targets marginalized groups, amplifies structural discrimination and violates civil rights.
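The critics’ feedback-loop argument can be made concrete with a toy simulation. Everything below is an illustrative assumption (the neighborhoods, the crime rate, the patrol-allocation rule), not a model of any real police system:

```python
# A toy feedback-loop simulation (illustrative assumptions only, not any
# real police system): two neighborhoods share the SAME true crime rate,
# but B starts with more recorded arrests due to past over-policing.
import random

random.seed(42)

TRUE_CRIME_RATE = 0.05          # identical in both neighborhoods
arrests = {"A": 50, "B": 100}   # historical bias baked into the data
PATROLS_PER_DAY = 100

for day in range(365):
    total = arrests["A"] + arrests["B"]
    for hood in ("A", "B"):
        # The "predictive" model sends patrols where past arrests are highest.
        patrols = round(PATROLS_PER_DAY * arrests[hood] / total)
        # More patrols mean more crime is observed and recorded there,
        # even though the underlying rate never differed.
        arrests[hood] += sum(
            random.random() < TRUE_CRIME_RATE for _ in range(patrols)
        )

# B keeps being recorded as roughly twice as "risky" as A, and the absolute
# gap in arrest counts keeps growing: the historical bias never corrects.
print(arrests)
```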


“It might seem incredible that law enforcement and criminal justice authorities are making predictions about crime based on people’s background, class, ethnicity and associations, but this is the reality of what is happening in the EU,” said Griff Ferris, Senior Legal and Policy Officer at Fair Trials.

Indeed, the technology is increasingly popular in Europe. In Italy, for example, a tool known as Dalia has analyzed ethnicity data to profile people and predict future criminality. In the Netherlands, meanwhile, the so-called Top 600 list has been used to forecast which young people will commit high-impact crimes. One in three people on the list, many of whom have reported being harassed by police, were found to be of Moroccan origin.

To illustrate the impacts, Fair Trials developed a mock assessment of future criminal behavior.

Unlike many of the real systems used by police, the analysis is fully transparent. The test uses a questionnaire to profile each user. The more “yes” answers they give, the higher their risk score. You can try it yourself here.
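The article describes the scoring only at that high level (more “yes” answers mean a higher score), so the sketch below is a guess at the logic. The questions and the cut-offs for the risk bands are hypothetical, not Fair Trials’ actual questionnaire:

```python
# A minimal sketch of the yes/no scoring the test describes. The questions
# and risk-band cut-offs are hypothetical, not Fair Trials' actual quiz.
QUESTIONS = [
    "Did you grow up in a low-income neighborhood?",
    "Have you ever been stopped and questioned by the police?",
    "Does anyone close to you have a criminal record?",
]

def risk_band(answers: list[bool]) -> str:
    """Count the 'yes' answers and map the total to a risk band."""
    yes_count = sum(answers)
    if yes_count == 0:
        return "low risk"
    if yes_count < len(QUESTIONS):
        return "medium risk"
    return "high risk"

# Two 'yes' answers out of three lands in "medium risk", the band the
# five MEPs were placed in.
print(risk_band([True, True, False]))  # -> medium risk
```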

Politicians from the Socialists & Democrats, Renew, Greens/EFA and the Left Group were invited to test the tool. After completing the quiz, MEPs Karen Melchior, Cornelia Ernst, Tiemo Wölken, Petar Vitanov and Patrick Breyer were all identified as being at “medium risk” of committing future crimes.


The group will face no consequences for their potential crimes. In real life, however, such systems could feed them into police databases and subject them to close monitoring, random questioning, or stop and search. Their risk scores could also be shared with schools, employers, immigration agencies and child protective services. Algorithms have even led to people being jailed with scant evidence.

“I grew up in a low-income neighborhood, in a poor country in Eastern Europe, and the algorithm profiled me as a potential criminal,” said Petar Vitanov, an MEP from the Bulgarian Socialist Party, in a statement.

“There should be no place in the EU for such systems: they are unreliable, biased and unfair.”

Fair Trials released the test results amid growing calls to ban predictive policing.

The argument has proven divisive in proposals for the AI Act, which is set to become the first-ever legal framework on artificial intelligence. Some lawmakers are pushing for a blanket ban on predictive policing, while others want to give law enforcement some leeway.

Fair Trials has given advocates of these systems a new reason to reconsider their views: the technology can also target them.