No one involved was informed, and analyses (even in-house ones) point to serious flaws
Published at: 21 January 2025, 14:53 (UTC+01:00)
Last updated: 21 January 2025, 16:48 (UTC+01:00)
This info comes from an in-depth article on Follow The Money (NL, ftm.nl @ web.archive.org) by Evaline Schot and David Davidson.
The info below is translated and summarized directly from the article.
- The algorithm is called Preselect Recidive
- This algorithm has been used over the last 10 years to let a computer predict whether a minor will go on to commit more crime, and based on that prediction the consequences they face are made lighter or heavier
- It has been used by the Dutch police, the Openbaar Ministerie (Public Prosecution Service), the Raad voor de Kinderbescherming (Child Care and Protection Board), Jeugdreclassering (youth probation services) and youth prisons
- It uses factors like age, gender, past involvement in cases (even as witness or victim) and housemates' involvement in cases
- In 2018 alone, 61,000 minors were assigned a score by this algorithm
- It was trained on the data of 5,000 minors who were at some point involved with the police
- This data barely included edge cases such as 12-year-olds, girls, and people who had not had prior contact with the police
- The consequences of the algorithm are as follows:
- Minors are algorithmically assigned a score after being written up by the police
- This score influences or even determines whether they are referred to Halt (a study/work trajectory without a criminal record) or whether they get a strafblad (criminal record) and potentially face further investigation or action by the OM (Public Prosecution Service)
- That criminal record limits their ability to obtain a Verklaring Omtrent het Gedrag (certificate of conduct), which is required for certain studies and jobs
- Meanwhile, no one knows about it:
- Neither the minors nor their parents or lawyers were made aware of the algorithm, the score, or the potential consequences
- It is not registered in the Dutch government's algorithm register (NL, algoritmes.overheid.nl)
- There is no evidence that it works:
- FTM asked a technology researcher/mathematician from Bits of Freedom to analyse the design of the system. He noted that false positives are common and that small, essentially random variations in the input can make the difference between a high and a low score (an illustrative sketch of this sensitivity follows after this list)
- FTM asked four scientists from Radboud University (with expertise in computer science, algorithms or the legal system) to review the findings. They concluded it is a limited algorithm that can have risky consequences. Djoerd Hiemstra (professor of data science) compared it to a dice roll
- Frederik Zuiderveen Borgesius, professor of ICT & law, notes that only written-up/recorded crime is weighted; as such it cannot represent the entirety of crime, while still being heavily shaped by past police actions and decisions regarding specific groups of minors (see the second sketch after this list)
- An internal investigation by the ministerie van Justitie en Veiligheid (Ministry of Justice and Security) also concluded that the algorithm falls short
- Previous attempts at similar systems that never worked:
- 2022: The Algemene Rekenkamer (Netherlands Court of Audit) concluded that the Criminaliteit Anticipatie Systeem did not even meet basic requirements
- 2013-: A Politico article about the Dutch anti-fraud algorithm that "ruined thousands of lives" (EN, politico.eu)
- 2015-2023: An Erasmus University Rotterdam article reporting that the police stopped using their violence prediction algorithm after it turned out not to be useful, and was even automatically assigning high scores to people of Moroccan, Somali and Antillean origin (NL, eur.nl)
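The actual Preselect Recidive model is not public, so the sketch below is purely hypothetical: made-up weights over the kinds of factors the article mentions (age, gender, past case involvement, housemates' cases) and a made-up cutoff. It only illustrates the sensitivity the reviewers describe, where one small input difference (here, a single extra registered case in which the minor was merely a witness) pushes the score across the threshold and changes the trajectory.

```python
# Hypothetical illustration only: the real Preselect Recidive weights and
# cutoff are not public. This shows how a simple weighted risk score with a
# hard threshold can flip on a tiny input change.

# Made-up weights for the kinds of factors the FTM article mentions.
WEIGHTS = {
    "age": -0.05,                # assumed: younger -> slightly higher score
    "is_male": 0.8,
    "past_cases_any_role": 0.6,  # includes involvement as witness or victim
    "housemate_cases": 0.5,
}
CUTOFF = 1.0  # made-up threshold between "Halt" and "prosecution track"

def risk_score(minor: dict) -> float:
    """Linear weighted sum over the minor's features."""
    return sum(WEIGHTS[k] * minor[k] for k in WEIGHTS)

def decision(minor: dict) -> str:
    return "prosecution track" if risk_score(minor) >= CUTOFF else "Halt"

base = {"age": 15, "is_male": 1, "past_cases_any_role": 1, "housemate_cases": 0}
# The same minor, except for one extra registered case in which they were
# merely a witness:
extra_case = dict(base, past_cases_any_role=2)

for label, minor in [("base", base), ("one extra case as witness", extra_case)]:
    print(f"{label}: score={risk_score(minor):.2f} -> {decision(minor)}")
```

With these invented numbers the extra case moves the score from 0.65 to 1.25, across the cutoff, so the same minor lands on the prosecution track instead of Halt. That is the kind of arbitrariness the Bits of Freedom researcher and the Radboud scientists point to.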
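Zuiderveen Borgesius' point, that only registered crime counts, can also be illustrated with a small, equally hypothetical simulation: two groups offend at exactly the same rate, but one is policed more heavily, so far more of its offences get written up. Any score trained only on registered cases then "learns" that the heavily policed group is riskier, even though the underlying behaviour is identical. The rates and group labels below are invented.

```python
# Hypothetical simulation of the "only recorded crime is counted" problem.
# Both groups have the SAME true offence rate, but group B is policed more
# heavily, so more of its offences end up registered.
import random

random.seed(42)

TRUE_OFFENCE_RATE = 0.10                     # identical for both groups
REGISTRATION_RATE = {"A": 0.20, "B": 0.60}   # chance an offence is written up
N = 10_000                                   # minors per group

registered = {"A": 0, "B": 0}
for group in ("A", "B"):
    for _ in range(N):
        offended = random.random() < TRUE_OFFENCE_RATE
        if offended and random.random() < REGISTRATION_RATE[group]:
            registered[group] += 1

for group in ("A", "B"):
    print(f"group {group}: true offence rate {TRUE_OFFENCE_RATE:.0%}, "
          f"registered cases: {registered[group]} ({registered[group] / N:.1%})")
```

With these numbers group B ends up with roughly three times as many registered cases as group A despite identical behaviour, so a model trained on that registry would score group B higher, and every extra stop feeds back into the next round of scores.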
Author's note: I use the word "minors" above, but the people affected do still include 18-year-olds. That age appears to be the upper limit, however, and those affected at that age are often still in high school. That aside, the largest share of those affected are actual minors. While "youth" might work as a replacement for "minors" for some, it is not used here, both because of its association with adolescence and to avoid understating the severity of this situation.