Is Minority Report Becoming a Reality? Groundbreaking Experiment in the United Kingdom Forecasts Future Criminals.
Technology has revolutionized many aspects of our lives, including security. But what happens when we use it not only to react to crime but to predict it in advance? An ongoing initiative in the United Kingdom has raised concerns: a system designed to forecast who is most likely to commit murder. Despite sounding like something out of Minority Report, it is already a reality.
The project with a discreet name

Originally known as the Homicide Prediction Project, it was later renamed Sharing Data to Improve Risk Assessment. Despite the name change, the intention remains clear: to create a system capable of identifying individuals who, based on certain data, might commit a crime.
Documents obtained by the transparency organization Statewatch through freedom of information requests revealed that the British Ministry of Justice is developing this system with the support of the Home Office, London's Metropolitan Police, and the Greater Manchester Police. The system includes information not only about convicted criminals but also about unconvicted suspects, victims, witnesses, and even missing persons.
The project has gathered data from between 100,000 and 500,000 individuals, incorporating highly sensitive information: mental health history, addictions, self-harm, suicide attempts, disability conditions, and other “health markers” that, according to the Ministry, hold “significant predictive value.”
Historical context and potential consequences

This is not the first time the United Kingdom has experimented with predictive justice. The Offender Assessment System, currently in use, aims to predict the likelihood of reoffending and is used by judges in sentencing. However, government evaluations have shown that its predictions for non-violent crimes are often inflated and are less accurate for Black individuals.
This bias is not exclusive to the UK. Predictive justice systems around the world have exhibited similar flaws because they rely on historically biased data. The overrepresentation of marginalized communities in police records, a consequence of decades of over-policing, is baked into the algorithms, perpetuating a cycle of digital discrimination.
Beyond the promise of “preventing violence before it occurs,” the potential risk of implementing such a system is evident: judging individuals not for their actions but for what an algorithm predicts they might do.
The overlooked caution
The film Minority Report warned us of the perils of a justice system founded on prediction. Ironically, that warning now seems to be serving as a guide. Instead of dismissing the notion of penalizing thought or probability, we are heading directly toward it.
The question is no longer whether we can build these systems, but whether we should. And, most importantly, who controls them, and who will be subjected to them? The United Kingdom has taken the first step, but what lies ahead remains uncertain.
This article has been translated from Gizmodo US by Romina Fabbretti. You can find the original version here.
