U.K.'s "pre-crime" algorithm sparks ethical outcry amid privacy fears
By bellecarter // 2025-04-19
  • The U.K. Ministry of Justice is developing an algorithm to predict which individuals – some with no criminal records – might commit murder, drawing on data from more than 500,000 people, including mental health histories, age at first police contact and experiences of domestic abuse. The project, rebranded from the "Homicide Prediction Project," has not been deployed but has raised alarm over its parallels with dystopian "pre-crime" systems.
  • The initiative builds on tools like the Offender Assessment System (OASys) but adds new sensitive data categories such as self-harm records, disabilities and victim trauma experiences, prompting fears of bias and misuse of personal information.
  • Critics, including Statewatch, argue the algorithm risks amplifying racial disparities, given the U.K.’s history of biased policing and its reliance on data from racially skewed systems, causing disproportionate harm to minorities and marginalized groups.
  • The MoJ insists the project is experimental, aiming to improve public safety through better risk assessments for probationers. However, activists counter that framing it as "research" does not justify deploying biased tools that undermine due process and dignity, likening it to the sci-fi Minority Report premise of punishing potential future crimes.
  • As some U.S. jurisdictions have banned racially biased predictive policing tools, the U.K.'s project highlights tensions between crime prevention and civil liberties. The outcome may set a global precedent on whether such algorithms enforce accountability or deepen systemic discrimination.
The British government's push to implement an algorithm predicting who might commit murder mirrors dystopian sci-fi tropes while clashing with civil liberties concerns. As predictive policing gains traction globally, the project has sharpened urgent debates over technological overreach, racial bias and the ethics of policing communities that already face systemic discrimination.

The U.K. Ministry of Justice (MoJ) is developing an algorithm to forecast which individuals convicted of crimes may escalate to homicide, according to documents uncovered by the civil liberties group Statewatch. Dubbed the Homicide Prediction Project – now rebranded as "Sharing Data to Improve Risk Assessment" – the initiative aggregates data on more than 500,000 individuals, some without criminal records, to identify "future criminals." Commissioned in 2023 and completed in 2024, the system has not been deployed but has already sparked warnings from activists and legal experts.

The system builds on tools like the Offender Assessment System (OASys), used since 2001 to assess recidivism risk for probation decisions. The new algorithm, however, expands that scope with additional datasets, including mental health histories, age at first police contact and experiences of domestic abuse.
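The MoJ has not published the model's features or weights, but risk-assessment tools of the OASys lineage typically combine case-history variables into a single score. The toy sketch below is purely illustrative – every feature name and weight is a hypothetical assumption, not the MoJ's actual model – and shows only the general shape of such a scoring function:

```python
import math

# HYPOTHETICAL feature weights for illustration only -- the real system's
# inputs and coefficients have not been made public.
WEIGHTS = {
    "prior_violent_convictions": 0.8,
    "age_at_first_police_contact": -0.05,  # earlier first contact -> higher score
    "mental_health_flag": 0.3,
}
BIAS = -2.0

def risk_score(record: dict) -> float:
    """Combine features into a probability-like score in (0, 1) via a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

person = {
    "prior_violent_convictions": 2,
    "age_at_first_police_contact": 14,
    "mental_health_flag": 1,
}
print(round(risk_score(person), 3))  # prints 0.31
```

The critics' point is visible even in this sketch: whoever chooses the features and weights (here, arbitrarily) decides whose histories count against them.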

Civil liberties concerns: Biased systems, structural discrimination

Statewatch, which exposed the project via Freedom of Information Act (FOIA) requests, condemns the initiative as inherently flawed and biased. Sofia Lyall, a Statewatch researcher, warns that algorithms trained on data from an institutionally racist police system will disproportionately harm minorities and marginalized communities. "Time and again, research shows these systems are inherently flawed," Lyall said, citing studies in the U.S. and U.K. showing predictive policing often exacerbates racial disparities. "Coding bias into automated profiling of potential criminals is deeply wrong, particularly when it uses sensitive data on mental illness and addiction."

The U.K.'s legacy of biased policing looms large: Black Britons are seven times more likely to be stopped and searched than white individuals, per 2023 Home Office data. Critics argue the algorithm risks amplifying these inequities, funneling more surveillance and penalties toward communities already overrepresented in the criminal justice system.

At the center of the debate is the inclusion of personal data from people without criminal records. Statewatch claims victims of crime – including domestic abuse survivors – are being analyzed. MoJ spokespersons deny this, stating only "convicted offenders" are included. But FOIA documents reveal that shared data includes "special categories" such as mental health markers, self-harm records and disabilities. Age at first police contact as a victim of crime is also listed as a metric, prompting fears of weaponizing trauma.
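The feedback loop critics describe can be made concrete. In the toy simulation below (assumed numbers, chosen only to echo the seven-fold stop-and-search disparity above), two groups offend at an identical underlying rate, but one is searched far more often; a model trained on *recorded* offences would then "learn" that the heavily policed group is riskier:

```python
import random

random.seed(0)

TRUE_OFFEND_RATE = 0.05               # identical underlying rate for both groups
SEARCH_RATE = {"A": 0.35, "B": 0.05}  # group A is searched 7x as often (assumed)

def recorded_offence_rate(group: str, n: int = 100_000) -> float:
    """Fraction of people with a recorded offence: offended AND happened to be searched."""
    recorded = sum(
        1 for _ in range(n)
        if random.random() < TRUE_OFFEND_RATE and random.random() < SEARCH_RATE[group]
    )
    return recorded / n

rate_a = recorded_offence_rate("A")
rate_b = recorded_offence_rate("B")
# The recorded data shows group A as roughly 7x "riskier" even though
# the true offending rates were set to be identical.
print(rate_a / rate_b)
```

Any model fitted to such records inherits the disparity in enforcement, not a difference in behavior, which is the core of Statewatch's objection.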

Government defense: Public safety first

The MoJ insists the project is solely experimental and aims to improve risk assessments for probationers. "This research is about enhancing public safety," a spokesperson said, emphasizing collaboration with police forces such as Greater Manchester Police. The tool's supporters, including some criminologists, argue that predictive analytics could stop violent crimes by identifying high-risk cases early. Yet Lyall counters: "Public safety is a worthy goal, but not at the cost of due process and dignity."

The U.K.'s initiative harks back to Philip K. Dick's "The Minority Report," in which psychic "precogs" flagged future criminals. The U.S. has seen similar pushback against predictive policing: in 2020, California cities banned predictive policing tools over racial bias concerns. Yet the U.K.'s project moves forward, backed by data-sharing agreements stretching back to 2015.


Sources include:

TheNationalPulse.com
TheGuardian.com