Saturday, June 21, 2025

British Police Test AI System to Profile Individuals Using Sensitive Data From 80 Sources

By Ken Macon

Posted on June 21, 2025

British police forces have begun acquiring AI software from a US tech company that merges sensitive personal data, such as race, health, political views, religious beliefs, sexuality, and union membership, into a unified intelligence platform.

A leaked internal memo from Bedfordshire Police, obtained through a freedom of information request, reveals plans to roll out the “Nectar” system beyond its pilot stage.

Developed in partnership with Palantir Technologies, Nectar draws together approximately 80 data streams, from traffic cameras to intelligence files, into a single platform. Its stated aim is to generate in-depth profiles of suspects and to support investigations involving victims, witnesses, and vulnerable groups, including minors.

The 34-page briefing shows that police leadership hopes to extend the software’s deployment from Bedfordshire and the Eastern Region Serious Organised Crime Unit to a national scale, Liberty reported. It asserts the system could enhance crime prevention efforts and protect at-risk individuals more effectively.

[Image: Official Data Protection Impact Assessment (DPIA) for the Palantir Foundry Platform (Nectar) at Bedfordshire Police. The document describes the project’s goal of supporting multiple police units, and eventually applying the system nationally, to protect vulnerable people by preventing, detecting, and investigating crime. It lists the special category data used: race, political opinions, religion, genetic data, sexual orientation, philosophical beliefs, ethnic origin, sex life, trade union membership, biometric data, and health. Data subjects include persons suspected or convicted of criminal offences, victims, witnesses, children or vulnerable individuals, and employees.]

This move forms part of a broader governmental initiative to apply artificial intelligence across public services, including health and defense, often via private sector partnerships such as this.

However, the deployment of Nectar, which accesses eleven “special category” data types, has raised alarms among privacy advocates and some lawmakers. These categories include race, sexual orientation, political opinions, and trade union membership.

While Palantir and Bedfordshire Police emphasize that Nectar only uses information already held in existing law enforcement databases and remains inaccessible to non-police personnel, concerns are mounting. There are worries about potential misuse, such as data retention without proper deletion processes, and the risk that innocent individuals could be flagged by algorithms designed to identify criminal networks.

[Image: Checklist from the proposal showing which special category data will be used. Selected: race, ethnic origin, political opinions, sex life, religion, trade union membership, genetic data, biometric data, sexual orientation, and health. Not selected: philosophical beliefs and “None”.]

Former Shadow Home Secretary David Davis voiced alarm to the i newspaper, calling for parliamentary scrutiny and warning that “zero oversight” might lead to the police “appropriating the powers they want.”

Liberty and other campaigners have also questioned whether Nectar effectively constitutes a mass surveillance tool, capable of assembling detailed “360-degree” profiles on individuals.

In response, a Bedfordshire Police spokesperson stated the initiative is an “explorative exercise” focused on lawfully sourced, securely handled data.

They argue the system accelerates case processing and supports interventions in abuse or exploitation, especially among children. Palantir added that within the first eight days of deployment, Nectar helped identify over 120 young people potentially at risk and facilitated the application of Clare’s Law notifications.

Palantir, which built Nectar using its Foundry data platform, insists its software does not introduce predictive policing or racial profiling and does not add data beyond what police already collect. The firm maintains that its role is confined to data organization, not decision-making.

Still, experts express deep unease.

Although national rollout has not yet been authorized, the Home Office confirms that results from the pilot will inform future decisions. With private-sector AI tools embedded more deeply into policing, questions about oversight, transparency, data deletion, and individual rights loom ever larger.

Compiled by http://violetflame.biz.ly from: 