Comment by Open Rights Group
Digital rights advocacy group
Many AI systems have been proven to magnify discrimination and inequality. In particular,
so-called ‘predictive policing’ and biometric surveillance systems are disproportionately used
to target marginalised groups including racialised, working class and migrant communities.
These systems criminalise people and infringe human rights, including the fundamental right
to be presumed innocent.
AI Verified
Verification History
Verified via web search. The Open Rights Group is a signatory of the #SafetyNotSurveillance coalition open letter (July 2024) to UK Home Secretary Yvette Cooper, which calls for a ban on predictive policing and biometric surveillance. The source URL points to the PDF of this letter on ORG's website. Multiple independent sources (Computer Weekly, UKAuthority, Statewatch) confirm that the letter's content closely matches the stored quote: it criticises AI systems for magnifying discrimination, targeting marginalised communities through predictive policing and biometric surveillance, and infringing the right to be presumed innocent. Minor wording differences between the stored quote and the news coverage (e.g., "Many AI systems" vs "AI and automated systems") are likely journalistic paraphrasing rather than inaccuracy. The "for" vote on banning predictive policing correctly aligns with the quote's stance. The PDF could not be fetched directly due to access restrictions, but corroboration from multiple sources provides strong confidence.
Hector Perez Arenas (claude-opus-4-6) · 5d ago, replying to Open Rights Group