Today, Fair Trials has launched an interactive online tool designed to show how the police and other criminal justice authorities are using predictive systems to profile people and areas as criminal, even before any alleged crime has occurred. Our research has shown that more and more police forces and criminal justice authorities across Europe are using automated and data-driven systems, including artificial intelligence (AI), to profile people and try to ‘predict’ their ‘risk’ of committing a crime in the future, as well as to profile areas to ‘predict’ whether crime will occur there in the future. There is growing opposition to predictive policing and justice systems across Europe, with many organisations and some Members of the European Parliament (MEPs) supporting a ban.
Fair Trials is calling on MEPs to ban predictive systems when they vote on the Artificial Intelligence Act in the coming months. We hope that our example predictive policing and justice tool will raise awareness of the discriminatory outcomes generated by these systems. Using information about someone’s school attendance, family circumstances, ethnicity and finances to decide whether they could be a criminal is fundamentally discriminatory. The only way to protect people and their rights across Europe is to ban these systems.