I can’t profess to have researched every particular high-risk use in detail, but many of them should be banned. Predictive policing is a really good example. Many people believe that if you de-bias predictive policing systems, they will no longer profile and lead to the over-policing of racialized and poor communities. I disagree. Because such systems are steeped in a broader context of racial and class inequality, no technical tweak or slightly improved dataset will stop discriminatory results from following from their use. That leads me to believe they should be banned. This is one of the areas where the bias debate can be a little bit obscuring.

(Sarah Chander, 2021)