Daniel Castro, Director, Center for Data Innovation
The third provision requires organizations to have third parties conduct annual audits of their algorithmic decision-making, including reviews for disparate-impact risks. Organizations must also create and retain, for at least five years, an audit trail documenting each type of algorithmic decision-making process, the data used in that process, the data used to train the algorithm, and any test results from evaluating the algorithm, along with the methodology used to test it.
Organizations must also provide a detailed report of this information to the DC Attorney General's office. This provision places an enormous and burdensome auditing responsibility not only on organizations using algorithms for decision-making, but also on service providers that offer such functionality to others. Many of the auditing requirements would be inappropriate for service providers to report, since they will not necessarily have details about how a particular customer uses their service. Moreover, many businesses and service providers are already struggling to comply with the algorithm auditing requirements in New York City, which apply only to AI systems used in hiring. The audit requirements in the proposed Act would apply to a much broader set of activities and present even greater challenges.