Consumer Financial Protection Bureau (CFPB)
Should AI systems above a certain capability threshold be required to have interpretable decision-making processes?
Consumer Financial Protection Bureau (CFPB) agrees and says:
Today, the Consumer Financial Protection Bureau (CFPB) confirmed that federal anti-discrimination law requires companies to explain to applicants the specific reasons for denying an application for credit or taking other adverse actions, even if the creditor is relying on credit models using complex algorithms. The CFPB published a Consumer Financial Protection Circular to remind the public, including those responsible for enforcing federal consumer financial protection law, of creditors' adverse action notice requirements under the Equal Credit Opportunity Act (ECOA).

"Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions," said CFPB Director Rohit Chopra. "The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn't understand."

ECOA protects individuals and businesses against discrimination when seeking, applying for, and using credit. To help ensure a creditor does not discriminate, ECOA requires that a creditor provide a notice when it takes an adverse action against an applicant, which must contain the specific and accurate reasons for that adverse action. Creditors cannot lawfully use technologies in their decision-making processes if using them means that they are unable to provide these required explanations. (2022)