CFPB issues guidance on adverse credit decisions made by artificial intelligence

Bureau reminds lenders of legal requirement to give specific reasons for credit denials

Acknowledging the larger role that artificial intelligence (AI) is playing within modern underwriting processes, the Consumer Financial Protection Bureau (CFPB) is reminding lenders of the legal need to give “accurate and specific reasons” when denying applicants credit.

This includes mortgage credit and instances when lenders use advanced algorithms and personal consumer data to evaluate an applicant’s creditworthiness. Specifically, per the CFPB, the Equal Credit Opportunity Act requires creditors to explain in detail their reasons for adverse decisions, including credit denials, even for adverse decisions made by AI.

“Technology marketed as artificial intelligence is expanding the data used for lending decisions and also growing the list of potential reasons for why credit is denied,” CFPB director Rohit Chopra said. “Creditors must be able to specifically explain their reasons for denial. There is no special exemption for artificial intelligence.”

Creditors often feed large datasets into complex algorithms marketed as AI. Some of this data may be harvested through consumer surveillance, or may not strike the borrower, at first pass, as particularly relevant to their finances, the CFPB noted. As a result, the list of reasons behind a lender’s adverse credit actions may be expanding — yet some creditors are inappropriately and illegally relying on the checklists provided in CFPB sample forms when supplying reasons for credit denials. These sample checklists, while helpful, should not be treated as an exhaustive list of reasons for adverse actions, nor used as check-the-box, one-size-fits-all documents, the bureau stated.

“Creditors that simply select the closest factors from the checklist of sample reasons are not in compliance with the law if those reasons do not sufficiently reflect the actual reason for the action taken,” according to a statement released by the CFPB. “Creditors must disclose the specific reasons, even if consumers may be surprised, upset or angered to learn their credit applications were being graded on data that may not intuitively relate to their finances.”

The guidance is the newest move by the CFPB to address the growing influence of technology on fair lending. Late last year, the bureau reminded landlords of the issues regarding reliance on digital algorithmic scoring of prospective tenants. More recently, the bureau has been pursuing input regarding proposed standards for automated valuation models, out of concern that unregulated black-box models may lead to digital redlining.
