A group of fintech companies and the National Community Reinvestment Coalition (NCRC) are calling on the federal government to use artificial intelligence (AI) to help detect and eliminate discrimination in lending.
The NCRC and financial technology firms including Zest AI, Upstart, Stratyfy and FairPlay signed a letter urging the Federal Housing Finance Agency (FHFA) and the Consumer Financial Protection Bureau (CFPB) to issue clear guidance for lenders on how new AI fair lending tools could improve the evaluation process and eliminate disparities.
In the letter, the coalition commended the CFPB and the FHFA for their recent approval of a new rule addressing the use of AI in automated valuation models. They also commended the FHFA for becoming, in 2022, the first federal agency to publish AI-specific guidance for Fannie Mae and Freddie Mac.
However, the group warns that AI is advancing faster than the government can develop regulations to oversee its use, a gap that could allow machines to introduce new forms of discrimination, inequity and bias into areas such as mortgage lending and business loans. The signatories argue that regulators can help by acting quickly to make clear that lenders may use AI tools to help comply with fair lending laws.
In the letter, the group wrote that they are focusing on machine learning and deep learning categories of AI, rather than generative AI, “which does not have known debiasing applications in fair lending at this time.”
“AI and other improvements in data technology can enable fair lending testing to be more efficient and effective than in the past,” the coalition wrote to FHFA and CFPB. “In particular, these tools can be used to build more fair and inclusive credit models in addition to conducting robust and efficient searches for less discriminatory alternatives as required under the Equal Credit Opportunity Act and Fair Housing Act.”
The coalition claims that under the current approach to analyzing consumer qualifications, about 20% of American consumers, some 45 million people, either are "not visible" to legacy scoring methods or cannot be scored due to insufficient information, leaving them ineligible for credit.
They also claim that the traditional scoring system can lock people into low scores and block them from qualifying for credit. The problem most often affects low-income young people living in areas with high concentrations of minorities, renters and foreign-born residents, many of whom may be more creditworthy than their scores suggest.
According to the letter, AI and machine learning technologies make it possible to score previously excluded consumers, and in some cases these systems are already expanding access and inclusion.