Regulation
09/05/2025

AI, Executive Order Raise New Questions for Fair Lending Compliance

Regulators have removed disparate impact from their examination manuals, but banks should think long term about their compliance approach.

Laura Alix
Director of Research

Federal banking regulators recently deprioritized disparate impact in their examinations for fair lending compliance, but disregarding it altogether could be a long-term mistake for banks.

The Office of the Comptroller of the Currency announced in July that it would no longer consider disparate impact when examining banks for compliance with fair lending laws such as the Fair Housing Act and the Equal Credit Opportunity Act. In August, the Federal Deposit Insurance Corp. announced it would only look at evidence of disparate treatment when examining claims of discrimination in credit and housing. Both agencies removed references to disparate impact from their examination manuals. The Federal Reserve has not released a similar statement.

Unlike overt discrimination, where a lender explicitly offers or withholds a loan based on a borrower’s race, gender or other protected status, disparate impact can be tougher to prove. Disparate impact refers to a negative outcome for a protected group of people resulting from a policy or process that is neutral on its face and lacks a legitimate business purpose. Credit scores, as an example, may have a disparate impact on certain protected groups, but they are also highly predictive for underwriting purposes and are therefore considered justified, says Leslie Sowers, a partner with the law firm Husch Blackwell.
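
Compliance teams often quantify this kind of disparity with an adverse impact ratio, sometimes judged against the informal “four-fifths” benchmark borrowed from employment law. The sketch below is a minimal illustration of that arithmetic with invented scores, group labels and cutoff; it is not drawn from any regulator’s examination procedures.

```python
# Hypothetical illustration only: a facially neutral minimum-score cutoff can
# still produce very different approval rates across groups. All numbers are invented.
applications = {
    "group_a": [720, 680, 655, 700, 640, 690, 710, 675],
    "group_b": [640, 610, 700, 620, 590, 660, 630, 605],
}
CUTOFF = 660  # neutral policy: approve any applicant scoring 660 or above

approval_rates = {
    group: sum(score >= CUTOFF for score in scores) / len(scores)
    for group, scores in applications.items()
}

# Adverse impact ratio: each group's approval rate divided by the most-favored
# group's rate. Ratios well below roughly 0.8 often prompt a closer look.
best = max(approval_rates.values())
for group, rate in approval_rates.items():
    print(f"{group}: approval rate {rate:.0%}, ratio vs. best group {rate / best:.2f}")
```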

The changes from the FDIC and OCC follow an April executive order from President Trump purporting to do away with disparate impact liability across the federal government. This could mean some short-term reprieve on regulatory examinations, but in the long term, banks will still need to contend with state regulators, as well as the potential for reputational harm and retrospective reviews in future regulatory examinations.

“When it comes to fair lending, [banks are] realizing that this administration may not come after us, but there are states that will — and there will be another administration one day,” says Richard Andreano Jr., a partner who leads the mortgage banking group at the law firm Ballard Spahr. By and large, banks are staying the course and sticking to their fair lending and credit underwriting policies, he adds.

In recent years, lenders have applied machine learning technologies both in underwriting and in analyzing and auditing their lending portfolios for disparate impact liability. Under the Biden administration, that created new vulnerabilities to claims of disparate impact, as banks had to show that underwriting models using machine learning were not repeating societal biases, Sowers says. “Across the board, we saw examiners questioning whether [banks] were affirmatively testing in a variety of areas to see what the outcomes actually were of any AI underwriting model or credit determination risk models they were using in their systems,” she says. “If they were not doing so, they were highly recommending doing that and suggesting that any time an outcome was different there needed to be a justification for it.”

If testing demonstrated a disparate impact on a protected group, the lender would try to determine why it was happening and whether the variable driving it had a legitimate business reason for being part of the underwriting process, or whether it could be mitigated in some way. The point of testing isn’t to reverse engineer equal outcomes, however.
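
As a rough sketch of what that kind of outcome testing can look like in practice, the example below scores synthetic applications, compares approval rates across two groups, and then checks whether a flagged input still earns its place by refitting the model without it. The data, column names, logistic model and thresholds are all assumptions made for illustration; they do not describe any particular bank’s or examiner’s methodology.

```python
# Illustrative sketch only: synthetic data, hypothetical feature names and thresholds.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "credit_score": rng.normal(680, 50, n),
    "debt_to_income": rng.uniform(0.1, 0.6, n),
    "group": rng.choice(["group_a", "group_b"], n),  # protected-class label, used only for testing
})
# A hypothetical flagged input that, in this synthetic data, correlates with group membership.
df["flagged_input"] = np.where(df["group"] == "group_b",
                               rng.uniform(0.05, 0.25, n),
                               rng.uniform(0.00, 0.15, n))
# Synthetic repayment outcome loosely tied to the features above.
logit = 0.01 * (df["credit_score"] - 680) - 4 * df["debt_to_income"] - 6 * df["flagged_input"] + 2
df["repaid"] = rng.random(n) < 1 / (1 + np.exp(-logit))

train, test = train_test_split(df, test_size=0.3, random_state=0)

def fit_and_evaluate(feature_cols):
    """Fit a simple model, then report predictive power and approval-rate parity by group."""
    model = LogisticRegression(max_iter=1000).fit(train[feature_cols], train["repaid"])
    probs = model.predict_proba(test[feature_cols])[:, 1]
    approvals = pd.Series(probs >= 0.5, index=test.index)  # hypothetical approval threshold
    rates = approvals.groupby(test["group"]).mean()
    return roc_auc_score(test["repaid"], probs), rates.min() / rates.max()

for label, cols in [("with flagged input", ["credit_score", "debt_to_income", "flagged_input"]),
                    ("without flagged input", ["credit_score", "debt_to_income"])]:
    auc, ratio = fit_and_evaluate(cols)
    print(f"{label}: AUC {auc:.3f}, approval-rate ratio across groups {ratio:.2f}")
# If dropping the flagged input barely dents predictive power but narrows the gap,
# that undercuts the claim that it serves a legitimate business need.
```

Real-world reviews layer far more sophisticated techniques on top of a comparison like this, but the underlying question is the one Sowers describes: does the disparity have a defensible business explanation, or can it be mitigated without giving up predictive power?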

Some banks may choose to scale back the rigor of their monitoring and testing, but most will likely continue those practices, says Mark Wuchte, a principal who leads the financial services risk advisory team at Baker Tilly. “There are plenty of states that are not going to go anywhere with this,” he says. “There are still other agencies that are very much in tune with this and will continue to focus on pushing these efforts down to their banks.”

In July, for example, student loan company Earnest Operations paid $2.5 million to settle claims of disparate impact and other fair lending violations brought by the Massachusetts Attorney General, which argued that Earnest did not sufficiently mitigate the disparate impact of the artificial intelligence models it used in lending decisions. The state zeroed in on Earnest’s use of the average rate of loan defaults associated with a borrower’s school, which the attorney general said resulted in a disparate impact on Black and Hispanic students. Earnest denied the allegations and any violations of state or federal laws. “A reputable third party reviewed our underwriting and found no evidence of these allegations,” a spokeswoman told Bank Director.

New York, California and Illinois are among the states that tend to be more assertive in enforcing fair lending laws, Sowers says.

Banks that let up too much on testing and monitoring for disparate impact could also expose themselves to private litigation or reputational damage, Wuchte says. A 2015 Supreme Court ruling affirmed that disparate impact is a recognized legal basis for bringing claims under the Fair Housing Act, although it also said the plaintiff has to clearly identify a specific policy or process as the cause. No matter the result, a lawsuit also opens the bank up to reputational harm. “Regardless of what the OCC guidance is, perception matters, and that will still carry a lot of weight,” says Wuchte. “Banks that stop testing expose themselves to reputation risk.”

In the longer term, the industry would benefit from guidance or standards governing the use of machine learning models in credit underwriting, Andreano says. “When we move into a different administration that is interested in fair lending, industry advocates would do well to work together on this and come up with some reasonable guidelines that banks can follow,” he says. “The people writing the ones and zeros need guidance in this area.”

Bank examiners will also be able to look back at this period further down the road, after a new presidential administration has taken charge and regulatory priorities have shifted again. Sowers says, “There’s a lot of other potential liability for not doing this type of monitoring and controls.”

WRITTEN BY

Laura Alix

Director of Research

Laura Alix is the Director of Research at Bank Director, where she collaborates on strategic research for bank directors and senior executives, including Bank Director’s annual surveys. She also writes for BankDirector.com and edits online video content. Laura is particularly interested in workforce management and retention strategies, environmental, social and governance issues, and fraud. She has previously covered national and regional banks for American Banker and community banks and credit unions for Banker & Tradesman. Based in Boston, she has a bachelor’s degree from the University of Connecticut and a master’s degree from CUNY Brooklyn College.