Zest Comments On Federal Guidance For Managing Third-Party Risk In Lending
In July, the Federal Reserve Board, FDIC, and OCC invited comment on proposed interagency guidance on managing risks associated with third-party relationships. As one of those third parties to the banking system, Zest AI submitted our response today, and it's copied below. The tl;dr? Regulators could do a lot more to drive fairer lending outcomes in America by holding third-party score providers to the same fair lending standards and analysis that every first-party underwriting model has to go through.
Responsible lenders already perform disparate impact testing on the models they build themselves. Somehow, third-party scores have escaped such scrutiny. Unfortunately, lenders often lack sufficient transparency into third-party scores to perform similarly rigorous analysis or to ensure that the provider has conducted such analysis itself.
"Lenders often lack sufficient transparency into third-party scores to perform similarly rigorous analysis or to ensure that the provider has conducted such analysis itself."
In their proposed guidance, the banking regulators have an opportunity to remind banks and score providers themselves of their obligation to ensure that third-party score providers are searching for and adopting less discriminatory alternative models. Our work has shown that, frequently, such alternative models exist and, if adopted, will drive fairer lending decisions for people of color and other historically marginalized groups.
Office of the Comptroller of the Currency
Board of Governors of the Federal Reserve System
Federal Deposit Insurance Corporation
Response to Request for Comment on Proposed Interagency Guidance on Third-Party Relationships: Risk Management
Zest AI appreciates the opportunity to respond to this Request for Comment regarding proposed interagency guidance on managing risk associated with third-party relationships (the “Request”). We endorse and support the comprehensive response to this Request from the Fintech Trade Association, of which we are a founding member. The following comments and policy proposals are specific to Zest AI and speak most directly to the Request’s questions about third-party relationships focused on consumer underwriting.
Zest AI’s Mission Is to Make Lending Fair and Transparent for Everyone
Zest AI is a software company on a mission to make fair and transparent credit available to everyone. We do that by helping banks, credit unions, fintechs, and other financial institutions deploy powerful AI-driven credit models swiftly and easily. We believe that AI and machine learning (ML), when used properly, offer the United States a once-in-a-generation opportunity to reduce racial disparities in consumer financial services while at the same time improving the safety and soundness of American financial institutions.
AI/ML Have a Role to Play in Reducing Economic Disparity
Despite decades of focused effort by banks, legislators, and regulators, America’s consumer lending industry still struggles to develop predictive models that minimize disparate impact. Why? Because of flaws in the data and in how the industry uses it. Tens of millions of Americans lack sufficient credit history to compute a traditional credit score. Millions more have artificially depressed scores due to the way credit is scored today.
"Why must lenders perform fair lending analysis on models they use to underwrite loans while third-party score providers can mask what they do from the world?"
Generations of systemic racism and bias are baked into the data and traditional credit scores. And legacy players have been unable to remove it. As a result, only a fifth of Black households have credit scores over 700, compared to half of all white households. This leads lenders to deny mortgages for Black applicants at a rate 80% higher than that of white applicants. The share of Black households with a mortgage would increase nearly 11 percentage points if their credit score distribution were the same as the distribution for white households.
The responsible adoption by banks of explainable ML-powered underwriting models for consumer loans and mortgages offers a practical way to break the cycle of credit discrimination in America. (We explored this potential in greater detail in our December 2020 comment letter to the CFPB’s Request for Information on ECOA and Reg B.) We are thrilled to have demonstrated that AI and ML can solve these problems in the real world.
For instance, last year we entered a partnership with Freddie Mac to make the dream of homeownership a reality for tens of thousands of minority borrowers in the coming years. Consumer lenders of all sizes are switching to AI/ML lending, from a $500 million-asset credit union in Nederland, Texas, to credit-card giant Discover. They’re seeing sharp gains in fairness and accuracy and improved compliance. One Zest client saw approval rates for women jump by 20% after adopting a Zest AI-powered model. Another generated a model that shrank the approval rate gap between Black and white borrowers by 50%.
Legacy Credit Score Providers Have Been Immune From Basic Diligence Requirements
Yet most financial institutions still approve or deny loans using general industry scores provided by third parties. These scores have significant blind spots and are known to reflect and perpetuate racial disparities. Third-party diligence should include requirements to perform thorough and transparent fair lending analysis, even on third-party scores. For too long, these legacy score providers have been immune from scrutiny and exempt from basic requirements that apply to everyone else.
Why must lenders perform fair lending analysis on models they use to underwrite loans while third-party score providers can mask what they do from the world? Indeed, how is it that such fair lending analysis can be deemed complete if lenders do not conduct or are not provided with detailed fair lending analytics on the third-party scores on which they base their decisions?
Whether through reference in the proposed guidance, more detailed discussion in the FAQs, or related guidance, such as the recent FDIC and Federal Reserve papers on community bank-fintech partnerships, regulators can and should promote partnerships focused on consumer lending compliance, including transparent and rigorous fair lending analyses.
Recommendations on the Use of Guidance To Advance Regulatory Objectives
Lenders should be required to perform diligence on the fair lending compliance of the third-party scores they rely on to make lending decisions, particularly whether those scores are arrived at using the least discriminatory means of achieving the business objective of accurate risk prediction.
One of the most effective ways to close existing wealth and homeownership disparities is to ensure lenders are not using discriminatory credit models to make decisions. Under “disparate impact” discrimination analysis, an ECOA violation may exist if a creditor uses a model that unnecessarily causes discriminatory effects. Thus, if a model causes disparities and a “less discriminatory alternative” model exists, the lender must adopt the alternative model.
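In practice, disparate impact testing of this kind usually starts with a comparison of outcome rates across protected groups. As an illustrative sketch only (this is not Zest AI's or any regulator's prescribed methodology), the commonly used adverse impact ratio compares one group's approval rate to a reference group's, with the "four-fifths rule" threshold of 0.8 often serving as a screening heuristic:

```python
# Minimal sketch of an adverse impact ratio (AIR) check.
# Illustrative only: the group data is hypothetical, and the 0.8
# threshold is the conventional "four-fifths rule" heuristic, not
# a legal bright line.

def approval_rate(decisions):
    """Fraction of applications approved (True = approved)."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(group_decisions, reference_decisions):
    """Ratio of a group's approval rate to the reference group's."""
    return approval_rate(group_decisions) / approval_rate(reference_decisions)

# Hypothetical model decisions for two applicant groups
reference = [True] * 80 + [False] * 20   # 80% approval rate
protected = [True] * 56 + [False] * 44   # 56% approval rate

air = adverse_impact_ratio(protected, reference)
print(f"AIR = {air:.2f}")                # 0.56 / 0.80 = 0.70
if air < 0.8:
    print("Potential disparate impact: search for a less "
          "discriminatory alternative model.")
```

A failing screen like this is where the "less discriminatory alternative" search begins: if another model predicts risk comparably well with a smaller approval-rate gap, the disparate impact framework points toward adopting it.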
Somehow, third-party scores have escaped this scrutiny. Responsible lenders already perform disparate impact testing on the models they build themselves. Unfortunately, they often lack sufficient transparency into third-party scores to perform similarly rigorous analysis or to ensure that the provider has conducted such analysis itself.
In this proposed guidance, the banking regulators have an opportunity to remind banks and score providers themselves of their obligation to ensure that third-party score providers are searching for and adopting less discriminatory alternative models. Our work has shown that, frequently, such alternative models exist and, if adopted, will drive fairer lending decisions for people of color and other historically marginalized groups.
Calls for safety, risk mitigation, and equity in the financial system are stronger than ever. The status quo is not going to produce the changes needed for a fairer economy. Fortunately, the advent of AI/ML modeling and the potential of related bank partnerships have given us the tools we need to drive a safer and fairer financial system. Whether through specific inclusion in the proposed guidance, its FAQs, or related publications, the banking regulators have an opportunity to encourage more of these types of partnerships to the benefit of consumers, compliance, and fairness.
Theodore R. Flo, General Counsel & Head of Gov. Relations
Jay Budzik, Chief Technology Officer