
Why transparent AI is now more important than ever

Zest AI team
April 3, 2020

There’s no question that in the next six months to a year, the world is going to look radically different — especially for the financial industry.

Lenders, in particular, are going to have to re-evaluate the metrics they use for granting loans, and as they make those changes, transparency is going to be essential. Consumers, shareholders, and regulators are going to demand it.

Why was one person given a loan while another was denied? Why did one borrower get a payment freeze while another didn’t? In an economy of skyrocketing unemployment, how are lenders deciding who is a good risk? What measures are lenders taking to limit their exposure to non-performing loans?

For banks that have adopted artificial intelligence (AI) to improve their lending practices, transparency means being able to determine how and why an algorithm arrived at its decision. And because the mathematics behind AI models is so complex, that's not straightforward: even the best mathematicians would take weeks to unwind some AI results with a pencil and calculator.

That’s why it’s crucial that algorithmic models not be black boxes churning out results. They need to have transparency built in from the very start. 

Algorithms, like humans, are susceptible to bias. In order to scrub the algorithms for systematic bias, lenders need to be able to look inside models and see how they are making connections between thousands of different variables. Without transparency, it’s impossible to use AI in credit underwriting in a way that’s responsible both to business goals and fairness requirements. 
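
To make that concrete, here is a minimal sketch of what "looking inside" a model can mean in practice: fit a gradient-boosted credit model on synthetic data, compute per-applicant feature attributions with a SHAP-style tree explainer, and compare approval rates across groups using the four-fifths (adverse impact ratio) rule. The data, features, score cutoff, and libraries are illustrative assumptions, not a description of Zest AI's actual tooling.

```python
# Illustrative sketch: inspect a gradient-boosted credit model's drivers and
# check approval rates across groups. Libraries, features, and thresholds are
# assumptions for demonstration, not Zest AI's actual stack.
import numpy as np
import pandas as pd
import shap
from xgboost import XGBClassifier

# Synthetic applicant data with a protected-class flag that is used only for
# fairness analysis, never as a model input.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "income": rng.lognormal(10.5, 0.4, n),
    "dti": rng.uniform(0.05, 0.6, n),
    "utilization": rng.uniform(0.0, 1.0, n),
    "months_on_file": rng.integers(6, 360, n),
    "recent_inquiries": rng.poisson(1.5, n),
    "protected_group": rng.integers(0, 2, n),
})
logit = (0.00002 * df["income"] - 3.0 * df["dti"]
         - 1.5 * df["utilization"] + 0.003 * df["months_on_file"] - 1.0)
df["repaid"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

features = ["income", "dti", "utilization", "months_on_file", "recent_inquiries"]
X, y = df[features], df["repaid"]
model = XGBClassifier(n_estimators=200, max_depth=4).fit(X, y)

# Per-applicant, per-feature attributions: how each variable pushed each score.
attributions = shap.TreeExplainer(model).shap_values(X)
mean_abs = pd.Series(np.abs(attributions).mean(axis=0), index=features)
print(mean_abs.sort_values(ascending=False))  # which variables drive decisions overall

# Disparate-impact check: adverse impact ratio under the four-fifths rule.
approved = model.predict_proba(X)[:, 1] > 0.5  # illustrative score cutoff
rates = pd.Series(approved).groupby(df["protected_group"].values).mean()
print("Adverse impact ratio:", round(rates.min() / rates.max(), 2))  # < 0.8 warrants review
```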

Until recently, the credit underwriting market didn't have the tools to open up AI's black box. One danger is that some systems claiming to solve the black-box problem rely on explanation techniques that work fine for traditionally constructed models but produce an unacceptably high share of bad explanations when applied to machine learning. Those false positives from transparency tools not built for AI and ML mean lenders take on more portfolio and regulatory risk than they realize.
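
As a small illustration of how an explanation technique built for traditional models can go wrong on machine learning, the sketch below fits a tree ensemble to an outcome driven entirely by an interaction between two variables, then "explains" it with a global linear surrogate; the surrogate confidently assigns near-zero weight to both drivers. The data and models are hypothetical and purely for demonstration.

```python
# Illustrative sketch of a "bad explanation": a global linear surrogate, a
# technique suited to traditionally constructed models, assigns near-zero
# importance to features an ML model actually depends on.
import numpy as np
from sklearn.linear_model import LinearRegression
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(5000, 2))
y = X[:, 0] * X[:, 1]  # outcome driven entirely by an interaction

model = XGBRegressor(n_estimators=300, max_depth=4).fit(X, y)
print("model R^2:", round(model.score(X, y), 3))  # the ML model captures the signal

# The linear "explanation" of the model's own predictions sees almost nothing.
surrogate = LinearRegression().fit(X, model.predict(X))
print("surrogate coefficients:", surrogate.coef_)  # both near zero
```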

At Zest AI, we’ve spent years refining our AI systems to include explainability tools at the core of every model. The results are solid: we’ve helped lenders expand access to credit for underserved populations, with an average 15 percent increase in approval rates and no additional risk. The ability to understand a model's reasoning and economic value allows lenders to make credit decisions with confidence while ensuring compliance with regulations around disparate impact and adverse action.
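
For a sense of how attributions can feed a compliance workflow like adverse action notices, here is an illustrative sketch that ranks a declined applicant's most negative score drivers and maps them to human-readable reasons. The feature names, attribution values, and reason wording are hypothetical and not Zest AI's methodology.

```python
# Illustrative sketch: turn per-applicant attributions into adverse action
# reasons (the top factors pushing a declined applicant's score down).
import numpy as np

# Assumed inputs: feature names and one declined applicant's attribution vector
# (e.g., SHAP values), where negative values lowered the credit score.
features = ["income", "dti", "utilization", "months_on_file", "recent_inquiries"]
attributions = np.array([-0.10, -0.85, -0.40, 0.15, -0.05])

# Human-readable reason statements keyed by feature (hypothetical wording).
reasons = {
    "income": "Income is insufficient relative to obligations",
    "dti": "Debt-to-income ratio is too high",
    "utilization": "Revolving credit utilization is too high",
    "months_on_file": "Length of credit history is too short",
    "recent_inquiries": "Too many recent credit inquiries",
}

# Rank features by how strongly they pushed the score toward a denial and
# report the top few as adverse action reasons.
order = np.argsort(attributions)  # most negative first
top_reasons = [reasons[features[i]] for i in order if attributions[i] < 0][:4]
for reason in top_reasons:
    print("-", reason)
```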

Without transparent AI in lending models, millions of deserving people could find it nearly impossible to get affordable credit to buy a home, finance a car, or take out a student loan. Regulators will notice when lenders are using responsible, transparent AI, and the market will, too.

