FOUNDERS & FRIENDS PODCAST

With Scott Orn

A Startup Podcast by Kruze Consulting

Scott Orn, CFA

Kareem Saleh, co-founder of FairPlay AI, discusses the origin of the company and how it works to address systemic bias in lending algorithms.

Posted on: 02/15/2022

Kareem Saleh

Founder & CEO - FairPlay AI


Kareem Saleh of FairPlay AI - Podcast Summary

Kareem Saleh of FairPlay AI talks about how the company is working with lending institutions to improve and advance their lending practices to reduce systemic bias.

Kareem Saleh of FairPlay AI - Podcast Transcript

Scott: Hey, it’s Scott Orn, Kruze Consulting. And thanks for joining us on Founders and Friends for another awesome podcast. Let’s give a quick shout out to the Kruze Consulting accounting team. We’re very fortunate. We have a ton of people at Kruze who work on the monthly books for our clients, and get them all set up, due diligence ready, rocking every month, answering all the clients’ questions, making all those adjustments. And there’s no better moment for a founder, and for us really, than when a founder says, “Hey, I think I’m going to get a term sheet. Are my books ready for diligence?” And we get to say, “Yes, they are. Fire away. Send them over. Give them access.” That is a great feeling. It’s the feeling that lets us know we’ve done our job well. And nothing is better than watching that cash hit the bank account. So, if you are a venture-backed startup and you’re going out to fundraise, maybe check us out. Check us out at kruzeconsulting.com. We love what we do. As of taping, I think we have 575 clients, and our clients have raised over $1 billion this year, so we know what we’re doing, and hopefully we can help you be successful in your fundraise. All right, let’s get to the podcast. Thanks.
Singer: (Singing). It’s Kruze Consulting. Founders and Friends with your host, Scotty Orn.
Scott: Welcome to Founders and Friends podcast with Scott Orn at Kruze Consulting. And today, my very special guest is Kareem Saleh of FairPlay AI. Welcome Kareem.
Kareem: Thanks, Scott. Great to be here.
Scott: We are longtime friends or-
Kareem: Friends and collaborators, yeah.
Scott: Collaborators from the ZestFinance days. And, you have built a super cool company. We also have the pleasure of your wonderful partner in life, Melissa, working with Kruze for a while. So, I feel like we are living parallel lives here. And maybe you can start off the podcast just by telling everyone how you had the idea for FairPlay, and how you got going.
Kareem: Yeah. So, in the winter of 1977, my parents arrived in Chicago. They were immigrants, fresh off the boat, from North Africa. And, like so many immigrants, they dreamt of starting a new life in America, and wanted to start a small business. They had graduate degrees, they spoke fluent English, and yet no bank would lend to them. So, they really only had one choice, which was to go to work and try to save up the money. My mom ended up taking a job at a chemical processing plant, and it turned out that the plant lacked safety practices. And, one day, there was an accident. And I will never forget, being four years old, and seeing my mom crying in a hospital bed. I just wanted to cheer her up, and so, in my youthful exuberance, I said, “Mom, when you get better, maybe we could go celebrate by going to Great America.” And I will never forget the pit in my stomach, even as a four-year-old, when she looked at me and said, “I can’t go to Great America, I’m going back to the toxic chemical plant so we can save the money to start this business.”
Scott: It’s real life, real entrepreneurship, real supporting your family. That’s a real-
Kareem: It was at that moment that I learned what it feels like to see someone you love being treated unfairly. And as I grew older, I saw that in many aspects of American life, opportunity is not distributed fairly. And, when folks don’t get fair access to things like credit and healthcare, it impacts families for generations. And so, that experience set me on a path of increasing access to credit for people who had been shut out of the system. And it got me asking, “How can we do a better job of underwriting inherently hard-to-score borrowers? Thin files, no files, recently arrived immigrants who don’t resemble the traditional conception of what a quote-unquote good applicant looks like.” And so, that work took me across the globe to Sub-Saharan Africa, and Latin America, and Eastern Europe, and the Caribbean. I spent a few years at an unfortunately named mobile wallet startup called Isis that was mercifully rebranded Softcard and sold to Google. And then I spent several years in the Obama administration at the State Department and the Overseas Private Investment Corporation, before landing in Silicon Valley at venture-backed startups that were tackling these problems of financial inclusion.
Scott: That’s incredible. What happened? You got to the point where you thought, “I need to take this into my own hands and just start a whole new company to do this.” The vision was that clear to you?
Kareem: Well, it was really two things. The first was, in the course of that work, I got visibility into the underwriting practices of the most prestigious financial institutions in the world. And, what I was quite surprised to find, Scott, is that even at the commanding heights of American finance, global finance actually, the underwriting methodologies were extremely rudimentary, certainly by Silicon Valley standards. And all of the decisioning systems being used for underwriting, for pricing, for marketing, for collections, exhibited disparities against people of color, women, and folks from other historically disadvantaged groups. And, part of the reason that really shocked me is that, ostensibly coming out of the civil rights era, we have laws that prohibit that kind of discrimination. And so, it really got me wondering, “How is it possible that, on the one hand, bias in lending is illegal, and yet it is ubiquitous?” And when I started surveying the landscape, I found that the way that’s made possible is through something I jokingly refer to as the fair lending industrial complex: the fancy consultancies and law firms that financial institutions rely on to try to justify their discrimination rather than do anything to actually make their lending practices any fairer.
Scott: It’s almost like the tobacco companies deciding it’s better to pay lawyers than to actually change the product like that. It’s that extreme, right? That’s how it works.
Kareem: And just like smoking, not only is that morally outrageous, but it’s also substantively unnecessary, right? And so, I’d been toying with this idea for a while. And then it was really in June of 2020, when people took to the streets to protest the murder of George Floyd. As the Black Lives Matter movement was sweeping across the country, it seemed to me that this issue of systemic bias had just become too deafening to ignore.
Scott: Yeah. Especially if you’re a person who’s so talented. It’s almost like feeling the call, right? You know, in your heart, that you’re someone who can actually put a dent in this. Did you feel like you had an obligation to do something?
Kareem: Well, I felt like I had an obligation, and then I also had the very good fortune of having, over the course of my career, worked with some fantastic AI technologists. And so, I think many people, in June of 2020, were asking themselves, “What part, what role can we play to address systemic bias?” And, as I started looking at that, and looking at my network, I started spending more time with my co-founder, who had been a longtime friend and collaborator like you, Scott. A fellow named John Merrill. John had built AI systems at Google and at Microsoft, and we began experimenting with ways of harnessing AI that could drive more positive outcomes for historically disadvantaged groups. And so, we started building MVP-type AI underwriting products, and that put us on a path to doing the research and development that eventually became FairPlay. And our thesis was that, left to their own devices, algorithms that make high-stakes decisions about people’s lives will replicate, and possibly even worsen, the discrimination of the past.
Scott: To borrow an analogy, is it like how, when people watch certain videos on YouTube, YouTube’s algorithm (not to pick on YouTube, this is how a lot of algorithms work) starts showing them progressively more extreme versions of those videos and pushes them down a black hole? That’s what a poorly built financial algorithm can do, right? It puts someone in a materially worse situation over and over again, and they end up in a much worse place. Is that accurate?
Kareem: It is exactly the right analogy, Scott. In fact, one of the conclusions that we came to, as we started doing this work, is that a key part of the problem is that algorithms are traditionally given a single objective, and they relentlessly maximize that objective, right? So, you see this most clearly with Facebook. The objective given to the Facebook algorithm is to maximize your engagement. And so, that algorithm is going to do that, right? It’s going to seek to maximize your engagement even if the content it’s showing you to keep you engaged is bad for your health, or bad for society. And, as we looked around, we took a look at Tesla. And we thought, “Tesla has a very interesting solution to this problem,” right? Because you could imagine, if Elon programmed the neural network that powers the self-driving cars to just get you from point A to point B, the car might do that while driving the wrong way down a one-way street, while blowing through red lights, while endangering pedestrians. What Elon did that was so clever was to give his algorithms two targets, right? Get the passenger from point A to point B while also respecting the rules of the road. And so, we looked at the Tesla example and we thought, “Why can’t we do this in financial services? Why can’t we give the algorithm an objective of predicting default accurately while also minimizing disparities between groups?”
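To make that dual-objective idea concrete, here is a minimal sketch of a fairness-regularized model. It is not FairPlay’s actual system; the synthetic data, the logistic model, and the particular disparity penalty are all illustrative assumptions. The loss combines a standard default-prediction term with a penalty on the gap in mean scores between two groups, so the optimizer has to serve both targets at once.

```python
# A fairness-regularized logistic regression: one loss term for predicting
# repayment, one penalizing the score gap between two groups. The synthetic
# data and the penalty form are illustrative assumptions, not FairPlay's.
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 5
X = rng.normal(size=(n, d))                # applicant features
y = (rng.random(n) < 0.8).astype(float)    # 1 = repaid, 0 = defaulted
g = rng.random(n) < 0.3                    # protected-group flag

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grad(w, lam):
    p = sigmoid(X @ w)
    # Objective 1: log-loss for predicting repayment accurately.
    bce = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    # Objective 2: squared gap in mean predicted score between groups.
    gap = p[g].mean() - p[~g].mean()
    loss = bce + lam * gap ** 2
    # Gradients (chain rule through the sigmoid; s = dp/dz).
    s = p * (1 - p)
    grad_bce = X.T @ (p - y) / n
    grad_gap = X[g].T @ s[g] / g.sum() - X[~g].T @ s[~g] / (~g).sum()
    return loss, grad_bce + lam * 2 * gap * grad_gap

w = np.zeros(d)
for _ in range(500):                       # plain gradient descent
    _, grad = loss_and_grad(w, lam=5.0)
    w -= 0.5 * grad

p = sigmoid(X @ w)
print(f"score gap between groups: {p[g].mean() - p[~g].mean():+.4f}")
```

Raising `lam` trades a little predictive fit for a smaller gap; setting `lam=0` recovers the single-objective model that relentlessly maximizes accuracy alone.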
Scott: It’s really amazing. You said you were seeing the underwriting standards, or the approaches, at the quote-unquote best institutions of American credit, and things like that. But really, what those people, or the institutions, mostly care about is actually just making money. You hope, right? And so, it sounds like, from our previous conversations, FairPlay can actually make them more money too. It’s not just pick A or B. Those two objectives you’re setting come with the benefit of a third objective, or a third conclusion, which is, “Hey, we’re actually going to make more money for you. You should embrace this.”
Kareem: Oh, that’s exactly right. So, what we do is we go to financial institutions and we say, “Hey, you have an algorithm in place? You’re happy with your algorithm? You’ve invested a lot of time and effort and prestige into deploying that algorithm. Fair enough. But all of the folks that your algorithm declined, send them to our algorithm that’s been tuned with AI fairness methodologies, and we’re going to take a second look at the folks that your algorithm rejected to see if there are dimensions on which those individuals resemble other good applicants you approved.” And what we find, Scott, is that 25 to 33% of the time, the highest-scoring black, brown, and female applicants that get declined perform as well as the riskiest white men that get approved.
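Mechanically, that “second look” can be pictured as a simple waterfall: the lender’s incumbent model decides first, and only its declines are re-scored by a fairness-tuned model. The sketch below is a hypothetical illustration of that flow; both scoring functions and both cutoffs are invented stand-ins, not FairPlay’s API.

```python
# A hypothetical "second look" waterfall: the incumbent model decides first;
# only its declines get re-scored by a fairness-tuned model.
from typing import Callable

Applicant = dict  # hypothetical feature record

def second_look(
    applicants: list[Applicant],
    incumbent_score: Callable[[Applicant], float],
    fair_score: Callable[[Applicant], float],
    incumbent_cutoff: float,
    second_look_cutoff: float,
) -> tuple[list[Applicant], list[Applicant]]:
    approved, declined = [], []
    for a in applicants:
        if incumbent_score(a) >= incumbent_cutoff:
            approved.append(a)       # the lender's model stays in charge
        elif fair_score(a) >= second_look_cutoff:
            approved.append(a)       # rescued on the re-underwrite
        else:
            declined.append(a)
    return approved, declined
```

In practice the second-look cutoff would be calibrated against repayment data like the 25-to-33% figure Kareem cites; here it is just a parameter.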
Scott: 25 to 33%, that’s crazy. That’s a lot of people.
Kareem: Yep.
Scott: And I can totally visualize those conversations. There’s someone at the bank or credit union who’s responsible for building these algorithms, and I’m sure they have these moments of self-conflict, right? They think, “Well, I’m super proud of what I built.” But then when you show them the data… What’s that conversation like? Are they quick to realize what’s happening? Or do they fight it? What’s it like?
Kareem: Yeah. Well, I think nobody who builds these algorithms, that I’ve encountered, is a person of bad faith. What they’re running up against are limitations in data and mathematics. And for many years, really since the laws that were passed in the wake of the civil rights era, we have tried to achieve fairness in lending through blindness. We don’t look at whether a person is female, or what racial or ethnic background they’re from. We try to isolate variables that seem to be objective or neutral, and in that way, we try to achieve fairness through blindness. But all you’ve got to do is look at the black homeownership rate, and see that it is at the same dismally low levels that it was at the time of the passage of the Fair Housing Act, to see that trying to achieve fairness through blindness hasn’t worked. Now, what’s different about some of these new AI fairness techniques is that they actually take your demographic membership into account. And let me give you one example of how that can be helpful. So, one of the variables that we see all the time in credit underwriting models is consistency of employment. How consistently are you employed? And you look at a variable like that and you think, “Well, that has a reasonable relationship to creditworthiness. I mean, after all, if you’re inconsistently employed, how are you ever going to pay back a loan?” At the same time, consistency of employment is always going to discriminate against women between the ages of 20 and 45 who take time out of the workforce to start a family. So, maybe, you want to tell your algorithm, “Hey, there’s a population out there who will exhibit somewhat inconsistent employment. Don’t necessarily hold that against them. Look to see if there are other dimensions on which they resemble good applicants that you approved.” And lo and behold, what you might find is, well, that applicant has a number of professional licenses, which we know is a great credit signal. They’ve got great stability of residence, which we know is a great credit signal. They’ve never declared bankruptcy. They always pay their bills on time, which we know is a great credit signal. And so, one of the [inaudible], I mean, the conversations that we have with the lenders, I feel for these lenders, because, on the one hand, we tell them, “You’re not allowed to discriminate.” On the other hand, we tell them, “You’re also not allowed to ask, is this applicant black? Is this applicant a woman?” And so, to date, I think that no one has been making these kinds of tools for financial institutions who are disposed to being fairer to historically disadvantaged groups. But the good news is that the technology has advanced considerably, and the regulators have signaled more of a willingness to allow you to use that information if you’re doing it for the purpose of improving outcomes for folks who have been historically left out of the system.
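One way to picture the employment-consistency example is a second pass that refuses to let that single proxy variable dominate when compensating signals are present. The toy rule below is purely hypothetical, not a production underwriting rule; the feature names, thresholds, and weights are all invented for illustration.

```python
# Toy illustration: don't let an employment gap (a proxy that
# disproportionately penalizes women aged 20 to 45) dominate when an
# applicant shows the other good-borrower signals named above.
# All thresholds and weights are invented for illustration.
from dataclasses import dataclass

@dataclass
class Applicant:
    employment_gap_months: int
    professional_licenses: int
    years_at_residence: float
    on_time_payment_rate: float   # 0.0 to 1.0
    ever_bankrupt: bool

def compensating_signals(a: Applicant) -> int:
    """Count the dimensions on which the applicant resembles good borrowers."""
    return sum([
        a.professional_licenses >= 1,
        a.years_at_residence >= 3,
        a.on_time_payment_rate >= 0.95,
        not a.ever_bankrupt,
    ])

def employment_penalty(a: Applicant, base_penalty: float) -> float:
    # Down-weight the penalty when at least three compensating signals exist.
    if a.employment_gap_months > 0 and compensating_signals(a) >= 3:
        return base_penalty * 0.25
    return base_penalty

# A parent returning to work after 18 months out of the workforce:
applicant = Applicant(18, 2, 5.0, 0.98, False)
print(employment_penalty(applicant, base_penalty=40.0))  # 10.0 instead of 40.0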
Scott: That’s really cool. So, your algorithm could look and see, “Oh, there’s a population over here,” presumably women, or a lot of them are probably women, and drill down, and run separate types of analyses on them, and conclude that 33% of that pool are actually great borrowers for the lender. That’s the heart of what FairPlay is doing?
Kareem: That’s exactly right. We re-underwrite all of the declined applicants to try to identify the dimensions on which they resemble good applicants who were approved. And what we find, as I said, is 25 to 33% of the time, the folks that are just on the other side of a lender’s disapproval threshold would have performed at least as well as the riskiest folks they approved. And that makes intuitive sense, right? I mean, you walk into a financial institution and they tell you, “Hey, listen. Our FICO cutoff is a 680. We don’t approve anybody below a 680.” Well, you’re telling me that there aren’t some 679s and 678s who are headed for 685 and 690? Of course there are. And there are 690s who are headed for 650, right? And so, being able to tease that out at the margin is really where we’re able to create a lot of value and a lot of fairness for our customers.
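The cutoff intuition is easy to back-test when historical outcomes exist: compare default rates in the band just under the cutoff to the band just over it. The sketch below simulates that comparison; the 680 cutoff comes from the conversation, while the data and the smooth risk curve are assumptions made up for illustration.

```python
# Simulated back-test of the hard-cutoff intuition: if default risk falls
# smoothly with score, the 675-679 "near misses" perform about as well as
# the riskiest approved band, 680-684. The data here is simulated, not real.
import numpy as np

rng = np.random.default_rng(1)
scores = rng.integers(600, 760, size=20_000)
# Default probability declines smoothly with score.
p_default = np.clip(0.25 - (scores - 600) / 1000, 0.02, 0.25)
defaulted = rng.random(20_000) < p_default

near_miss = (scores >= 675) & (scores < 680)   # declined under a 680 cutoff
just_in = (scores >= 680) & (scores < 685)     # riskiest approved band
print(f"default rate, 675-679: {defaulted[near_miss].mean():.3f}")
print(f"default rate, 680-684: {defaulted[just_in].mean():.3f}")
```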
Scott: Hey, it’s Scott Orn, and we’re going to take a quick break from the podcast to give a shout out to the Kruze tax team. Gosh, it’s so nice to have an in-house tax team, I can’t even tell you. We have some really amazing professionals on the team. I think it’s over 13 people now. And we do everything from your Federal and State Income Tax Return, State Franchise Tax filings, R&D tax credits, those are pretty popular these days. And guess what? They’re there for you when you go through diligence. A lot of people don’t know this, but you actually go through tax diligence, not just operational and financial diligence. So, it’s nice to have Vanessa Kruze on the phone with your VCs, and with the accounting firm they hire to diligence all your stuff, and the law firm they hire to diligence all your stuff. Vanessa knows what she’s doing. She’s done this a million times. And it’s not just Vanessa. We have a really great team of tax professionals who will do those calls too. It’s sometimes the difference between getting your round closed, or having it take another two weeks because something was disorganized and tax compliance wasn’t done correctly. We hear those horror stories from clients who come to us. So, hey, if you want Kruze’s tax team on your side, we’re here for you. Check us out at kruzeconsulting.com. Thanks. Before we turned the mic on, I was coming at this from the whole avoid-risk, avoid-litigation angle. And you made an amazing point: we talked about the making-more-money aspect of it a little bit, and we could talk about that a little bit more. But there’s a third benefit to the institution for adopting FairPlay, and it’s an employee one. Maybe share with the audience what you’re hearing from employees of those institutions that have adopted FairPlay.
Kareem: Yeah. So, I think fair lending 1.0 was, “Don’t inquire too much into the fairness of your lending practices. If you do inquire and you find there are disparities, sweep them under the rug, or try to come up with clever statistical and legal justifications that legitimize those disparities.” And I think the approach that we’ve taken at FairPlay, which we call fair lending 2.0, is, “No, you’ve got to look. And when you find problems, you’ve got to commit yourself very seriously to addressing them.” And, what we find is, not only do you make more money by doing that, but you enhance customer satisfaction, you enhance brand reputation and loyalty, the regulators take a favorable view, and it helps you recruit talent. Because the smartest people today want to work for and patronize institutions that reflect their values. And so, what we have found is, as we sometimes glibly say, “FairPlay makes fairness pay.”
Scott: And I can totally relate to that because, in this day and age, especially post-COVID, where you hear about the Great Resignation, or people moving jobs, company culture, company reputation, the pride that people take in working somewhere, is so much more important than it was even pre-COVID. And, especially for a lending institution, this is one of the biggest levers it can pull to enhance its reputation and make it a better place for people to work. It’s so much better if your bank, or community bank, or credit union is on the front page of the Wall Street Journal because you’ve taken this new, proactive approach to underwriting loans, and you’re seeing better results, and you’re improving all these people’s lives, than to be working at the cigarette company. I mean, that’s just the world we live in. It’s so positive, it’s an amazing trend in our society, and I’m very thankful for that. But I mean, what a big lever you can help these institutions pull.
Kareem: We named the company FairPlay after a phrase that came from a Frenchman named Pierre de Coubertin, who used it when reviving the modern Olympics. And he said that fair play was the idea of uniting people to compete on a level playing field with a good spirit, with an attitude that displays excellence, respect, modesty, generosity, and friendship. And those are the values that we seek to embody at FairPlay. And they’re the values that, frankly, our customers choose us for. And it’s not just because we’re altruistic, it’s because these institutions recognize what you said, which is, as Richard Plepler, the former head of HBO, used to famously say, “Culture eats strategy for breakfast.” Right? Obviously, you need smart decisions and great execution. But the idea that people at the top decide what’s right and people at the bottom follow the leader wastes an enormous pool of talent, and ideas, and creativity, and experience. And so, if you want to win, you’ve got to make the workplace, and the products, safe and welcoming to people. And, if you end up trying to avoid speaking the truth to stay out of trouble, you actually end up getting into trouble.
Scott: Totally. And, I mean, your story about your mother, and where your family’s come from, it’s got to be so rewarding. I would bet that there are a lot of people who work at FairPlay who’ve had similar life stories to yours, who feel that calling too. When you have your pick of, or you know, so many of the great AI and technology minds in the world, I mean, is it easy to recruit at FairPlay? Or maybe it’s never easy, but do you find that sense of mission internally at your company too?
Kareem: I mean, what we’re finding is the smartest, most talented people want to work on problems that are substantively interesting, and that change the world if you get them right. And systemic bias is one of those issues, right? It’s a persistently difficult issue that we have struggled with here in America. And, not only that, but as you say, almost everybody has had some experience with unfairness, either being treated unfairly themselves, or seeing a loved one be treated unfairly. And, look, if you just read the zeitgeist a little bit, society is demanding fairness now in a way that it never did before. And consumers care that the businesses they patronize are fair to their employees, their customers, and the world. And so, [inaudible] it’s never easy to recruit, but I think that we’ve got a better shot at it than, as you say, the folks developing the next laundry delivery app or whatever.
Scott: Yeah. There’s a flip side to what you said. I think it’s powerful, and it’ll be hard for me to articulate this, but it’s not just looking around the world and seeing unfairness. I think many of us have experiences where we’ve seen something like what you’re building open up the world to someone who was disadvantaged, or in a not-so-great position, or in an unfair position, and seen them flourish. And I go back to my undergrad at Cal Berkeley, where affirmative action was still in place. I didn’t quite understand it at the time, until I got there. And then there were people in my fraternity, or in my dorm, who came from just different situations, not as cushy a situation as mine, frankly. And I saw their talent just explode at Berkeley and beyond, going on to the best medical schools and the best law schools. And what a positive impact they made on society. And if we were not open to these types of approaches, or alternatives, those people might be left behind. And, really, our society suffers from that. And so, I think that’s probably another aspect. Forget the unfairness, think about the potential of all these people who can be unleashed. And especially in personal finance, it’s so important. You buy a house, you become a better citizen, you become part of your neighborhood, your kids go to better schools. There are so many positives to getting lending underwriting correct and unleashing those people. So, I love what you’re doing. I’m very happy for you.
Kareem: You’re very kind and you’re absolutely right. I mean, I think credit is the sine qua non of modern life. And so, helping lenders say yes to more good borrowers just gives more folks a shot at the American Dream. And so, what we find is that our customers inside of these financial institutions love the ability to both contribute to their bottom line and to their communities in that way. And, employees wake up every morning, psyched to go build a product that gives more people a shot at the American Dream.
Scott: Yeah. I love it. Kareem, thank you so much. This is amazing. Maybe you could tell everyone how to reach out to you, whether they’re working at a financial institution or they’re interested in working at FairPlay? Especially on the customer side: if someone’s working at a financial institution and realizes they should be talking to you, what do they do? How do they reach out?
Kareem: Yeah, you can always find us at www.fair-play.ai or info@fair-play.ai. We are looking for anyone that wants to help us build a fairer future. So, please feel free to reach out if that’s something that’s of interest to you.
Scott: I love it. I’m a huge fan. Congrats. And I’m glad you took that leap. You’re building something that is going to be really important, and I can’t wait. I know you shared with me the number of customers signing up, and it’s amazing. So, I’m really happy for you, and best of luck. And I look forward to maybe recording another podcast in 18 months and saying, “Oh my God, we’ve got 10 of the top 10 banking institutions in the United States using us now.” What an incredible future.
Kareem: Well, thanks, Scott. Your partnership and support for the many years that we’ve been toiling away in venture-backed startups has been absolutely essential to our success. We’re grateful to you and the team at Kruze, and we love the podcast, and love the work you do too. So, thank you.
Scott: Thank you, sir. And give my best to your lovely life partner. Melissa is an amazing woman. I think very highly of her, and I’m very grateful we got to work together during her time with us.
Kareem: Thank you, Scott. She sends her best to you and the Kruze team.
Scott: All right man.
Kareem: Take care. Thank you.
Singer: (Singing). It’s Kruze Consulting. Founders and Friends with your host, Scotty Orn.

Kruze Consulting offers solid financials to help de-risk your startup’s next venture capital round. Work with a startup expert! Kruze bookkeepers will help your startup have accurate, up-to-date financial statements that you can use to manage your business’ growth and cash flow.
