Track 4: How to reconcile CFPB's innovation & enforcement stances

Today's CFPB is vocal and active both as a regulator and an innovation driver. This session will explore current guidance and enforcement trends, and also dive into the CFPB's new effort to "promote competition and innovation".

Transcription:

Chris Drayer (00:06):

If you would take your seats, please. My name's Chris Drayer. I'm the CEO of Revaluate. We're the track sponsor for "How to reconcile the CFPB's innovation and enforcement stances." So if you'd like to learn about that, you're in the right spot. If you'd like to grab the envelope on your seat, you can register for a borrower retention session with our data team. With that, I'd like to introduce Bonnie Sinnock.

Bonnie Sinnock (00:56):

Hello and welcome to our panel. I'm Bonnie Sinnock, Capital Markets Editor at National Mortgage News. With me today are Tony Alexis, partner at Goodwin Procter; Chrissi Johnson, CEO of Alignment Advisors; and Ari Karen, partner and head of Litigation, Labor and Employment at Mitchell Sandler. They may tell you more about their respective backgrounds and how they pertain to today's topic as we go, but I wanted to dive right in because we've got a lot of ground to cover in a short time today. Let's start with Tony, because he can address one of the topics we need to discuss today, this being part of the real-world uses of blockchain and AI track. He's gonna tell us about a consent order that bank lenders in particular might want to be aware of because it has ramifications for artificial intelligence.

Tony Alexis (02:17):

Yes. Thank you for having me. The other day, the CFPB issued a consent order against Digit. It was a fintech company that had created a product that would sweep money out of your checking account for savings and put that money in a separate account. What they represented to consumers was that it was flawless and you would never have a need to overdraft, because what they had was an algorithm. The problem with the algorithm was that it had malfunctioned badly, because it relied on stale information, or information that was not reliable because the consumer had taken out more or spent more on his account, et cetera, and ultimately consumers overdrafted, and they realized that the algorithm had created the overdrafts.
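
To make that failure mode concrete, here is a minimal, hypothetical Python sketch of the kind of logic described above: an auto-savings sweep that sizes its transfer from a cached, possibly stale balance, next to a guarded version that re-checks the live balance and keeps a cushion. The function names, amounts and buffer are illustrative assumptions, not Digit's actual algorithm or the consent order's terms.

```python
# Hypothetical illustration only; not Digit's actual system.

def unsafe_sweep(cached_balance: float, predicted_spend: float) -> float:
    """Sizes a savings transfer from a cached (possibly stale) balance."""
    return max(0.0, cached_balance - predicted_spend)

def safe_sweep(fetch_live_balance, predicted_spend: float, buffer: float = 100.0) -> float:
    """Re-checks the live balance and keeps a cushion so the sweep itself
    cannot push the account below the buffer."""
    balance = fetch_live_balance()              # refresh instead of trusting the cache
    return max(0.0, balance - predicted_spend - buffer)

# Example: the cache says $500 is available, but the consumer just spent most of it.
if __name__ == "__main__":
    print(unsafe_sweep(cached_balance=500.0, predicted_spend=200.0))  # sweeps $300, overdraft risk
    print(safe_sweep(lambda: 50.0, predicted_spend=200.0))            # sweeps $0
```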

(03:35)

Well, the problem was they didn't pay back the overdrafts, and the CFPB fined them. I think it was interesting because of the type of case. I really thought it was just a technical oversight, and the company needed to pay their consumers back and let's go on. But ultimately the CFPB, under the director, I think, showed what he considers to be the issue with technology, the issue with AI and the issue with technology-driven products. He fined them, but he mentioned the fact that they had a poorly functioning algorithm. But it was interesting. If you look through the Office of Enforcement, I would say that for the most part they haven't had enough time to work their strategic priorities into a fashion in which you would see the outcomes at this particular point.

(04:57)

But nevertheless, I would think that you could look to the director, Director Chopra, in what people in the consumer space and people also in industry are calling regulation by press release. That is, every week he says something. He gives an advisory opinion about how a particular market or regulation works. He issues circulars. He has blog posts. And the issue is the Office of Enforcement may not be able to get there now, because the cases that the Office of Enforcement is working on may be a year, two years, in some cases maybe three years old. The issue for him is he frames whatever he can get out of it into messaging what he wants to message. I think you can also look at the statement he made at the time that he issued a 1022 order for information to the big techs, and that was, you know, Facebook, Google, et cetera.

(06:24)

The issue is he wants to study how they operate, what they sell their data for and what they use it for. I think the other day he also cautioned digital marketers. He was saying, normally you would be a third-party service provider, and it might be the case, pursuant to the statute, that you might not have any type of culpability or liability in the ecosystem, because advertising is supposed to be one of those things the Consumer Financial Protection Act wrote in a manner of, look, you're just doing what you're told, and therefore you can't be held liable for it. I think he's been very stern about that, and in the digital marketing piece he gave a warning to the digital marketers, because ultimately, unlike some aspects of marketing, when you look at digital, you're looking at the behaviors of the consumer, and then you market and target the activity toward the consumer as it relates to a particular market or segment of the market.

(07:57)

He seems to be worried about that. One final thing before I turn it over: today was, I think, the 50th anniversary of Public Citizen. It might not have been, but he spoke at Public Citizen, Ralph Nader's old shop. He gave a presentation in which he was ostensibly speaking about some of the service providers in the technology payment space, like PayPal, Venmo, Zelle, et cetera. One of the critical things he shifted to at the end was that he doesn't want the bureau to be, or be perceived to be, standing on the sidelines while these particular technology companies are producing harms and the CFPB isn't prepared for harms that come so quickly. So my sense is it was a warning shot, and I think they may now have enough clear space in enforcement to start picking up some of those matters. I'm not saying it's around the corner, I'm not saying it's a certainty, but I am saying don't be surprised if a CID comes to your workplace and you have a hard, Facebook-type of issue, because I think that's where it's headed.

Bonnie Sinnock (09:36):

Okay, thank you. That's an example of the CFPB in its enforcement role. So I wanted to turn it over to Chrissi to talk about how the CFPB can also encourage innovation.

Chrissi Johnson (09:49):

Yeah, I know the CFPB has been called out a lot recently for calling out big tech. But I do think we have to remember that at its core, the CFPB's mission is to promote a marketplace that is safe, that protects consumers and that is competitive. So that translates, honestly, to consumer-friendly innovation. So when we're sitting here spinning our wheels, and sometimes I do think we get a little bit in hysterics worrying about what the CFPB is going to do, it is worth pausing and taking a step back and saying, is the consumer at the core of all of our practices? Because that's exactly what the CFPB is going to be thinking. So if we put ourselves in their shoes and ask questions like those real-talk questions that we may be uncomfortable getting the answer to: Is it fair?

(10:40)

Is it deceptive? If it is even minutely close to any of those things, we should pause and reevaluate. There's also an opportunity: they just revamped the Office of Innovation as well. I think we should all be encouraged as an industry to engage with the Office of Innovation, because the CFPB isn't necessarily anti-tech. They just don't like the black box that oftentimes comes with tech. So if we promote that transparency and think about the innovations that technology can provide, then, again, working through the Office of Innovation, we do have a really great opportunity, because they have seen a lot of these innovations work in practice. They just called out alternative data recently as well, so they know there is a place for it if done correctly and responsibly, with the consumer in mind.

(11:30)

Another example they have out there is the special purpose credit programs that lenders especially can pursue. That provides certain legal protections around different programs and products to allow that innovation in a safe space, to figure out if there are new ways to serve underserved consumers, specifically those who have been historically discriminated against due to redlining and other egregious activities in the past, and that still often persist today. So as we move forward, thinking about it obviously with the rules around compliance in mind, but thinking about it also as an opportunity to engage and, again, put the consumer first, I do think we have a lot of opportunity.

Bonnie Sinnock (12:17):

All right, thank you. And Ari, I think in our pre-discussion you said there were some nuances in that communication with consumers, and I wondered if you could talk a little bit about that.

Ari Karen (12:28):

Sure. I think when you're viewing the interaction of consumers with technology, particularly front-facing technology, it's sort of like a Goldilocks problem. What I mean by that is you can't have too much, you can't have too little; it's got to be just right. Specifically, let's say you have a program that interfaces with consumers. I'll use the example of some type of front-facing interface for choosing a loan, or helping them to choose a loan. Well, if you just had it out there and you said, okay, do you want a 10-year ARM? Do you want a 30-year fixed? Do you want an adjustable? And you didn't provide any information. Remember, you've got to prepare this for the borrower that doesn't understand any of these things, right?

(13:17)

Not the people in this room who deal with this every day. So if there's no educational component to it, and the borrower selects something and comes back later and says, well, hold on, you didn't tell me there was a difference between these, that the rate would be different, that by going to an adjustable there was the possibility the rate would go up and in the fixed there wasn't. You don't explain any of that stuff, and the borrower selects the wrong thing. Is that an unfair or deceptive act or practice? Probably, right? Because remember, that is judged by a standard, not by what most people would know or not know. It's judged by, you know, a sizable minority. They've said 10% of the population is a sizable minority.

(14:01)

So there you go. If 10% of the people wouldn't understand it and would select something wrong, you could be talking about UDAP. But then the next problem comes in. So you say, okay, I'm going to provide some information, but where does it stop? In other words, if you start giving the disclosures, the joke everyone has about disclosures is, does anybody really read them? I'll never forget, back in the '08 foreclosure days, I was defending a lender and I was in federal court in New York, and I was about to use the integration clause. That's the clause at the end of a contract that says, I've read everything and I understand everything. I remember going to the hearing, and I'm about to stand up and say, hey, they can't argue that they didn't understand all these things.

(14:42)

They signed an integration clause. The judge stops me and he goes, you know, I went to Yale Law School, and how long did this closing last? A half hour? It took me an hour and a half to get through it. And the clients over here aren't even fluent English speakers. So you can imagine that didn't go very well, right? But the point I'm getting at is, if you provide too much information, people get lost in it, they can't understand it, and you have the same problem coming up. So really what you have to do is figure out the right cutoff point where you provide enough information that it's not misleading, it's not incomplete, and it's material and it's key, right? You're giving 'em the stuff they really, really need to know. You don't overwhelm them with so much that it's absolutely useless because no one could understand it.

(15:28)

Right? And that's what I mean by the Goldilocks problem. You have to get into the middle of it. And there's really no way of avoiding it if you have that interface, because you can't just put people out there and say, here's the information, because then it's assumed that there's really no difference in the choices they make. They have to be advised. The other thing I think is very critical when you are providing this type of information or access to people is to make sure there's always access to a human being, so that if they get through it, you say, hey, if you have any more questions, if you don't understand, if there's anything you need, you've got a resource you can reach out to. So as much as we have the innovation and the technology, which I think are great, you always have to have that. It's kind of like the grocery store checkout line: now everything is automated and digitized, but you still have that one person for when you can't figure out how to weigh the produce, I guess, on the checkout line.

(16:23)

You need that. You have to have that in technology with respect to consumer finance as well, because in this scenario people have to be able to get their questions answered. So I think those are all of the considerations. Again, it is affected by whether it's actually consumer-facing. For technologies that aren't consumer-facing, that is, on the back end, I think it's less of a concern; there it's more of an issue, as Tony was talking about, of functionality, of testing, of making sure it works. In the consumer-facing area, you've got some of the same issues and you've got different issues, because again, you've got to provide the right information. The other thing you've got to do is think about how you put it into place. If you don't test it with borrowers, if you don't have a beta with actual people and you just put it out there, you're really taking a huge risk.

(17:18)

There's only so much we can think about, and I've worked with a lot of clients that have done this, and it always astounds them: the things they think are so clear, that everyone is going to understand, that no one is going to get confused by, and lo and behold, they get it wrong the first 10 times they put it out there. I've gotten it wrong too; they go out thinking no one's ever going to be confused by this, and they are. And so the thing is to have a good beta program. Everyone's always rushing to get their technology out there first, and I know it's a really important aspect of investment and innovation. But the testing, especially on the consumer-facing side, is critical, because if you don't know where those holes are and you just put it out there, then you're going to have a real problem.

(18:08)

When it goes wrong, remember this: it's all recorded. In a lot of cases as a lawyer, you have that plausible deniability where you can kind of go back and say, well, what if it was this? And what if it was that? When you're talking about this stuff, there is no "what if this" or "what if that," right? It is what it is, and it's very clear if a problem arose. And then the question is, what did you know? Were you reasonable in the actions you took to prevent that problem from coming up in the first place? And if you can't establish that you took reasonable measures to protect the consumer, and they see it as rushed, they see it as not well planned and developed, which is very easy to do, as you can imagine, in hindsight, after a problem has happened, then you're going to have a problem. Right? So it's testing. It's thinking about what's too much, what's not enough, what's right. It's having access for consumers when problems arise or when consumers may have questions. It's putting all that together in a thoughtful process before you put it out to the public. Those are all critical elements of a consumer-facing program and how technology and compliance have to interface there.

Chrissi Johnson (19:21):

All right, I love your example of the grocery store, because in addition to all those critical measures you mentioned, the thing I think about in the grocery store, when it's a self-checkout, is that they're also looking for people who are breaking the law, underage people buying alcohol, things like that. So it's a system that works until it doesn't. You have to have those measures in place, ongoing measures, to ensure that you are constantly checking. For example, with AI, if you have an algorithm and you've done the testing, you've made sure it's ready to go, you throw it out there. But if you don't have ongoing measures to check for potential AI flare-ups that are, most of the time, accidentally built into the system, you're going to have a problem. So I love that grocery example. I'm going to steal that sometime and add on to it: in addition to those measures, the ongoing monitoring of whatever product you have out there, to check for those flare-ups.
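
As a concrete picture of the ongoing monitoring described above, here is a minimal, hypothetical Python sketch: a rolling check that compares a deployed model's live approval rate against the rate observed in pre-launch testing and flags large drift for human review. The metric, window size and tolerance are illustrative assumptions, not CFPB guidance or any vendor's actual controls.

```python
# Hypothetical post-deployment drift check; thresholds are illustrative.
from collections import deque

class DriftMonitor:
    def __init__(self, baseline_rate: float, window: int = 1000, tolerance: float = 0.05):
        self.baseline_rate = baseline_rate   # approval rate observed during pre-launch testing
        self.recent = deque(maxlen=window)   # rolling window of live decisions
        self.tolerance = tolerance           # allowed deviation before alerting

    def record(self, approved: bool) -> None:
        self.recent.append(1 if approved else 0)

    def flagged(self) -> bool:
        if len(self.recent) < self.recent.maxlen:
            return False                     # not enough live data yet
        live_rate = sum(self.recent) / len(self.recent)
        return abs(live_rate - self.baseline_rate) > self.tolerance

# Usage: feed every production decision into record() and route the case to a
# human reviewer whenever flagged() turns True.
```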

Ari Karen (20:13):

Yeah, and you're right, it's layers. I think that was the word you used. It's having layers of protection so it isn't a single point of failure; there has to be that redundancy built in.

Bonnie Sinnock (20:28):

Yeah. Chrissi, I thought it was interesting that you brought up alternative data. That's definitely something a lot of lenders are looking at, especially now that there's some expansive outreach to borrowers, I would say. And I wondered if you could talk a little bit more about the parameters for that as you see them. You know, where is innovation encouraged, and what are the bounds it has to be done within?

Chrissi Johnson (20:52):

I think alternative data is going to be critical as we move forward and consider the new consumer profile that is continuing to develop. The fact of the matter is, we saw this with the revamped qualified mortgage rule: the way the law was originally written was done to protect the consumer, of course. But a lot of consumers coming in who are mortgage-eligible may not come from spaces that have traditional access to generational wealth. They may have lower down payments, they may have higher debt-to-income because of higher student debt, things like that, due to historic discrimination and other issues from being in underserved communities. So as we're discovering new ways to measure a consumer's data and their ability to repay and engage in the financial system, I do think it's critical that we think of those different pieces that go into what a consumer's profile looks like.

(21:52)

But again, doing it in a way that is ongoing. I don't know the answer to what parameters are going to be in place, but that's what we have to be looking for. That's the data question: are there certain things that actually lead to more delinquencies? Are there some that we've used in the past, like DTI, that are not great indicators of delinquency? Things like that. So it can go both ways, but it's going to be a very complicated and holistic approach to rethinking how we look at consumers who are now entering the system.
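
One hedged way to frame the back-testing question raised above, whether a given factor actually distinguishes delinquency risk, is to compare delinquency rates above and below a cutoff for that factor. The records, field names and threshold in this Python sketch are fabricated purely for illustration.

```python
# Hypothetical factor back-test; data and cutoff are made up.

def delinquency_rates_by_factor(loans, factor_key, cutoff):
    """loans: list of dicts holding the factor value and a 'delinquent' flag.
    Returns the delinquency rate above and below the cutoff."""
    above = [l for l in loans if l[factor_key] >= cutoff]
    below = [l for l in loans if l[factor_key] < cutoff]
    rate = lambda group: sum(l["delinquent"] for l in group) / len(group) if group else 0.0
    return rate(above), rate(below)

# If the two rates come out nearly equal, the factor says little about risk.
loans = [
    {"dti": 0.45, "delinquent": True},
    {"dti": 0.28, "delinquent": False},
    {"dti": 0.50, "delinquent": False},
    {"dti": 0.33, "delinquent": True},
]
print(delinquency_rates_by_factor(loans, "dti", cutoff=0.43))  # (0.5, 0.5) here
```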

Ari Karen (22:23):

The other thing I think is super important about alternative data is the opportunity it has to really be a game changer. What I mean by that is, look, we've had the qualified mortgage, but anybody who looked at a qualified mortgage, you're like, what, one, two, 3% higher than market? So a lot of people aren't going to want that just because of the difference in the rates they're paying to go to that, and that really diminishes the desirability of those products. Now, one of the reasons why is because of the enhanced risk associated with those products. But as alternative data comes in now with the new QM rule, as alternative data comes in and demonstrates, as you were saying, its viability, and you start with a new data stream that addresses that and allows more people to have access to credit, now the risk isn't as high. And so those rate differentials start dropping; that's really where the game changer is going to occur. We don't have that yet. I don't think it's been around long enough or anyone's really figured it out, but whoever does figure that piece of it out, that's going to allow broad access without hugely enhanced risk on the secondary market. That's going to be the game changer in this.

Chrissi Johnson (23:31):

Data all the way. I agree.

Tony Alexis (23:33):

But one of the things I would retreat to and look back at is, you know, Rohit Chopra put out a bulletin, I forget how long ago, in which he spoke about AI and adverse action under ECOA. A year ago, or maybe a little more, well, it was before the pandemic, they had come out and said, well, you don't have to explain what's in your algorithm, you just have to explain which factors, for example the down payment, or various factors. And there's been somewhat of a switch as the CFPB has looked at it and now realizes, oh my goodness, we've given alternative data, we've given AI, we've given algorithm-driven underwriting a free hand to be able to say this person qualifies, this person doesn't, and not have to explain what's in the black box. And then you kind of feel sad for the company. It's just like, hold on, they worked for a long time to create the black box. Why shouldn't they have some privacy in it? And the issue is obviously risk.

Bonnie Sinnock (25:13):

I think that's a good point. And it's interesting, you do see the mortgage industry looking at that. Freddie Mac for sure has looked at, you know, that possibility of using AI-driven underwriting. I kind of wonder whether that will really move forward until maybe an influential player like that moves with it. But it'll be interesting to see. So I think that's a good topic.

Bonnie Sinnock (25:38):

Yeah, I remember reading about it six months ago, or I forget how long ago, and what they were saying is, well, what do we do if someone has a lot of cryptocurrency? Should that count as an asset? Freddie or Fannie said no. So right off the bat, one of the largest things in terms of innovation and being able to move toward digital currency is a large housing underwriter saying it doesn't count for much.

Bonnie Sinnock (26:17):

I know the CFPB has an information-sharing arrangement with Fannie and Freddie, so I thought that maybe, on a regulatory basis, is it safe to say that there's some kind of blessing you get if you're following in the footsteps of Fannie and Freddie? Or not necessarily? I don't know. What do you think about that, Chrissi? Do you have a take on that?

Chrissi Johnson (26:40):

I mean, if you underwrite to Fannie and Freddie's standards, it's considered a qualified mortgage, and that's a black box. So that's a prime example of a black box right there.

Bonnie Sinnock (26:52):

That's a good point.

Chrissi Johnson (26:53):

So it's an interesting perspective, because we have been operating, I don't know if the term is AI, but it feels a little bit like AI, in a black box for a long time already. So there has long been a push as well from consumer advocates to get that black box opened and to get that data opened. I don't know if that's going to be a push as well from this administration; there are a lot of different ways this could go. But it's an interesting question that you ask, because that's how we have underwritten mortgages for the past 10 years at least.

Bonnie Sinnock (27:31):

Tony, I wanted to circle back to the example you opened up with. I remember in our pre-discussion we talked a little bit about how that was a bank in that circumstance. But on the non-bank servicing side, we've seen an example of where the CFPB is watching to see how, say, automated payments are debited from a consumer's account, right? So I wondered, would you say that that type of concern, and whether you applied AI to a process like that, is something that both non-banks and banks would have to be wary of on the enforcement side?

Tony Alexis (28:06):

My answer is no, and do what you want.

(28:11)

The issue is, if it has not been tested, as you pointed out, and it's not scalable, then all of a sudden when you have a mistake, you will have a mistake that repeats itself, because it's AI. And if it's AI and it's repeating itself and it's a mistake, then you're going to have a critical mistake that pretty much harms consumers, and I would suspect consumers on the more vulnerable part of the market, who had too much money taken out. The issue ultimately is what tests can they perform? And quite frankly, if it's mortgage servicing, you have an avenue: all mortgage servicers are within the Office of Supervision's purview. That means, based on the region where you're headquartered or have your main operations, you have someone to talk to. My sense is, I know the CFPB is the boogeyman sometimes, but you should really find a safe way, or a person you can trust, within the CFPB, especially if it's a servicer. You have supervision, you have regulatory advice, you have someone that will spot it first after they've told you whether you can do it or not, or what risks or types of boundaries you should test it within. Ultimately, they'll tell you about various testing: has it been cleared by audit, has it been cleared by the first line and the second line, et cetera. So my sense is you will be able to do it if you roll it out safely.
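
To illustrate the point above about an automated mistake repeating itself at scale, here is a minimal, hypothetical Python sketch: a payment batch wrapped in a circuit breaker that stops executing once computed debit amounts repeatedly drift from what was scheduled. The function, thresholds and data shapes are assumptions made for illustration, not any servicer's actual controls or a regulatory requirement.

```python
# Hypothetical circuit breaker for an automated debit batch; limits are illustrative.

def run_debit_batch(scheduled, execute_debit, max_anomalies: int = 3, tolerance: float = 0.01):
    """scheduled: iterable of (account_id, expected_amount, computed_amount).
    Skips suspect debits and halts the whole batch if anomalies keep recurring."""
    anomalies = 0
    for account_id, expected, computed in scheduled:
        if abs(computed - expected) > expected * tolerance:
            anomalies += 1
            if anomalies >= max_anomalies:
                raise RuntimeError("Repeated debit anomalies; halting batch for human review")
            continue                         # skip the suspect debit, keep processing
        execute_debit(account_id, computed)  # amount matches the schedule, go ahead
```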

Ari Karen (30:20):

Part of that, I think, is also having an audit and corrective action plan ready ahead of time. Because when you catch something yourself and you can address it and respond to it, the regulators, I've found (I don't know what your experience has been), tend to not be too harsh on you when you're able to take the corrective actions. When six months goes by and no one has said anything, that gets a little more concerning, right? Because more people have been harmed, and then the questions are, were you set up correctly? And it just kind of piles on from there.

Chrissi Johnson (30:49):

I would also, in addition to that, encourage folks to engage with the third-party advocates as well, especially in this administration, under this leadership. Because if you come in with partnerships with the consumer advocates and the civil rights groups who are having constant communication with the various divisions and leadership at the bureau, and they come in with you saying, hey, we're interested in this product or this program to help achieve our shared goals around homeownership, then it's not just you saying, I'm great. It's always better when someone else is amping you up, right? So I think those types of relationships, relationships are currency regardless, but those relationships really add to it.

Bonnie Sinnock (31:29):

Great. Well, that's all we have time for today. Thank you so much for sharing your expertise. Please contact our participants if you have any follow-up questions. And thanks so much, everyone.

Ari Karen (31:41):

Thank you.