CFPB catches flak from banks, credit unions on risks of AI

The Consumer Financial Protection Bureau on Thursday got an earful on the use of artificial intelligence from members of three advisory boards who said traditional financial institutions are at a competitive disadvantage against nonbank fintech firms.

Federal financial regulators have pushed banks and credit unions to adopt innovations such as AI, most recently in a statement related to anti-money-laundering procedures. But bank executives and other panel members warned that AI poses risks and that the nonbank companies raising the most concern are not regulated. They urged the CFPB to conduct more oversight and research.

“One of the reasons [banks] sometimes take a bad rap at not being innovative is because we do have regulators that do come in and look at our vendor management, but what’s missing is [oversight] of those companies currently outside of the financial services industry,” said Bryan Bruns, president and CEO of the $150 million-asset Lake Central Bank in Annandale, Minn. “How are you going to beef up the regulatory enforcement?”

Acting CFPB Director Mick Mulvaney (Bloomberg)

The CFPB invited the feedback at a joint meeting of the Consumer Advisory Board, Community Bank Advisory Council and Credit Union Advisory Council. The advisory panels, which are statutorily required to meet twice a year, were recently reconstituted after acting CFPB Director Mick Mulvaney fired the prior members of the boards and reduced the size of the new Consumer Advisory Board.

Banks and credit unions are required to examine third-party vendors, but many are overwhelmed by the sheer amount of regulatory work required to bring in an AI vendor, council members said. Because fintech companies are not required to perform the same level of validation, they said, the playing field is uneven.

Rick Schmidt, president and CEO of the $170 million-asset WestStar Credit Union in Las Vegas, said that smaller financial institutions have cost hurdles to overcome.

“The bureau needs to ensure that everyone is playing from the same set of rules,” Schmidt said.

“One of the key roles the bureau plays is making sure that there is a framework that creates a level playing field for companies to innovate but also to ensure that if a community bank has a disclosure requirement, then an AI-based company or fintech company has the same requirement,” Schmidt said.

Members described the difficulty of overseeing third-party vendors that create AI applications used in chatbots, credit underwriting, fraud detection, regulatory compliance, risk management and robo-advisers.

Jeanni Stahl, a senior vice president and chief risk officer at MetaBank in Sioux Falls, S.D., said the $5.8 billion-asset bank uses a machine learning tool for fraud management that has been beneficial but that also raises concerns.

“There is a black-box component to it where it’s very difficult for us as a bank to know what goes into that machine learning and if we were to have to explain to regulators or consumers how it works, it would be difficult,” Stahl said.
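The explainability gap Stahl describes is often probed with model-agnostic diagnostics. The sketch below is purely illustrative and assumes nothing about MetaBank's actual systems: it uses scikit-learn's permutation importance on a synthetic, made-up fraud-style dataset to show one way a risk team could at least rank which inputs a "black box" model relies on when asked to explain it.

```python
# Illustrative sketch only: the model, feature names and data are hypothetical
# stand-ins, not any bank's production fraud tooling.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Synthetic transaction features standing in for real fraud-detection inputs.
X, y = make_classification(n_samples=2000, n_features=6, random_state=0)
feature_names = ["amount", "merchant_risk", "velocity", "geo_mismatch",
                 "device_age", "account_tenure"]

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Permutation importance: shuffle one feature at a time and measure how much
# model accuracy degrades -- a rough, model-agnostic view of what the
# classifier actually depends on.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in sorted(zip(feature_names, result.importances_mean),
                          key=lambda t: -t[1]):
    print(f"{name:>15}: {score:.3f}")
```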

Other board members said the use of AI in loan underwriting decisions had the potential for bias and that the CFPB needs to conduct more research.

A recent study by researchers at the University of California, Berkeley, found that fintech lenders using algorithmic scoring charged minority borrowers higher interest rates.

“Even if a company has the best intentions of following fair-lending principles, I think it’s debatable that some AI can make credit decisions without bias because these platforms are analyzing thousands of data points,” said Liz Coyle, executive director of the nonprofit consumer advocacy group Georgia Watch, who sits on the advisory board.
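As a rough illustration of the kind of pricing disparity the Berkeley researchers measured (not a reproduction of their methodology), the sketch below compares average rates across two hypothetical borrower groups using synthetic data; a real fair-lending analysis would control for credit factors and draw on far richer loan-level records.

```python
# Illustrative sketch only: the loan data and group labels are invented for
# demonstration and carry no real-world meaning.
import pandas as pd

loans = pd.DataFrame({
    "group": ["A", "A", "A", "B", "B", "B"],
    "rate":  [5.10, 5.25, 5.05, 5.45, 5.60, 5.50],  # APR in percent
})

# Mean priced rate per group and the spread between them; a persistent gap
# after controlling for credit risk is the disparity researchers flagged.
by_group = loans.groupby("group")["rate"].mean()
print(by_group)
print(f"rate gap (B - A): {by_group['B'] - by_group['A']:.2f} percentage points")
```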

Data integrity was also at the center of the discussion on AI.

Some suggested AI may give a false sense of security because consumers do not know what information is being used or how to correct what may be inaccurate.

Credit reporting agencies are required to correct inaccurate information, but it is unclear how a consumer corrects mistakes by entities that are unregulated.

"What is that process when you have information that may not be verifiable and how can the consumer remedy that?” asked Maureen Busch, vice president, compliance and CRA officer at $1.7 billion-asset Bank of Tampa in Florida.

Consumers are already using AI to monitor and manage their credit scores and finances, and startups say that such applications can have a big impact on the lives of unbanked or underbanked consumers.

“AI can take complex decisions and automate the parts that are incredibly difficult for consumers,” said Sophie Raseman, head of financial solutions at Brightside, a San Francisco startup, and a member of the Consumer Advisory Board. “I urge the bureau to think about positive applications to help consumers manage how cash comes in and goes out, pay bills and start escaping that cycle of living paycheck to paycheck.”
