Methods for Quality Control

FEB 11, 2013 4:27pm ET

While having an effective quality control program in place is important, the first order of business is actually knowing where the problems are. This knowledge ensures your organization makes the proper changes and adjustments.
Quality Mortgage Services president Tommy Duncan made a presentation at the NAMB National conference in December on the errors his company is finding when it examines mortgage applications taken in the past two years.
Things have improved in terms of overall application quality because of three factors: the dominance of refinance production, more companies doing prefunding quality control, and the fear of government oversight and costly legal settlements.
Over 90% of the applications taken in 2012 that QMS examined were sellable loans. Just 9% were at risk of repurchase, while 0.34% were at risk for fraud.
By loan type, over 92% of conventional loan applications taken in 2012 were sellable, while just 83% of FHA loans met the criteria.
Reinforcing Duncan’s point about refi apps being better quality in general, he noted that conventional refi loan production over the last two years outnumbered total FHA loan volume, a segment in which the bulk of the business was purchases.
In a follow-up interview with Origination News after the show, Duncan noted refi apps (especially rate and term refis) don’t require as much information as purchase apps. While there are some full doc refi apps out there, in general there are fewer opportunities for lenders to make mistakes. The full document purchase loan, on the other hand, provides more chances for problems to occur. Cash-out refi apps are also more prone to errors that could lead to fraud.
But if any one thing jumped out from QMS’ findings for 2012 versus 2011, it was the big gain in application defects involving appraisals.
For 2011, this was at 9% of applications; at the time Duncan made his presentation in 2012, it was over 15%.
Even with the big jump, appraisals remain third among QMS’ findings, behind disclosures (second in 2011, first in 2012) and problems in filling out the initial 1003 (first in 2011, second in 2012).
The reasons cited for the jump in problems with appraisals include lenders starting their own appraisal management companies; more technology available to find things wrong; the lender or AMC not performing good QC; inconsistency in automated valuation methods; a fast-changing housing market; and too much business resulting in sloppy work.
Duncan said one of the issues for companies that have their own AMC is allowing production to influence the panel.
As for the AVM issue, he noted there are more robust tools with better data than simply using an AVM as part of the QC process. In its reviews, QMS is using “state-of-the-art technology to find things wrong with the appraisal to provide feedback to the lender,” he said.
QMS’ data did find that while appraisal defects on FHA loans increased by two percentage points from 2011 to 2012, for conventional loans the rate rose to 17% (based on updated data since the NAMB National presentation) from just under 10%, making appraisals the No. 1 finding for last year. For 2012 conventional purchase loans, appraisal problems were found in 15.7% of apps, but for conventional refinancings it was 18.2%.
Duncan’s predictions for 2013 include that the quality scores on conventional loans will decline as the market shifts back to purchase, for the reasons cited above.
Secondly, mortgage fraud findings, in his data currently at about 0.3%, will go back over 1% in the purchase market.
Finally, the quality scores on appraisals will not improve, in large part because the models will not capture the increase in home prices, in much the same way they lagged during the downturn.
“Quality control is in a transition, but the quality control people don’t know that yet,” stated Rebecca Walzak, president of Looking Glass Group and rjbWalzak Consulting. She explained that all of the regulatory changes, plus the new requirements placed on originators by Fannie Mae and Freddie Mac, mean the way the mortgage industry used to do quality control doesn’t work anymore.
QC was done according to the guidelines throughout the boom period, and it did not prevent any of the problems from occurring, Walzak said.
The QC staff needs to communicate better to management about what is and is not working. Management now has to take that information and make changes to origination practices.
This is what had been going on before, but it was not effective. QC had been a loan-by-loan check, she said; if an originator was making a common error, all management would do was talk to that originator.
But QC had not been telling management of the bigger picture, that it was not only a particular originator making the mistake, but that it showed up in a significant number of files. Thus training has to be done and new processes should be implemented.
“The reason that doesn’t occur is because there is no standardized way to collect information” across the industry, she said. Everyone has a different checklist, with different emphasis.
“If you’re not collecting consistent data, you can’t analyze it consistently,” Walzak explained.
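To make that point concrete, one way to standardize collection is a common finding record, sketched below in Python. The field names, category list and severity scale here are purely illustrative assumptions, not an industry standard or anything Walzak prescribed; the point is that identical fields across reviewers let defect rates be aggregated and compared across originators, channels and loan types.

```python
from dataclasses import dataclass
from enum import Enum

class DefectCategory(Enum):
    # Illustrative taxonomy drawn from the categories cited in this article.
    INITIAL_1003 = "initial_1003"
    DISCLOSURES = "disclosures"
    APPRAISAL = "appraisal"
    FRAUD_INDICATOR = "fraud_indicator"

@dataclass
class QCFinding:
    """One standardized QC finding; consistent fields across reviewers
    make defect rates comparable across the whole organization."""
    loan_id: str
    originator_id: str
    channel: str             # e.g. "retail", "broker", "correspondent"
    loan_type: str           # e.g. "conventional", "FHA"
    category: DefectCategory
    severity: int            # assumed scale: 1 = minor ... 3 = repurchase risk
    description: str

def defect_rate(findings: list[QCFinding], files_reviewed: int,
                category: DefectCategory) -> float:
    """Share of reviewed files with at least one finding in the category."""
    flagged = {f.loan_id for f in findings if f.category is category}
    return len(flagged) / files_reviewed
```

With records like these, the loan-by-loan check Walzak criticizes becomes an aggregate analysis: the same `defect_rate` calculation can be run per originator, per channel or per loan type to surface the bigger picture she describes.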
Another issue she has seen is with the sampling methodology, which she termed “a mess.” Fannie Mae and Freddie Mac allow statistical sampling for originators who produce a certain volume.
But some firms are keying on a certain segment, for example pulling all broker production. This means files originated through other channels have less of a chance of being selected for the QC sample, and it is not truly random.
There are stratified statistical methodologies available to develop a true random sample as well as identify those areas more at risk and weight them more heavily in the sample. These allow the QC sample to make a statement that is true for the whole organization, she said.
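A minimal sketch of such a stratified draw appears below. The channels, risk weights and sample size are hypothetical assumptions for illustration, not Fannie Mae or Freddie Mac requirements; the idea is simply that each stratum is sampled in proportion to its size times an assumed risk weight, so riskier segments are oversampled without excluding any channel.

```python
import random

def stratified_qc_sample(loans, stratum_key, risk_weights, sample_size, seed=None):
    """Draw a QC sample in which higher-risk strata are deliberately
    oversampled, while every stratum still has a chance of selection."""
    rng = random.Random(seed)
    strata = {}
    for loan in loans:                       # group files into strata
        strata.setdefault(loan[stratum_key], []).append(loan)
    # Allocate the sample in proportion to stratum size times its risk weight.
    weighted = {s: len(f) * risk_weights.get(s, 1.0) for s, f in strata.items()}
    total_weight = sum(weighted.values())
    sample = []
    for s, files in strata.items():
        n = min(len(files), round(sample_size * weighted[s] / total_weight))
        sample.extend(rng.sample(files, n))  # random draw within the stratum
    return sample

# Hypothetical usage: broker files are weighted 2x, so they are oversampled
# relative to their share of production, but no channel is excluded.
loans = [{"id": i, "channel": ch} for i, ch in enumerate(
    ["retail"] * 700 + ["broker"] * 200 + ["correspondent"] * 100)]
picked = stratified_qc_sample(
    loans, "channel", {"broker": 2.0, "retail": 1.0, "correspondent": 1.5},
    100, seed=1)
```

Unlike an all-broker pull, every file in every channel retains a nonzero selection probability, so the resulting sample can support statements about the whole organization.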
Walzak added that there will always be random errors in loan files no matter what, but management has to decide what error rate it considers acceptable. It must decide what those errors cost it in terms of buyback risk and potential default versus how many loans it would originate if it tightened up its standards.
“That is the role management should play, not 'we’ll talk to the underwriter.’ They should have the information they need to make real risk decisions,” she said. Without a consistent methodology for sampling, they are unable to make a meaningful determination.
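That risk decision reduces to expected-value arithmetic. The sketch below uses invented figures purely for illustration; the defect rates, repurchase rate and loss amount are assumptions, not data from Walzak or QMS.

```python
# All figures are hypothetical, for illustration only.
def expected_loss_per_loan(defect_rate, repurchase_rate_given_defect,
                           loss_per_repurchase):
    """Expected buyback loss baked into each originated loan."""
    return defect_rate * repurchase_rate_given_defect * loss_per_repurchase

loose = expected_loss_per_loan(0.09, 0.25, 50_000)  # 9% defects, 1 in 4 bought back
tight = expected_loss_per_loan(0.03, 0.25, 50_000)  # tighter standards cut defects to 3%
print(loose, tight, loose - tight)                  # 1125.0 375.0 750.0 per loan
```

Management can then weigh that per-loan saving against the revenue lost on the loans tighter standards would turn away.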
Walzak said she believes mortgage regulators will create their own standardization and “start comparing your results to their results.”
QC people need to get educated on how to do analytics, or at least bring an analyst onto their team. Management needs an analyst to define what the data are telling it.
There is nothing in the secondary market guides or from regulators saying what activity you are trying to test.
The CFPB is going into lenders and asking them to show their fair lending policy. Then it is asking lenders to show how they tested to make sure the policy is being carried out, and the lenders are “flabbergasted” because this is not what they have been doing.
There has not been a formal announcement from regulators, just some of what Walzak termed “soft and mushy” statements. Those statements warn lenders they need to do QC differently than in the past.
She said the secondary market should spell out specific standards, but the industry is not interested in such things. It is a fear factor for the industry.
She also believes the reverification process is not working as constituted. All it is doing right now is adding cost. “If I had my way, I’d scrap it and start all over,” Walzak said.
Sanjeev Dahiwadkar’s company IndiSoft has come out with a product to assist in quality control in the new era of tighter Consumer Financial Protection Bureau enforcement. He said it gives clients an out-of-the-box product that allows them to comply with industry standards established under the Dodd-Frank Act. Clients can also bring in their own standards.
It is a system that allows lenders to validate that their QC function is being performed properly and to document that fact. Dahiwadkar said in today’s world, not only are the lenders being checked, “but to make matters worse, there is a check on quality control. So on those who are checking quality, there is another group sampling their work.”
The new environment requires lenders to be “operationally savvy,” he said, and technology can play an important role in this. That is where IndiSoft saw its opportunity to develop this product.
The product is complementary to existing loan origination (as well as mortgage servicing) systems. It is based on open architecture using common industry data standards. “Our focus is how do we reduce the cost of ownership to our client and give the maximum return on investment,” Dahiwadkar said.
Its intended user is any stakeholder in the process. Internal implementation depends on the client, which decides who needs to be exposed to the information during the process and to what degree. It can even interface with third-party vendors’ technology.
The QC technology can run from prefunding through post-closing; which stages it is used in is up to the client. Work queues are dynamic and totally reflective of the client’s business processes, Dahiwadkar said.
Managers have a dashboard to see what the cases are and who is responsible for what on the file. They can use that information to adjust individual workflow.
IndiSoft works with a number of law firms on interpretation of CFPB rules. But the open architecture allows for clients to implement their interpretations as well.
It helps companies keep their employees up to date as well. “That is where the power of our technology comes in. Not only do we help them stay compliant, we help them train their employees,” Dahiwadkar said, adding users can see a link to the guideline or interpretation involved.
“We are empowering our users with the knowledge about what they are supposed to do and how it needs to be done.”