
Data Quality Finally Gets Its Due in Mortgages

MAY 22, 2014 3:53pm ET

For at least 30 years, the mortgage industry has pursued a recurring theme: the battle for market share and short-term profits has poured IT dollars into systems for quickly processing, selling or securitizing, and servicing loans. Unfortunately, controls on data integrity have not garnered a comparable share of IT budgets.

The result is that wonderful systems are populated with less-than-wonderful data. The fallout is manifested in misevaluated risks, volatile bond prices, misguided collection strategies, extensive litigation and damages, and heightened government involvement, scrutiny and fines.

Data integrity problems reportedly cost U.S. businesses $600 billion annually, and recent significant losses can be traced directly to bad data quality. Prior to 2006, the industry primarily relied on enforcing lender representations and warranties to cure data problems after the fact, and the widespread litigation since then has shown the immense costs of relying on such cures.

Prevention is a better solution. As a result, quality initiatives are taking place in the market, and the government-sponsored enterprises, investors, and regulators are demanding increased data integrity.

"Data integrity encompasses the completeness, consistency and accuracy of captured and reported data," says Michael Trickey, managing director of Berkshire Group, an advisory firm with expertise in mortgage secondary marketing and securitization. "This includes data collected on loan applications, during underwriting, upon funding, in secondary and securitization activities, and during servicing and surveillance activities."

Each step in the process generates a wealth of information that may be collected accurately, inaccurately or not at all. Businesses can reap huge rewards by determining what data should be collected at each step, and putting in place electronic and human systems to ensure data integrity is achieved and maintained. They can also help avoid regulatory, legal and reputational problems and costs by capturing the right information for maintaining controls over lending, selling, securitizing and servicing activities. Quality data supports business decisions and aids compliance.
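
To make the idea of an electronic control concrete, here is a minimal sketch in Python, assuming purely hypothetical stage names and field names, of a completeness check that flags required data points that were never captured at a given step:

```python
# Hypothetical sketch: stage-level completeness check on captured loan data.
# Stage names and field names are illustrative assumptions, not a standard.

REQUIRED_FIELDS = {
    "application": ["borrower_name", "property_address", "loan_amount", "stated_income"],
    "underwriting": ["credit_score", "dti_ratio", "appraised_value", "loan_amount"],
    "funding": ["note_rate", "funded_date", "loan_amount", "lien_position"],
    "servicing": ["investor_id", "escrow_flag", "next_due_date", "unpaid_balance"],
}

def completeness_gaps(stage: str, record: dict) -> list[str]:
    """Return the required fields that are missing or blank at this stage."""
    return [field for field in REQUIRED_FIELDS.get(stage, [])
            if record.get(field) in (None, "", "N/A")]

# A funding record missing its note rate and lien position is flagged for
# repair before the loan flows into secondary-marketing or servicing systems.
gaps = completeness_gaps("funding", {"loan_amount": 250000, "funded_date": "2014-05-01"})
print(gaps)  # ['note_rate', 'lien_position']
```

The same checks, run by the receiving system at each hand-off, are what keep gaps from silently propagating downstream.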

On the business side, data supports the use of predictive models used for 1) designing products and underwriting standards, 2) pricing loans and servicing rights on a risk-adjusted basis, 3) implementing marketing strategies, 4) setting collection campaigns and predictive dialer strategies, 5) developing loss mitigation criteria and contract terms, 6) setting bond levels and support requirements, and 7) performing surveillance. If the data used for building and calibrating the models lacks integrity, the models will be faulty.
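
The garbage-in, garbage-out risk is easy to illustrate. The sketch below, with an invented rate add-on schedule and hypothetical inputs, shows how a single mis-keyed appraised value shifts a loan into the wrong loan-to-value bucket and misprices it:

```python
# Hypothetical sketch: risk-adjusted pricing keyed to loan-to-value (LTV).
# The add-on schedule is invented for illustration only.

def ltv_price_addon(loan_amount: float, appraised_value: float) -> float:
    """Return a rate add-on (in percentage points) based on the LTV bucket."""
    ltv = loan_amount / appraised_value
    if ltv <= 0.80:
        return 0.00
    if ltv <= 0.90:
        return 0.25
    return 0.75

# Correct appraisal: a 90% LTV loan carries a 0.25-point add-on.
print(ltv_price_addon(225_000, 250_000))  # 0.25
# A transposed appraised value (520,000 instead of 250,000) makes the loan
# look far safer than it is and drops the add-on to zero.
print(ltv_price_addon(225_000, 520_000))  # 0.0
```

The same error, fed back into model calibration data, biases the model itself, not just the individual pricing decision.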

On the compliance side, quality data helps to detect and head off problems. This role has taken on even greater importance with the Consumer Financial Protection Bureau mortgage servicing rules that took effect on Jan. 10. These rules amend Regulation X, which implements the Real Estate Settlement Procedures Act, and Regulation Z, which implements the Truth in Lending Act, and they impose stringent compliance audits and fines for noncompliance.

The industry has little in the way of standardized data criteria, definitions and formats. Even within firms, data integrity may be challenged by legacy systems and employees, competing department or business unit definitions, lack of focus and definition from management, failure to keep up with changing concepts and regulatory definitions, or a host of other issues.

So how can lenders build a comprehensive data integrity and quality program? The solution involves not just technology, but also proper business processes and workflows. From a technology standpoint, the solution appears simple; in practice, applying technology alone becomes more challenging in today's heavily integrated environments.

Any solution must have two components: a system, and experienced analysts who revalidate the data, flag changes, identify errors and provide opportunities to fix those errors, confirming the accuracy of the loan file. This means having a system and procedures in place to reconcile data throughout the loan file and recognize discrepancies and errors, and having procedures to act quickly and repair the data.
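
A minimal sketch of the reconciliation step, assuming hypothetical source-system names, field names and tolerances, might compare the same data point across the loan file and flag disagreements for an analyst to research and repair:

```python
# Hypothetical sketch: reconcile key fields across sources within a loan file.
# Source names, field names and the zero tolerance are illustrative assumptions.

def reconcile(loan_id: str, sources: dict, fields: list[str], tolerance: float = 0.0) -> list[str]:
    """Flag fields whose values disagree across the supplied sources."""
    issues = []
    for field in fields:
        values = {name: data[field] for name, data in sources.items() if field in data}
        numeric = [v for v in values.values() if isinstance(v, (int, float))]
        if numeric and max(numeric) - min(numeric) > tolerance:
            issues.append(f"{loan_id}: {field} disagrees across sources {values}")
        elif not numeric and len(set(values.values())) > 1:
            issues.append(f"{loan_id}: {field} disagrees across sources {values}")
    return issues

loan_file = {
    "application": {"loan_amount": 250_000, "note_rate": 4.25},
    "note":        {"loan_amount": 250_000, "note_rate": 4.375},
    "servicing":   {"loan_amount": 250_000, "note_rate": 4.375},
}
for issue in reconcile("LN-0001", loan_file, ["loan_amount", "note_rate"]):
    print(issue)  # flags the note_rate mismatch for analyst review
```

The system surfaces the discrepancy; it still takes an experienced analyst to decide which source is authoritative and to correct the record.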

Firms often use third-party testing of data integrity and of overall origination, sale and servicing practices to vet and resolve potential problems and to circumvent the natural internal rivalries that could reduce the effectiveness of testing.

Testing frequency depends on the severity of known problems, but tests should be run at least quarterly. Mechanisms for tracking data flows are critical and should be documented to facilitate quick corrections and to avoid repeating mistakes.

In a proper data integrity review, loan files should be bookmarked and documents inventoried. A document sufficiency review ensures pertinent documents are included and that the information in the documents is consistent and logical. This review depends on the documents each firm requires for its ultimate portfolio, disposition or exit strategy. If documents change, resulting in data changes, there must be systems in place to ensure the impact of those changes is appropriate and that the risks associated with document changes to a loan are fully understood.
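
As one way part of that review could be automated, the sketch below, with an invented document checklist keyed to hypothetical exit strategies, inventories the documents in a bookmarked file and reports which required items are missing:

```python
# Hypothetical sketch: document sufficiency check against an exit-strategy checklist.
# Document names and exit strategies are illustrative assumptions.

REQUIRED_DOCS = {
    "securitization": {"note", "mortgage", "title_policy", "appraisal", "final_hud1"},
    "whole_loan_sale": {"note", "mortgage", "appraisal"},
}

def sufficiency_review(exit_strategy: str, inventoried_docs: set[str]) -> set[str]:
    """Return the required documents missing from the inventoried loan file."""
    return REQUIRED_DOCS.get(exit_strategy, set()) - inventoried_docs

missing = sufficiency_review("securitization", {"note", "mortgage", "appraisal"})
print(sorted(missing))  # ['final_hud1', 'title_policy']
```

Checking that the information inside those documents is consistent and logical remains the harder, analyst-driven half of the review.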

"No matter what, do not underestimate the data quality problem nor the effort required to resolve it," Trickey says. "Get in front of data integrity problem sources before they can undermine all areas that rely on the data."

Michael Richardson is the managing director of Cognitive Options Group LLC, a consulting firm specializing in mortgage due diligence and compliance reviews.
