A typical mortgage loan captures thousands of data points, and the potential for inaccuracies when data is entered, re-entered or overwritten in a lender's loan origination system (LOS) is enormous. To combat the problem, lenders rely on manual labor, either in-house or outsourced, to check and recheck loan data across multiple documents for completeness and accuracy.

The practice of "stare and compare," in which a human being looks back and forth across two or more documents to verify that the information is consistent across document types, is time-consuming and error-prone, not to mention costly. As a result, lenders find it feasible to send only a small percentage of loans through quality control.

Recent regulations have heightened lenders' efforts to ensure data integrity and loan quality, but adding staff to address the issue has lengthened the time it takes to close loans and increased the overall cost per loan. Now is the time for lenders to apply the principles of straight-through processing (STP) to loan origination: a technology-enabled model that reduces manual intervention, extracts data directly from loan documents and moves quality control to the front of the loan process.

As originally conceived, straight-through processing enables the entire trade process for capital markets (the buying and selling of long-term debt or equity-backed securities) and payment transactions to be conducted electronically, without re-keying data or manual intervention. STP is now applied across financial markets to improve the certainty of settlement, minimize operational costs, and reduce systemic and operational risk. The mortgage industry can realize these benefits, and others, by applying the same concepts to the loan origination process.

By introducing automation, including document recognition, data extraction and rules engines, across the entire loan process, lenders improve overall loan quality, speed the entire loan life cycle, reduce the cost per loan and reduce compliance risk. Automated data extraction technology, for example, makes it easy to compare the data in the system with the data on the original document, immediately alerting the lender to discrepancies in the data, as well as missing data or missing documents.
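The comparison step described above can be sketched as a simple field-by-field reconciliation. This is a minimal illustration only; the field names and record structures are invented for the example and do not come from any real LOS or extraction product.

```python
def compare_loan_data(los_record: dict, extracted: dict) -> dict:
    """Compare LOS data against data extracted from the source document.

    Returns 'discrepancies' (fields whose values differ between system
    and document) and 'missing' (fields the extraction did not find).
    """
    discrepancies = {}
    missing = []
    for field, los_value in los_record.items():
        if field not in extracted:
            missing.append(field)
        elif extracted[field] != los_value:
            discrepancies[field] = {"system": los_value, "document": extracted[field]}
    return {"discrepancies": discrepancies, "missing": missing}


# Hypothetical example: income was re-keyed incorrectly in the LOS,
# and the property address was never captured from the document.
los = {"borrower_name": "J. Smith", "income": 85000, "property_address": "12 Elm St"}
doc = {"borrower_name": "J. Smith", "income": 58000}

report = compare_loan_data(los, doc)
# 'income' is flagged as a discrepancy; 'property_address' as missing
```

In practice the alert would be raised the moment the document is extracted, so the defect is caught at the front of the process rather than at closing.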

In this model, human intervention is required only when the automation engine flags something for validation. For example, rather than sending every application to an underwriter, the data on the application could be extracted and run through a rules engine for analysis. Only if the application contains information outside the rules engine's parameters would it be sent to a human underwriter for review. This standardizes the process, increases productivity, lowers costs and reduces production risk. The technology would also keep a historical record of any changes made to the data, automatically creating and maintaining an audit trail to assist with compliance requirements.
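The exception-based routing and the audit trail described above might look like the following sketch. The rule names and thresholds here are illustrative assumptions for the example, not actual underwriting guidelines.

```python
from datetime import datetime, timezone

# Each rule is a named predicate over the application data.
# Thresholds are invented for illustration, not real guidelines.
RULES = [
    ("dti_within_limit", lambda app: app["dti"] <= 0.43),
    ("min_credit_score", lambda app: app["credit_score"] >= 620),
    ("ltv_within_limit", lambda app: app["ltv"] <= 0.80),
]


def route_application(app: dict) -> tuple:
    """Route to manual review only if at least one rule fails."""
    failed = [name for name, check in RULES if not check(app)]
    return ("manual_review" if failed else "auto_clear", failed)


def record_change(audit_log: list, field: str, old, new, source: str) -> None:
    """Append an audit entry for any change made to loan data."""
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "field": field, "old": old, "new": new, "source": source,
    })


# Hypothetical application: DTI exceeds the example limit, so only
# this one loan is routed to a human underwriter.
app = {"dti": 0.47, "credit_score": 700, "ltv": 0.75}
decision, exceptions = route_application(app)

audit_log = []
record_change(audit_log, "income", 85000, 58000, source="document_extraction")
```

Loans that clear every rule never touch an underwriter, which is what turns quality control from a sampling exercise into something applied to every loan.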

With recent changes dictated by the Dodd-Frank Act and the Consumer Financial Protection Bureau, lenders must reevaluate their operations to confirm they have the data integrity that compliance requires. To comply with these new regulations, mortgage lenders and servicers must provide adequate assurance that their workflow processes support data integrity. Lenders can demonstrate this by moving their operations toward a data-centric model in which data extraction and validation are the cornerstones of data integrity.

By moving to a technology-enabled straight-through processing model, in which data integrity and loan quality are achieved through exception-based processing, forward-thinking lenders will be positioned to satisfy the many new requirements imposed by regulators, and will reap the additional rewards of closing loans faster and at a lower overall cost per loan.