The next several years will increase the need for better data management at banks. Banks that have experienced the Dodd-Frank Act’s required stress tests (DFAST) already have encountered that need. With the Basel III international accord phasing in and the new current expected credit loss (CECL) impairment standard eventually taking effect, all U.S. financial institutions face increasingly demanding regulatory requirements that drive the need for enhanced data and analytics capabilities.
Credit data is becoming increasingly integral to stress testing, as well as to capital planning and management and to credit loss forecasting. To meet regulatory expectations in these areas, though, some banks need to improve the quality of their data and the control they have over it. Effective data management can bring valuable support and efficiencies to a range of compliance activities.
Expanding Data Requirements
DFAST, which is required of banks above $10 billion in assets, is highly dependent on data quality. The DFAST process—including scenarios, analytics, and reporting—requires banks to maintain a vast array of reliable and detailed portfolio data, including data related to assets and liabilities; to customers, creditors and counterparties; to collateral; and to customer defaults.
Under Basel III, banks will need to gather even more data. The requirements call for consistent data sourcing and reconciliation, liquidity management and the capture of historical data, among other things.
The Financial Accounting Standards Board’s new CECL model for GAAP reporting applies to all banks and carries significant implications for data management. Banks and financial services companies will need borrower and economic data, exposure-level data, historical balances, risk ratings and data on charge-offs and recoveries. Failure to capture quality data in these and other areas could result in tougher examinations, reliance on peer or industry data, questions about safety and soundness and reduced capital and profits.
Data Management Challenges
Small banks generally have a handful of credit data sources, while large banks can have 15 or more—and the number of sources is expected to grow in coming years as new products are released. In addition, the data often is stored in different formats and might not be subject to any governance or control. It’s no wonder that banks can find it difficult to get a handle on their data, let alone produce a “single source of truth” that can withstand examiner scrutiny.
One solution to this dilemma is a credit data warehouse. A data warehouse can provide a vehicle for controlling and governing an immense amount of data. It allows a bank to easily show an examiner the data that was used for its models, the data sources and how the data reconciles with the bank’s financial statements.
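As a rough illustration of the reconciliation described above, the sketch below sums loan-level balances from a warehouse extract by portfolio and compares each total against the corresponding general-ledger figure. The record layout, portfolio names and tolerance are illustrative assumptions, not a prescribed design.

```python
# Hypothetical sketch: reconciling warehouse loan balances against
# general-ledger totals. Field names, portfolios and the tolerance
# are illustrative assumptions.
from collections import defaultdict


def reconcile(loan_records, gl_totals, tolerance=0.01):
    """Sum loan-level balances by portfolio and compare each sum
    to the general-ledger total for that portfolio.

    Returns {portfolio: (warehouse_total, gl_total, matched)}.
    """
    warehouse_totals = defaultdict(float)
    for record in loan_records:
        warehouse_totals[record["portfolio"]] += record["balance"]

    results = {}
    for portfolio, gl_total in gl_totals.items():
        wh_total = warehouse_totals.get(portfolio, 0.0)
        matched = abs(wh_total - gl_total) <= tolerance
        results[portfolio] = (wh_total, gl_total, matched)
    return results


loans = [
    {"portfolio": "CRE", "balance": 1_200_000.00},
    {"portfolio": "CRE", "balance": 800_000.00},
    {"portfolio": "C&I", "balance": 500_000.00},
]
ledger = {"CRE": 2_000_000.00, "C&I": 450_000.00}

for portfolio, (wh, gl, ok) in reconcile(loans, ledger).items():
    print(portfolio, wh, gl, "matched" if ok else "BREAK")
```

In practice a check like this would run against the warehouse and the bank’s book of record, with any break routed to the data stewards for investigation before the data feeds a model.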
Banks might encounter some obstacles on their way to effective warehousing, though, including the sheer volume of data to be stored. Quality assurance is another common issue. For example, information might be missing or not in a standardized format. Data availability also can pose problems. A bank might have the required information but not in an accessible format.
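The quality-assurance issues mentioned above, missing values and non-standardized formats, lend themselves to automated checks. The sketch below flags loan records with missing required fields or a non-ISO date format; the field names and rules are illustrative assumptions.

```python
# Hypothetical sketch of simple data-quality checks: flag records with
# missing required fields or non-standardized values. Field names and
# rules are illustrative assumptions.
import re

REQUIRED_FIELDS = ["loan_id", "balance", "origination_date", "risk_rating"]
DATE_PATTERN = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # expect ISO 8601 dates


def check_record(record):
    """Return a list of data-quality issues found in one loan record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    date = record.get("origination_date")
    if date and not DATE_PATTERN.match(date):
        issues.append("non-standard origination_date format")
    return issues


records = [
    {"loan_id": "A1", "balance": 100.0,
     "origination_date": "2015-06-30", "risk_rating": "4"},
    {"loan_id": "A2", "balance": 250.0,
     "origination_date": "06/30/2015", "risk_rating": ""},
]

for rec in records:
    print(rec["loan_id"], check_record(rec) or "clean")
```

Checks of this kind are most useful when they run at the point of ingestion, so that bad records are quarantined and remediated rather than accumulating in the warehouse.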
Overcoming these problems comes down to data governance—how a bank manages its data over time to establish and maintain the data’s trustworthiness. Data management without active governance that is targeted toward achieving a single source of truth isn’t sustainable.
In the case of DFAST, it’s important to resist the temptation to take a short-term perspective that considers only the data required for stress testing. Banks that take a more global view, bearing in mind that the data is used throughout the organization, will fare much better. Such banks build a framework that can handle the new data requirements (including those related to historical data) that will surely continue to come in the future.
Banks also should remember that data management is not a one-off task. A bank might have clean data today, but that data will degrade over time if not managed on an ongoing basis.
Finally, banks should not overlook the human factor. Success isn’t brought about by a database but by the people who serve as stewards of the data and the processes put in place to audit, balance and control it. The people and processes will, of course, be enabled with technology, but the people and processes will make or break a data management program.
Time to Take Control
Effective data management is an essential component of any stress testing endeavor, but it also has implications that extend to CECL and Basel III compliance and likely will aid banks in coping with many forthcoming regulations and requirements. Banks that don’t yet have control of their data should take steps now to establish the governance framework, people and processes necessary to ensure the completeness, accuracy, availability and auditability of a single source of truth for both the short and long term.