Sink or Swim in the Data Deep End


Community banks that fail to invest in their own data management risk letting big banks widen the competitive gap.

It’s now-or-never for community banks, and a competitive edge could be the key to their survival. A financial institution’s lifeblood is its data, and banks have access to a veritable treasure trove of information. But data analytics poses a significant challenge to the future success of community banks. Banks should focus on the value, not the volume, of their information when adopting an actionable, data-driven approach to decision-making. While many community banks acknowledge how critical data analytics are to their future success, most remain uncommitted.

This comes as multinational institutions rapidly expand their data science teams, create chatbots for their websites, use artificial intelligence to customize user interactions and apply machine learning to complete back-office tasks more efficiently. The advantage a regional or national bank holds over the community bank next door is growing too large to ignore. And the argument that the human touch and customer experience of a community bank will make up for the technological gap has become less convincing as younger customers forgo the branch in favor of their phone.

Small and midsize institutions are dealing with a number of obstacles, including compressed margins and a shortage of talent, as they attempt to move past basic data analytics and canned ad hoc reports. When an institution can find a qualified candidate to lead its data management project, that candidate usually lacks banking experience and tends to come from a science or mathematics background. A real concern for bankers is whether hiring managers can ask the right questions and fully discern candidates’ qualifications. And once the hire is made, is there a qualified leader in place to drive projects and deliver results?

Despite these obstacles, banks have only one option: Jump into the data deep end, head first. To compete in this data-driven world, community banks must deploy advanced data analytics capabilities to maximize the value of information. More insight can mean better decisions, better service to customers and a better bottom line for banks. The only question is how community banks can make up their lost ground.

The first step in building your organization’s data analytics proficiency is planning. It is crucial to understand your current processes and outputs, as well as your current staff’s capabilities, in order to improve your analysis. Once you know your bank’s capabilities, you can set your goal posts.

A decision you will need to make during this planning stage is whether to build out staff to meet the project goals or to outsource the effort to a consulting group or third-party software. A community bank’s ability to attract, manage and retain data specialists could be an obstacle. Data specialists tasked with managing more-complex diagnostic and predictive analytics should be part of the executive team, giving them a complete understanding of the institution’s strategic position and the current operating environment.

Another option community banks have is to buy third-party software to supplement current resources and capabilities. Software can allow a bank to limit the staffing resources required to meet its data analytics goals. But bankers need to understand the challenges.

A third-party provider needs to understand your organization and its strategic goals to tailor a solution that fits your circumstances and environment. Management should also weigh potential trade-offs between complexity and accessibility. More-complex software may require additional resources and staff to deploy and fully use it. And an institution shouldn’t solely rely on any third-party software in lieu of internal champions and subject-matter experts needed to fully use the solutions.

Whatever the approach, community bank executives can no longer remain on the sidelines. As the volume, velocity and variety of data grows daily, the tools needed to manage and master the data require more time and investment. Proper planning can help executives move their organizations forward, so they can better utilize the vast amount of data available to them.

Applying the 1-10-100 Rule to Loan Management


Implementing new software may seem like an expensive and time-consuming challenge, so many financial institutions make do with legacy systems and workflows rather than investing in robust, modern technology solutions aimed at reducing operating expenses and increasing revenue. Unfortunately, banks stand to lose much more in both time and resources by continuing to use outdated systems, and the resultant data entry errors put institutions at risk.

The Scary Truth about Data Entry Errors
You might be surprised by the error rates associated with manual data entry. A 2008 study available through the National Center for Biotechnology Information examined more than 20,000 individual pieces of data to measure the errors generated when data is entered manually into a spreadsheet. It found error rates reaching upwards of 650 errors per 10,000 entries—a 6.5 percent error rate.

Calculating 6.5 percent of a total loan portfolio—$65,000 of $1 million, for example—produces an arbitrary number. To truly understand the potential risk of human data entry error, one must be able to estimate the true cost of each error. Solely quantifying data entry error rates is meaningless without assigning a value to each error.

The 1-10-100 Rule is one way to determine the true value of these errors.

The rule is outlined in the book “Making Quality Work: A Leadership Guide for the Results-Driven Manager,” by George Labovitz, Y.S. Chang and Victor Rosansky. They posit that the cost of every single data entry error increases exponentially at subsequent stages of a business’s process.

For example, if a worker at a communications company incorrectly enters a potential customer’s address, the initial error might cost only one dollar in postage for a wrongly-addressed mailer. If that error is not corrected at the next stage—when the customer signs up for services—the 1-10-100 Rule would predict a loss of $10. If the address remains uncorrected in the third step—the first billing cycle, perhaps—the 1-10-100 Rule would predict a loss of $100. If the progression continues for one more step, the cost of that single uncorrected entry would grow to $1,000.

This example considers only one error in data entry, not the multitude that doubtlessly occur each day in companies that rely heavily on humans to enter data into systems.
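To make the arithmetic concrete, here is a minimal Python sketch of the escalation for one uncorrected error, plus a rough scale-up to a day’s worth of errors. The stage names, the $1 starting cost and the share of errors assumed to slip through are illustrative assumptions, not figures from the rule itself.

```python
# Illustrative sketch of the 1-10-100 Rule: the cost of a single uncorrected
# data entry error grows tenfold at each subsequent stage of a process.
# Stage names and the daily error volume below are hypothetical.

def error_cost_by_stage(initial_cost: float, stages: list[str]) -> dict[str, float]:
    """Return the projected cost of one uncorrected error at each stage."""
    return {stage: initial_cost * (10 ** i) for i, stage in enumerate(stages)}

stages = ["mailer sent", "service signup", "first billing cycle", "collections"]
for stage, cost in error_cost_by_stage(1.00, stages).items():
    print(f"{stage:>20}: ${cost:,.2f}")

# Scale up: a 6.5% manual entry error rate on 10,000 entries is roughly 650
# errors. If even a tenth of them survive to the third stage (an assumption):
errors = int(10_000 * 0.065)
uncaught = errors // 10
print(f"\n{errors} errors; {uncaught} reaching the third stage costs about "
      f"${uncaught * 100:,.0f}")
```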

In lending, data entry errors go far beyond typos in customers’ contact information and can include potentially serious mistakes in vital customer profile information. Data points such as Social Security numbers and dates of birth are necessary to document identity verification under the Bank Secrecy Act. Data entry errors can also lead to mistakes in loan amounts. A $10,000 loan, for example, has different implications for compliance reporting, documentation and pricing than a $100,000 loan. Even if the loan is funded correctly, a single zero entered incorrectly in a bank’s loan management system can lead to costly oversights.

Four Ways Data Entry Errors Hurt the Bottom Line
Data entry errors can be especially troublesome and costly in industries in which businesses rely heavily on data for daily operations, strategic planning, risk mitigation and decision making. In finance, determining the safety and soundness of an institution, its ability to achieve regulatory compliance, and its budget planning depend on the accuracy of data entry in its loan portfolios, account documentation, and customer information profiles. Data entry errors can harm a financial institution in several ways.

  1. Time Management. When legacy systems cannot integrate, data ends up housed in different silos, which require duplicative data entry. Siloed systems and layers of manual processes expose an institution to various opportunities for human error. The true cost of these errors in employee time—wages, benefits, training and so on—adds up, making duplicative data entry a hefty and unnecessary expense.
  2. Uncertain Risk Management. No matter how many stress tests you perform, it is impossible to manage the risk of a loan portfolio built on inaccurate data. In addition, entry errors can lead to incorrectly filed security instruments, leaving a portfolio exposed to the risk of insufficient collateral.
  3. Inaccurate Reporting. Data entry errors create unreliable loan reports, leading to missed maturities, overlooked stale-dates, canceled insurance and other potentially costly oversights.
  4. Mismanaged Compliance. Data entry errors are a major compliance risk. Whether due to inaccurately entered loan amounts, file exceptions, insurance lapses or inaccurate reporting, the penalties can be extremely costly—not only in terms of dollars but also with respect to an institution’s reputation.

Reduce Opportunities for Human Error
An institution’s risk management plan should include steps intended to mitigate the inevitable occurrence of human error. In addition to establishing systems of dual control and checks and balances, you should also implement modern technologies, tools, and procedures that eliminate redundancies within data entry processes. By doing so, you will be able to prevent mistakes from happening, rather than relying solely on a system of double-checking.
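As an illustration of what automated checks that reduce reliance on double-checking can look like, here is a minimal Python sketch that flags common loan-record entry errors before they propagate. The field names, the SSN format check and the amount thresholds are hypothetical, not a prescribed control set; a real system would validate against the approved loan documents.

```python
import re

# A minimal sketch of automated data entry checks for a loan record.
# Field names and thresholds are hypothetical.

def validate_loan_record(record: dict) -> list[str]:
    issues = []
    if not re.fullmatch(r"\d{3}-\d{2}-\d{4}", record.get("ssn", "")):
        issues.append("SSN is missing or not in NNN-NN-NNNN format")
    approved = record.get("approved_amount")
    entered = record.get("entered_amount")
    if approved and entered and entered != approved:
        issues.append(f"Entered amount {entered:,} differs from approved {approved:,}")
    if entered and (entered < 1_000 or entered > 10_000_000):
        issues.append("Loan amount outside expected range; possible extra or missing zero")
    return issues

print(validate_loan_record({
    "ssn": "123-45-6789",
    "approved_amount": 10_000,
    "entered_amount": 100_000,   # a single misplaced zero
}))
```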

Focus On This One Area To Position Your Bank For Success


Whether it’s compliance with forthcoming regulations or simply giving your customers an enjoyable experience, banks are realizing that one thing is central to achieving successful results across their operations.

Data management and governance have become central for banks positioning themselves for the future in a digital-first world, especially as new credit reporting requirements, like the current expected credit loss (CECL) provisions, are put into effect.

Banks that embrace and establish a robust data governance process will be better positioned to accomplish their strategic initiatives, whether in customer acquisition, relationship management or efficient compliance with the new accounting standards.


How Banks Can Make Use of Data-Driven Customer Insight
Most are familiar with the algorithms and machine learning employed by big tech companies like Google, Netflix and Amazon. Banks are beginning to employ similar strategies as the competition for new customers and new deposits remains high.

From CRM to CECL: Why Improved Data Governance Is Imperative for Your Bank
Banks know they have mountains of data about their customers that can help deliver attractive experiences on a variety of platforms. But data governance is not only about controlling large volumes of data, it’s about creating trust in the quality of data.

Poor data governance practices can lead to poor decision making by bank management, which is a risk no institution can afford.

No matter what lies ahead for your bank, how your institution manages and utilizes data will be an essential piece of its strategic initiatives and goals.

Now Is The Time to Use Data The Right Way


Most bankers are aware of the forthcoming changes in accounting standards and financial reporting for institutions of all sizes, but few are fully prepared for the complete implementation of the new current expected credit loss (CECL) model that will take effect over the next few years.

Banks that act now to effectively and strategically collect, manage and utilize data for the benefit of the institution will be better positioned to handle the new accounting requirements under CECL and evolving regulations with state and federal agencies.

Here are three articles that cover key areas where your board should focus its attention before the rules take effect.


Credit Data Management
Under Dodd-Frank, the law passed in the wake of the financial crisis, banks of all sizes, and especially those in the midsize range of $10 billion to $50 billion in assets, were required to do additional reporting and stress testing. Those requirements have recently been changed, but many institutions in that asset category are opting to continue some form of stress testing as a measure of sound governance. Managing credit data is a key component of those processes.

Centralizing Your Data
Bank operations are often siloed as a matter of habit, but data management can be handled in a much more centralized manner. Doing so can benefit your institution and ease its compliance with regulations.

Get Ready for CECL Now
The upcoming implementation of the new CECL standard has many banks scrambling to determine how those calculations will be developed and reported. Few are fully ready, but it is clear that current and historical loan-level data attributes will be integral to those calculations.

Five Issues Bank Boards Should Consider Now


There are a number of significant issues and emerging trends affecting U.S. companies and the economy that are crucial to the health and vibrancy of the financial institutions sector. In this environment, it is imperative that bank executives and board members think about five key issues when evaluating their strategic plans. These are areas in which changes have already taken place or are on the verge of being implemented.

1. Regulatory changes
Financial institutions face a number of regulatory and accounting changes. First, a new rule under the Home Mortgage Disclosure Act requires 48 new or modified data points, which will allow regulators to determine if unfair lending practices are occurring. The new rules promise to be expensive to implement.

Institutions growing via acquisition should consider how adoption of the CECL model will affect accounting for loans, securities and other affected instruments at the target institution. And the new standard for revenue recognition issued jointly by the Financial Accounting Standards Board (FASB) and the International Accounting Standards Board (IASB) could prove more challenging than many institutions realize.

2. Tax reform
Banks have expanded to new markets to generate loan growth in response to a challenging rate environment. However, many have not considered the state and local tax liabilities for loans originated outside of their physical footprint. To date, certain states have not rigorously enforced compliance, but banks could be on the radar as cash-strapped state and local governments face revenue shortfalls.

If Congress passes a tax reform bill that lowers the corporate tax rate, institutions in a net deferred tax asset position will need to recognize the adverse effect that the new rate would have on the value of those assets. The impairment of the deferred tax asset’s value would need to be recognized upon the effective date of the tax change, resulting in a corresponding decrease in regulatory capital. Institutions may consider increasing their deferred tax liabilities as part of their year-end tax planning process to minimize the impact of any rate decrease on the value of the deferred asset. Further, the expected impact should be factored into capital planning. Institutions that have made a subchapter S election should also monitor the outcome of individual tax reform in combination with corporate tax reform to determine whether the subchapter S election remains the most tax-efficient model for operations.
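A simple, hedged calculation shows why the remeasurement matters. The balance and the rates below are purely illustrative; an institution’s actual charge depends on its own temporary differences and the rate Congress ultimately enacts.

```python
# Illustrative sketch of how a corporate rate cut shrinks a net deferred tax
# asset (DTA). Balances and rates are hypothetical.

deductible_temporary_differences = 10_000_000  # hypothetical balance
old_rate, new_rate = 0.35, 0.21                # illustrative rates only

dta_old = deductible_temporary_differences * old_rate
dta_new = deductible_temporary_differences * new_rate
impairment = dta_old - dta_new

print(f"DTA at {old_rate:.0%}: ${dta_old:,.0f}")
print(f"DTA at {new_rate:.0%}: ${dta_new:,.0f}")
print(f"Remeasurement charge to earnings and capital: ${impairment:,.0f}")
```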

3. Data management
Financial technology firms capture the same consumer data that community banks do, but typically are more effective in their ability to leverage and capitalize on it. Large banks are harnessing this data as well, but smaller banks may not have the resources to make the effort profitable. The data may be residing in multiple systems, or it may be inaccurate or outdated. But the ability to efficiently capture and leverage data will fundamentally change how banks go to market and drive profitability. Banks that want to capitalize on the consumer data they already have will need to make a strategic decision to imitate fintech companies or partner with them.

In addition, regulators are increasing reporting requirements for institutions as a way to conduct efficient, ongoing monitoring, which is fundamentally changing the way regulatory exams are conducted. Institutions that can leverage data for risk management purposes will see better regulatory outcomes and improved profitability.

Cybersecurity remains an immediate threat, and regulatory scrutiny in this area has ramped up accordingly. Banks need to have an effective data security plan in place to deal with the threats that come with gathering so much information.

4. Labor and workforce
A lack of qualified workers may constrain growth, service and innovation if not addressed from a strategic perspective. It has become particularly difficult to find qualified and well-trained credit analysts and compliance officers in a tight labor market. Given most community and regional banks’ focus on commercial real estate lending, the projected shortage of qualified appraisers presents a significant challenge. Banks in smaller markets are particularly challenged in attracting and retaining talent, but they can partially or completely outsource many responsibilities, such as those in information technology, asset liability management and regulatory compliance. A thoughtful approach to outsourcing can assist in maintaining core activities while allowing internal resources to focus on strategic activities.

5. Changing consumer preferences
As customers move to digital channels, acquiring new customers—and increasing wallet share of existing customers—should be top areas of strategic focus throughout 2018. Boomers are shifting from accumulating wealth to maintaining and preserving it, and as a group they prefer a mix of in-person and digital interactions. In an effort to retain these customers, many banks are modernizing branches into sleek, welcoming locations with a number of amenities and educating this demographic on the ability to leverage digital channels in a secure manner, while also responding to the digital demands of younger consumers.

Banks need to find the balance between the digital and the personal, and be agile enough to respond to these changing preferences.

Four Steps for Building an Effective Risk Appetite Framework


Risk appetite is a key component of a bank’s risk management framework. Effective risk management is fundamental in ensuring there is an appropriate balance between risk and reward.

Good risk management does not involve avoiding risk at all costs. Instead, it allows taking on more risk as long as the bank is making informed choices and has measures in place to mitigate risks. Having a strong risk appetite statement and well-established policies and procedures is important, but equally important is the effective implementation of this framework.

Based on discussions with a number of credit risk executives at small and large banks, we have identified four steps for implementing an effective credit risk framework.

1. Ensure data quality and integrity.
Clean, standardized data is essential to making fair, timely and accurate credit decisions. The bank also needs to see its complete exposure to ensure it’s not over-exposed at the time of origination.

Regulators are increasingly demanding that a solid risk governance framework include policies and processes that provide risk data aggregation and reporting capabilities. To accomplish this, banks should have the IT infrastructure to store data and support risk aggregation and reporting, so that material risks, concentrations and emerging risks are captured in a timely manner.

Technology can significantly improve data quality and aggregation. Current systems offer a single source of truth, gathering all the risk data in one place that is easy to view and access, so there is no need to check multiple systems, track exposure in spreadsheets or add up numbers by hand. These systems can also aggregate exposures across products, industries, regions and so forth.
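As a sketch of what that aggregation looks like, the snippet below rolls up exposures pulled from several source systems into one view by industry and region. The system names, fields and figures are hypothetical; a production warehouse would also reconcile and deduplicate borrowers before aggregating.

```python
from collections import defaultdict

# Aggregate credit exposure from multiple hypothetical source systems
# by industry and region.

exposures = [
    {"system": "commercial_los", "borrower": "Acme Mfg", "industry": "manufacturing",
     "region": "midwest", "exposure": 2_500_000},
    {"system": "card_platform",  "borrower": "Acme Mfg", "industry": "manufacturing",
     "region": "midwest", "exposure": 150_000},
    {"system": "mortgage_core",  "borrower": "J. Smith", "industry": "consumer",
     "region": "southeast", "exposure": 320_000},
]

totals = defaultdict(float)
for e in exposures:
    totals[(e["industry"], e["region"])] += e["exposure"]

for (industry, region), amount in sorted(totals.items()):
    print(f"{industry:>13} / {region:<9}: ${amount:,.0f}")
```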

2. Set appropriate limits.
At most banks, the limit-setting process falls to the risk management team. But setting limits is as much an art as a science in many institutions.

One way to ensure appropriate limits is to align compensation with risk culture and to take an approach to limit setting that is well articulated, tied to business objectives and clear about the consequences of breaching limits. In addition, banks can leverage the funding and resources already allocated to regulatory stress testing to help set risk appetite limits.

We have worked with clients to define their risk appetite limits through a well-defined analytical and quantitative approach. Ultimately, this approach can help risk management set appropriate limits, adjust limits as the market environment changes, obtain business buy-in, and improve the bank’s overall risk culture.

3. Implement and enforce limits.
An effective risk appetite framework can be thwarted by integration challenges between risk, business and other functional areas at banks. Lack of cultural alignment and faulty processes often prevent the framework from being adopted by the business units at the point of origination, rendering it ineffective.

At many banks, the process is still manual: checking reports and spreadsheets to ensure compliance where automation could save time and increase accuracy. Solutions now exist that let bank officers see, at the point of origination, whether a potential deal is going to breach risk appetite limits. At that point, before moving forward, the red flag is raised and originators can decide to continue the approval process and seek an exception, escalate the deal to management, or even decline it.
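Here is a minimal sketch of that point-of-origination check, with hypothetical segments, limits and exposures; a production implementation would pull live exposure data and route exceptions through an approval workflow.

```python
# Check a proposed deal against current exposure and a hypothetical
# risk appetite (concentration) limit before approval.

limits = {"cre": 150_000_000, "c_and_i": 100_000_000}          # appetite limits
current_exposure = {"cre": 142_000_000, "c_and_i": 61_000_000}  # booked exposure

def check_deal(segment: str, amount: float) -> str:
    projected = current_exposure[segment] + amount
    if projected > limits[segment]:
        return (f"BREACH: {segment} would reach ${projected:,.0f} vs limit "
                f"${limits[segment]:,.0f}; seek exception, escalate, or decline")
    headroom = limits[segment] - projected
    return f"OK: {segment} headroom after deal ${headroom:,.0f}"

print(check_deal("cre", 12_000_000))
print(check_deal("c_and_i", 5_000_000))
```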

4. Monitor limits and manage breaches.
Identifying limit breaches and near-breaches in a timely manner is critical to a dynamic risk appetite monitoring process. Limits should be reviewed and updated frequently, as changes in market conditions, risk tolerance, strategy, or other factors arise. Having ready access to customer and portfolio data, and where various exposures stand against limits, is essential to make timely decisions.

Breaches must be identified as they occur, automatic alerts must be sent to the right decision-makers at the bank, and the breach and its resolution must be well documented so they can be audited in the future. Manual calculations and spreadsheets cannot guarantee this; only a strong IT infrastructure with limit-management capabilities can achieve this desired state.

Conclusion: The Way Forward
Technology can deliver significant value to the overall risk appetite process. Automated systems provide efficiency gains, better data quality and enhanced analytics. And these factors, in turn, drive the ability to measure, monitor and adjust risk taken against established risk appetite.

For more on this topic, see our white paper.

Does Your Bank Have the Stress Testing Data You Need?


The next several years will increase the need for better data management at banks. Banks that have been through the Dodd-Frank Act’s required stress tests (DFAST) have already encountered that need. With the Basel III international accord phasing in and the new current expected credit loss (CECL) impairment standard eventually taking effect, all U.S. financial institutions face ever more demanding regulatory requirements that drive the need for enhanced data and analytics capabilities.

Credit data is becoming increasingly integral to stress tests, as well as capital planning and management and credit loss forecasts. To meet regulatory expectations in these areas, though, some banks need to improve the quality of their data and the control they have over it. Effective data management can bring valuable support and efficiencies to a range of compliance activities.

Expanding Data Requirements
DFAST, which is required of banks above $10 billion in assets, is highly dependent on data quality. The DFAST process—including scenarios, analytics, and reporting—requires banks to maintain a vast array of reliable and detailed portfolio data, including data related to assets and liabilities; to customers, creditors and counterparties; to collateral; and to customer defaults.

Under Basel III, banks will need to gather even more data. The requirements call for consistent data sourcing and reconciliation, liquidity management and the capture of data for historical purposes, among other things.

The Financial Accounting Standards Board’s new CECL model for GAAP reporting applies to all banks and will bring implications for data management. Banks and financial services companies will need borrower and economic data, exposure level data, historical balances, risk ratings and data on charge-offs and recoveries. Failure to capture quality data in these and other areas could result in tougher examinations, reliance on peer or industry data, questions about safety and soundness and drops in capital and profits.
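To illustrate the kind of loan-level record these requirements imply, here is a brief sketch of the attributes involved. The field names are illustrative only, not a prescribed CECL schema.

```python
from dataclasses import dataclass, field
from datetime import date

# A sketch of a loan-level record with the sorts of attributes CECL-style
# loss estimation tends to require. Field names are hypothetical.

@dataclass
class LoanRecord:
    loan_id: str
    borrower_id: str
    origination_date: date
    maturity_date: date
    current_balance: float
    risk_rating: str
    collateral_value: float
    historical_balances: dict[date, float] = field(default_factory=dict)
    charge_offs: float = 0.0
    recoveries: float = 0.0

loan = LoanRecord("L-001", "B-042", date(2015, 6, 1), date(2025, 6, 1),
                  480_000.0, "pass", 650_000.0,
                  {date(2016, 12, 31): 560_000.0, date(2017, 12, 31): 520_000.0})
print(loan.loan_id, loan.current_balance, loan.risk_rating)
```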

Data Management Challenges
Small banks generally have a handful of credit data sources, while large banks can have 15 or more—and the number of sources is expected to grow in coming years as new products are released. In addition, the data often is stored in different formats and might not be subject to any governance or control. It’s no wonder that banks can find it difficult to get a handle on their data, let alone produce a “single source of truth” that can withstand examiner scrutiny.

One solution to this dilemma is a credit data warehouse. A data warehouse can provide a vehicle for controlling and governing an immense amount of data. It allows a bank to easily show an examiner the data that was used for its models, the data sources and how the data reconciles with the bank’s financial statements.
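A minimal sketch of one such control, reconciling warehouse loan balances to the general ledger by portfolio, is shown below. The portfolio names, figures and tolerance are hypothetical; the point is the routine check that supports a defensible single source of truth.

```python
# Reconcile warehouse loan balances to general ledger totals by portfolio.
# Figures, portfolio names and the tolerance are hypothetical.

warehouse_totals = {"commercial": 412_350_000, "consumer": 98_200_000}
general_ledger  = {"commercial": 412_350_000, "consumer": 98_175_000}
tolerance = 10_000  # acceptable rounding/timing difference

for portfolio, wh_balance in warehouse_totals.items():
    gl_balance = general_ledger[portfolio]
    diff = wh_balance - gl_balance
    status = "reconciled" if abs(diff) <= tolerance else "INVESTIGATE"
    print(f"{portfolio:>10}: warehouse ${wh_balance:,} vs GL ${gl_balance:,} "
          f"(diff ${diff:,}) -> {status}")
```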

Banks might encounter some obstacles on their way to effective warehousing, though, including the sheer volume of data to be stored. Quality assurance is another common issue. For example, information might be missing or not in a standardized format. Data availability also can pose problems. A bank might have the required information but not in an accessible format.

Best Practices
Overcoming these problems comes down to data governance—how a bank manages its data over time to establish and maintain the data’s trustworthiness. Data management without active governance that is targeted toward achieving a single source of truth isn’t sustainable.

In the case of DFAST, it’s important to resist the temptation to take a short-term perspective that considers only the data required for stress testing. Banks that take a more global view, bearing in mind that the data is used throughout the organization, will fare much better. Such banks build a framework that can handle the new data requirements (including those related to historical data) that will surely continue to come in the future.

Banks also should remember that data management is not a one-off task. A bank might have clean data today, but that data will degrade over time if not managed on an ongoing basis.

Finally, banks should not overlook the human factor. Success isn’t brought about by a database but by the people who are stewards for the data and the processes put in place to audit, balance, and control the data. The people and processes will, of course, be enabled with technology, but the people and processes will make or break a data management program.

Time to Take Control
Effective data management is an essential component of any stress testing endeavor, but data management also has implications that extend to CECL and Basel III compliance and likely will aid banks in coping with many forthcoming regulations and requirements. Banks that don’t yet have control of their data should take steps now to establish the governance, framework and people and processes necessary to ensure the completeness, accuracy, availability and auditability of a single source of truth for both the short- and long-term.

When It Comes To Bank Big Data, Start Small


Are you using the data you have to understand and target your marketplace and each customer’s needs? The truth is that most banks today generate more data than they are capable of exploiting.

But does the instant availability of data, combined with less expensive and faster computing capability, make big data a competitive silver bullet or is it just the next shiny object that will distract us from the real business at hand? Bottom line… is more data actually better?

Saddled with legacy, siloed technology platforms, lacking analytical expertise and structured only to support traditional approaches to data usage, many financial institutions are finding they’re woefully unprepared for the challenges of working with big data (usually defined as structured and unstructured data from both inside and outside the organization).

A better approach for most banks is to start small, using a building-block approach to data management. This would address the most immediate hurdles facing banks today, including: 1) improving the integrity of current data, 2) integrating multiple data silos, 3) leveraging real-time data, 4) improving accessibility of data, and 5) better analyzing data sets.

Improving Data Integrity
Before we expand our data inputs, we must make sure our existing database is complete and accurate. While names and addresses may be up to date, the same can’t usually be said for phone numbers, email addresses and preferred communication channels. In addition, important information such as mobile phone numbers and services held at other institutions is usually not collected.

To move forward in the world of big data, we should first build a plan to update and backfill outdated and incomplete data files. This process starts on the front line, in our call centers and through customer surveys.
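As a simple illustration of how that backfill plan might be operationalized, the sketch below scans customer records for missing contact fields and produces a worklist the front line and call center can act on. The field names and records are hypothetical.

```python
# Build a backfill worklist of customers with missing contact fields.
# Field names and records are hypothetical.

required_fields = ["mobile_phone", "email", "preferred_channel"]

customers = [
    {"id": "C1", "email": "a@example.com", "mobile_phone": None, "preferred_channel": None},
    {"id": "C2", "email": None, "mobile_phone": "555-0100", "preferred_channel": "email"},
]

worklist = {
    c["id"]: [f for f in required_fields if not c.get(f)]
    for c in customers
    if any(not c.get(f) for f in required_fields)
}
print(worklist)  # {'C1': ['mobile_phone', 'preferred_channel'], 'C2': ['email']}
```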

Integrating Data Silos
Most banks still have individual data silos for the retail consumer, small businesses, commercial accounts, the mortgage portfolio and possibly other credit services such as credit cards. Without an integrated platform, a fully functioning 360-degree view of our customers is impossible.

A common scenario occurs when banks don’t recognize small business or commercial relationships of retail customers. Breaking down silos between product lines and integrating the data should be done before any overarching big data initiative is considered.
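The sketch below illustrates the idea: join siloed records on a shared customer key to reveal relationships that any single line of business would miss. The system names, key and balances are hypothetical.

```python
# Join hypothetical retail, small business and mortgage silos on a shared
# customer key to form a simple 360-degree view.

retail = {"TIN-111": {"name": "Jane Doe", "checking_balance": 8_500}}
small_business = {"TIN-111": {"business": "Doe Consulting LLC", "loc_outstanding": 45_000}}
mortgage = {"TIN-111": {"mortgage_balance": 212_000}}

def customer_360(tin: str) -> dict:
    view = {"tin": tin}
    for source in (retail, small_business, mortgage):
        view.update(source.get(tin, {}))
    return view

print(customer_360("TIN-111"))
# Reveals that a "retail" customer also owns a business line of credit and a mortgage.
```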

Leveraging Real-Time Data
As more customers are using online and mobile channels, there is a need to leverage real-time data from both the bank and customer perspective. Yesterday’s data, while important for trend analysis, is much less valuable for risk analysis and marketing optimization.

Today’s customers expect all transactions to be reflected immediately as they use their cards, transfer funds online and, increasingly, transact on their mobile devices. Other industries have also made them accustomed to relevant offers, communicated through the right channel at the optimal time. To accomplish this in banking, we need to collect (and act on) real-time data.

Expanding Data Accessibility
Integrating accurate and complete real-time data is powerful only if it can be easily accessed and effectively analyzed across the organization. This will require a new operating model and approach to data management.

Since many banks are already dealing with data overload, the odds are not in our favor that more data will automatically improve results. But until all areas see the same view of the customer and can make business decisions based on the insight available, the potential of big data will be lost.

Analyzing Data
Remember: more data doesn’t fix bad analysis. Progressive banks will engage with customers in ways that were unforeseen only a few years ago, and retail banking will operate at a faster pace, as described in a recent blog post by Scott Bales of Movenbank entitled “Finding Serendipity in Big Data.”

Competitive advantage is achievable through the better analysis and use of customer data and big data definitely deserves to be part of our planning and strategy process. But banks should start small as opposed to boiling an ocean. Some starting steps include:

  • Use account-level and transaction data to build life-stage trigger communication programs (new movers, retention, onboarding); a minimal sketch follows this list
  • Leverage transaction data to improve risk and fraud monitoring
  • Review channel and transaction data to determine optimal branch reconfiguration (size, structure, support)
  • Use funds movement data to determine price elasticity of products and customer segments
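For the first item above, here is a minimal sketch of a transaction-driven “new mover” trigger built from an address change plus moving-related card spend. The merchant categories, time window and records are hypothetical assumptions.

```python
from datetime import date, timedelta

# Flag likely "new mover" customers from an address change plus
# moving-related card spend. Categories, window and records are hypothetical.

MOVING_CATEGORIES = {"moving_company", "home_improvement", "furniture"}

def new_mover_trigger(customer: dict, today: date = date(2013, 2, 11)) -> bool:
    recent_address_change = (today - customer["address_changed"]) <= timedelta(days=60)
    moving_spend = any(t["category"] in MOVING_CATEGORIES for t in customer["transactions"])
    return recent_address_change and moving_spend

customer = {
    "id": "C-77",
    "address_changed": date(2013, 1, 20),
    "transactions": [{"category": "moving_company", "amount": 900}],
}
if new_mover_trigger(customer):
    print("Queue new-mover welcome offer for", customer["id"])
```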

Now is the time to improve the accuracy of data already stored, build real-time capabilities, break down existing data silos and improve the accessibility and analysis of data to ensure that the concept of big data doesn’t slide into the trough of disillusionment and lost opportunities.