Digitizing Documentation: The Missed Opportunity in Banking

To keep up in an increasingly competitive world, banks have embraced the need for digital transformation, upgrading their technology stacks to automate processes and harness data to help them grow and find operational efficiencies.

However, while today’s community and regional banks are increasingly making the move to digital, their documentation and contracting are still often overlooked in this transformation and left behind. This “forgotten transformation” means their documentation remains analog, and so do the processes built on it, increasing costs, turnaround time, data errors and risk.

What’s more, documentation is the key that drives the back-office operations for all banks. Everything from relationship management to maintenance updates and new business proposals relies on documents. This is especially true for onboarding new clients.

The Challenges of Onboarding
Onboarding has been a major focus of digital transformation efforts for many banks. While account opening has become more accessible, it also arguably requires more customer effort than ever. These pain points are often tied back to documentation: requesting multiple forms of ID or the plethora of financial details needed for background verification and compliance. This creates friction at the first, and most important, interaction with a new customer.

While evolving regulatory requirements in areas such as Know-Your-Customer rules, the Bank Secrecy Act and anti-money laundering compliance have helped lower banks’ risk, they often come at the expense of the customer experience. Slow and burdensome processes can frustrate customers who are accustomed to smoother experiences in other aspects of their digital lives.

The truth is that a customer’s perception of the effort required to work with a bank is a big predictor of loyalty. Ensuring customers have a quick, seamless onboarding experience is critical to building a strong relationship from the start, and better documentation plays a key role in better onboarding.

An additional challenge for many banks is that employees see onboarding and its associated documentation as a time-consuming and complicated process from an operations perspective. It can take days or even weeks to onboard a new retail customer, and for business accounts it can be much worse: a Deloitte report suggests it can take some banks up to 16 weeks to onboard a new commercial customer. Most often, the main problems stem from manual back-end documentation processes, still largely made up of emails, Word documents and repositories that sit in unrelated silos across an organization, collecting numerous, often redundant, pieces of data.

While all data can be important, better onboarding requires more collaboration and transparency between banks and their customers. This means banks should be more thoughtful in their approach to onboarding, ensuring they are using data from their core to the fullest to reduce redundant and manual processes and to make the overall process more streamlined. The goal is to maximize the speed for the customer while minimizing the risk for the bank.

Better Banking Through Better Documentation
Many banks do not see documentation as a data issue. However, by taking a data-driven approach, one that uses data from the core and feeds back into it, banks can transform documents into data and, in turn, into an opportunity. Onboarding documents become a key component of the bank’s overall, end-to-end digital chain. This can have major impacts on banks’ operational efficiency as well as their bottom lines. In addition to faster onboarding that helps build stronger customer relationships, a better documentation process means better-structured data, which can offer significant competitive advantages in a crowded market.

When it comes to documentation capabilities, flexibility is key. This can be especially true for commercial customers. An adaptable solution can feel less “off the shelf” and provide the flexibility to meet individual client needs, while giving a great customer experience and maintaining regulatory guidelines. This can also provide community bankers with the ability to focus on what they do best, building relationships and providing value to their customers, rather than manually gathering and building documents.

While digitizing the documents is critical, it is in many ways the first step to a better overall process. Banks must also be able to effectively leverage this digitized data, getting it to the core, and having it work with other data sources.

Digital transformation has become an imperative for most community banks, but documentation continues to be overlooked entirely in these projects. Even discounting the operational impacts, documents ultimately represent the two most important “Rs” for banks – relationships and revenue, which are inextricably tied. By changing how they approach and treat client documentation, banks can be much more effective in not only the customer onboarding process, but also in responding to those customer needs moving forward, strengthening those relationships and driving revenue now and in the future.

Does Your Bank Struggle With Analysis Paralysis?

The challenge facing most community financial institutions is not a lack of data.

Institutions send millions of data points through extensive networks and applications to process, transmit and maintain daily operations. But simply having an abundance of data available does not automatically yield actionable, valuable insights. Often, this inundation of data is the first obstacle that hinders, rather than helps, bankers in making smarter decisions and better choices, leading to analysis paralysis.

What is analysis paralysis? Analysis paralysis is the inability of a firm to effectively monetize data or information in a meaningful way that results in action.

The true value is not in having an abundance of data, but the ability to easily turn this cache into actionable insights that drive an institution’s ability to serve its community, streamline operations and ultimately compete with larger institutions and non-bank competitors.

The first step in combatting analysis paralysis is maintaining a single source of truth under a centralized data strategy. Far too often, different departments within the same bank produce conflicting reports with conflicting results — despite relying on the “same” input and data sources. This is a problem for several reasons; most significantly, it limits a banker’s ability to make critical decisions. Establishing a common data repository and defining the data structure and flow with an agreed-upon lexicon is critical to positioning the bank for future success.
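The “agreed-upon lexicon” idea can be pictured in a few lines of code. In this minimal sketch, every department’s field aliases resolve to one canonical schema, so reports are built from the same definitions; the alias names below are hypothetical examples, not a real bank’s systems.

```python
# Sketch of an agreed-upon lexicon: each department's field aliases
# resolve to one canonical name, so every report draws on the same
# definitions. The aliases below are hypothetical examples.

LEXICON = {
    "cust_no": "customer_id",      # lending system's name for the customer
    "client_id": "customer_id",    # deposit system's name for the customer
    "acct_bal": "balance",
    "current_balance": "balance",
}

def normalize(record):
    """Rename any known alias to its canonical field name."""
    return {LEXICON.get(field, field): value for field, value in record.items()}
```

Two departments can then feed `normalize` their differently labeled extracts and land on identical field names before any report is produced.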

The second step is to increase the trust, reliability, and availability of your data. We are all familiar with the saying “Garbage in, garbage out.” This applies to data. Data that is not normalized and not agreed upon across the organization will create issues. If your institution is not scrubbing collected data to make sure it is complete, accurate and, most importantly, useful, it is wasting valuable company resources.
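As an illustration of what “scrubbing” might look like in practice, the sketch below flags incomplete, duplicate and non-conforming records before they enter a repository. The field names and rules are invented for the example, not a real bank schema.

```python
# Minimal data-scrubbing sketch. The fields and rules below are
# illustrative assumptions, not a real bank schema.

REQUIRED_FIELDS = {"account_id", "balance", "open_date"}

def scrub(records):
    """Split raw records into clean rows and rejects, with a reason per reject."""
    clean, rejects = [], []
    seen_ids = set()
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            rejects.append((rec, "missing fields: " + ", ".join(sorted(missing))))
        elif rec["account_id"] in seen_ids:
            rejects.append((rec, "duplicate account_id"))
        elif not isinstance(rec["balance"], (int, float)):
            rejects.append((rec, "non-numeric balance"))
        else:
            seen_ids.add(rec["account_id"])
            clean.append(rec)
    return clean, rejects

rows = [
    {"account_id": "A1", "balance": 500.0, "open_date": "2023-01-05"},
    {"account_id": "A1", "balance": 750.0, "open_date": "2023-02-10"},  # duplicate id
    {"account_id": "A2", "balance": "n/a", "open_date": "2023-03-01"},  # bad type
]
clean, rejects = scrub(rows)
```

Recording a reason alongside each reject matters: it turns “bad data” from a vague complaint into a queue of fixable items.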

Generally, bad data is considered data that is inaccurate, incomplete, non-conforming, duplicative or the result of poor data input. But this isn’t the complete picture. For example, data that is aggregated or siloed in a way that makes it inaccessible or unusable is also bad data. Likewise, data that fails to garner any meaning or insight into business practices, or is not available in a timely manner, is bad data.

Increasing access to and the availability of data will help banks unlock its benefits. Hidden data is the same as having no data at all.

The last step is to align the bank’s data strategy with its business strategy. Data strategy corresponds with how bank executives will measure and monitor the success of the institution. Good data strategy, paired with business strategy, translates into strong decision-making. Executives who understand which data to collect, and who anticipate future needs to access and aggregate that data in a meaningful way, position their institutions for enduring success in this “big data” era. For example, the success of an initiative that takes advantage of artificial intelligence (AI) and predictive capabilities is contingent upon aligning a bank’s data strategy with its business strategy.

When an organization has access to critical consumer information or insights into market tendencies, it is equipped to make decisions that increase revenue, market share and operational efficiencies. Meaningful data that is presented in a timely and easy-to-digest manner and aligns with the company’s strategy and measurables allows executives to react quickly to changes affecting the organization — rather than waiting until the end of the quarter or the next strategic planning meeting before taking action.

At the end of the day, every institution’s data can tell a unique story. Do you know what story your data tells about the bank? What does the data say about the future? Banks that are paralyzed by data lose the ability to guide their story, becoming much more reactive than proactive. Ultimately, they may miss out on opportunities that propel the bank forward and position it for future success. Eliminating the paralysis from the analysis ensures data is driving the strategy and enables banks to guide their story in a positive direction.

Taking Model Risk Management to the Next Level

A financial institution’s data is one of its most valuable resources. Banks constantly collect data on their loans, deposits and customer behaviors. This data should play a key role in how financial institutions manage their risks.

Yet, developing a data strategy can be seen as too complex based on the sheer amount of data an institution may have, or as an unnecessary burden if the objective is solely to use the information to satisfy regulatory requirements. But a holistic data strategy can enhance value across all model risk management (MRM) platforms, both for regulatory and strategic purposes. On the flip side, being inconsistent or not updating data and inputs in a timely manner can lead to inaccurate or inconsistent results. Executives need to continually update and review information for consistency; if not, the information’s relevancy in assessing risk across various platforms will decrease.

Currently, the most common data strategy approach for banks is using individual tools to measure risk for regulatory purposes. For instance, financial institutions are required to calculate and monitor interest rate risk related to their balance sheet and potential movements in future interest rates. Typically, one team within the institution extracts data and transfers it to another team, which loads the data into an internal or external model to calculate the various interest rate profiles for management to analyze and make decisions. The institution repeats this process for its other models (credit, capital adequacy, liquidity, budgeting, etc.), adjusting the inputs and tools as needed. Often, banks view these models as individual silos — the teams responsible for them, and the inputs and processes, are separate from one another. However, the various models used to measure risk share many commonalities and, in many aspects, are interdependent.

Integrating model risk management processes requires understanding a bank’s current data sources and aggregation processes across all of its current models. The first step for executives is to understand what data is currently used across these platforms, and how the organization can use it beyond just checking the regulatory box. To enhance data quality, can one data extract be used for multiple platforms? For example, can the same loan-level data file be used for different models that use similar inputs, such as asset liability management (ALM) and certain CECL models? While models may utilize some different or additional fields and inputs, there are many fields — such as contractual data or loan prepayment assumptions — that are consistent across models. Extracting the data once and using it for multiple platforms allows institutions to minimize the risk of inaccurate or faulty data.
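The single-extract idea can be sketched in a few lines: one loan-level extract feeds two separate model calculations, so both are guaranteed to see identical balances and rates. The field names and formulas below are simplified placeholders for illustration, not real ALM or CECL methodology.

```python
# Toy illustration of one loan-level extract feeding two models.
# Field names, figures and formulas are simplified placeholders,
# not real ALM or CECL methodology.

loan_extract = [
    {"loan_id": "L1", "balance": 100_000, "rate": 0.05, "term_months": 360},
    {"loan_id": "L2", "balance": 50_000, "rate": 0.04, "term_months": 60},
]

def alm_annual_interest(loans):
    """Toy ALM metric: expected annual interest income from the extract."""
    return sum(loan["balance"] * loan["rate"] for loan in loans)

def cecl_expected_loss(loans, loss_rate=0.01):
    """Toy CECL metric: a flat expected-loss rate applied to the same extract."""
    return sum(loan["balance"] * loss_rate for loan in loans)
```

Because both functions read the same `loan_extract`, a correction to a balance or rate propagates to every model at once instead of having to be re-keyed into separate silos.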

From here, bank executives can develop a centralized assumption set that can be modeled across all platforms to ensure consistency and align results between models. For instance, are the credit assumptions that are developed for CECL purposes consistent with those used to calculate your ALM and liquidity profile under various scenarios? Are prepayment assumptions generated within the ALM model also incorporated into your CECL estimate? Synchronizing assumptions can provide more accurate and realistic results across all platforms. The MRM dashboard is a tool that can be configured to alert bank executives of emerging risks and ensure that data shared by different models is consistent.
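One way to picture a centralized assumption set: each model draws its inputs from a single shared dictionary, and a dashboard-style check confirms that fields shared between models agree. The names and values below are illustrative assumptions, not recommended figures.

```python
# Sketch of a centralized assumption set: each model draws inputs from
# one shared dictionary, and a dashboard-style check confirms that
# fields shared between models agree. Names and values are illustrative.

CENTRAL_ASSUMPTIONS = {
    "prepay_rate": 0.08,    # shared by the ALM and CECL models
    "rate_shock_bps": 200,  # ALM scenario input
    "loss_rate": 0.012,     # CECL input
}

def alm_inputs():
    return {"prepay_rate": CENTRAL_ASSUMPTIONS["prepay_rate"],
            "rate_shock_bps": CENTRAL_ASSUMPTIONS["rate_shock_bps"]}

def cecl_inputs():
    return {"prepay_rate": CENTRAL_ASSUMPTIONS["prepay_rate"],
            "loss_rate": CENTRAL_ASSUMPTIONS["loss_rate"]}

def assumptions_consistent():
    """Alert-style check: the prepayment assumption must match across models."""
    return alm_inputs()["prepay_rate"] == cecl_inputs()["prepay_rate"]
```

The check is trivial here because both models read the same dictionary; the point is that a dashboard can run exactly this kind of comparison and alert executives the moment a shared assumption drifts between platforms.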

One common method of gaining insights using MRM is through scenario and stress testing. Today’s environment is uncertain; executives should not make future decisions without in-depth analysis. They can develop scenarios for potential growth opportunities, modeling through the integrated platforms to calculate impacts to profitability and credit and interest rate risk. Similarly, they can expand deposit data and assumptions to assess high-risk scenarios or future liquidity issues apart from normal day-to-day operations. Whatever the strategy may be, assessing risk on an integrated basis allows management to gain a better understanding of all impacts of future strategies and make stronger business decisions.
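A scenario-testing loop can be sketched as applying hypothetical parallel rate shocks to a toy net-interest-income calculation. The 50% funding beta and all figures below are assumptions made for illustration, not a real stress-testing model.

```python
# Sketch of scenario/stress testing: apply hypothetical parallel rate
# shocks to a toy net-interest-income (NII) calculation. The 50% funding
# beta and all figures below are assumptions made for illustration.

def net_interest_income(balance, asset_rate, funding_rate):
    """Toy NII: balance times the spread between earning and funding rates."""
    return balance * (asset_rate - funding_rate)

def run_scenarios(balance, asset_rate, funding_rate, shocks_bps, funding_beta=0.5):
    """Return NII under each shock; funding costs reprice at funding_beta."""
    results = {}
    for shock in shocks_bps:
        delta = shock / 10_000  # basis points to decimal
        results[shock] = net_interest_income(
            balance, asset_rate + delta, funding_rate + delta * funding_beta)
    return results

nii = run_scenarios(1_000_000, 0.05, 0.02, [0, 100, -100])
```

Even a toy loop like this makes the integrated-risk point: changing one shared input (the funding beta, say) instantly reprices every scenario, rather than requiring separate teams to rerun separate silos.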

Once institutions begin centralizing their data and model inputs and streamlining their monitoring processes using MRM dashboards, management can shift their focus to value-added opportunities that go beyond compliance and support the strategic vision of the institution.