The $700 Billion Credit Question for Banks

It’s the $700 billion question: How bad could it get for banks?

That’s the maximum amount of losses that the Federal Reserve modeled in a special sensitivity analysis in June for the nation’s 34 largest banks over nine quarters as part of its annual stress testing exercise.

Proportional losses could be devastating for community banks, which also tend to lack the sophisticated stress testing models of their bigger peers and employ a more straightforward approach to risk management. Experts say that community banks should draw inspiration from the Fed’s analysis and broad stress-testing practices to address potential balance sheet risk, even if they don’t undergo a full stress analysis.

“It’s always good to understand your downsides,” says Steve Turner, managing director at Empyrean Solutions, a provider of asset and liability management tools for financial institutions. “Economic environments do two things: They tend to trend and then they tend to change abruptly. Most people are really good at predicting trends, very few are good at forecasting the abrupt changes. Stress testing provides you with insight into what could be the abrupt changes.”

For the most part, stress testing, an exercise that subjects existing and historical balance sheet data to a variety of adverse macroeconomic outlooks to create a range of potential outcomes, has been the domain of the largest banks. But considering worst-case scenarios and working backward to mitigate those outcomes — one of the main takeaways and advantages of stress testing — is “unequivocally” part of prudent risk and profitability management for banks, says Ed Young, senior director and capital planning strategist at Moody’s Analytics.

Capital & Liquidity
The results of the Fed’s sensitivity analysis underpinned the regulator’s decision to alter planned capital actions at large banks, capping dividend levels and ceasing most stock repurchase activity. Young says bank boards should look at the analysis and conclusion before revisiting their comfort levels with “how much capital you’re letting exit from your firm today” through planned distributions.

Share repurchases are relatively easy to turn on and off; pausing or cutting a dividend could have more significant consequences. Boards should also revisit the strategic plan and assess the capital intensity of certain planned projects. They may need to pause anticipated acquisitions, business line additions and branch expansions that could expend valuable capital. They also need to be realistic about the likelihood of raising new capital — what form and at what cost — should they need to bolster their ratios.

Boards need to frequently assess their liquidity position too, Young says. Exercises that demonstrate the bank can maintain adequate capital for 12 months mean little if sufficient liquidity runs out after six months.

Credit
When it comes to credit, community banks may want to start by comparing the distribution of the loan portfolios of the banks involved in the exercise to their own. These players are active lenders in many of the same areas that community banks are, with sizable commercial and industrial, commercial real estate and mortgage portfolios.

“You can essentially take those results and translate them, to a certain degree, into your bank’s size and risk profile,” says Frank Manahan, a managing director in KPMG’s financial services practice. “It’s not going to be highly mathematical or highly quantitative, but it is a data point to show you how severe these other institutions expect it to be for them. Then, on a pro-rated basis, you can extract information down to your size.”
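The translation Manahan describes is essentially back-of-the-envelope arithmetic. A minimal sketch, using entirely hypothetical loss rates and portfolio balances rather than actual figures from the Fed’s exercise:

```python
# Rough, illustrative translation of disclosed stress-test loss rates to a
# smaller bank's portfolio. Loss rates and balances below are hypothetical
# placeholders, not figures from the Fed's sensitivity analysis.

# Hypothetical nine-quarter cumulative loss rates by portfolio segment.
peer_loss_rates = {
    "commercial_and_industrial": 0.06,
    "commercial_real_estate": 0.08,
    "residential_mortgage": 0.03,
}

# Hypothetical community bank balances, in millions of dollars.
own_balances = {
    "commercial_and_industrial": 400.0,
    "commercial_real_estate": 650.0,
    "residential_mortgage": 250.0,
}

implied_losses = {
    segment: own_balances[segment] * rate
    for segment, rate in peer_loss_rates.items()
}

for segment, loss in implied_losses.items():
    print(f"{segment}: ~${loss:.0f}M implied stress loss")
print(f"Total implied nine-quarter stress loss: ~${sum(implied_losses.values()):.0f}M")
```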

Turner says many community banks could “reverse stress test” their loan portfolios to produce useful insights and potential ways to proceed as well as identify emerging weaknesses or risks.

They should try to calculate their loss-absorbing capacity if credit takes a nosedive, or use a tiered approach to imagine if something “bad, really bad and cataclysmic” happens in their market. Credit and loan teams can leverage their knowledge of customers to come up with potential worst-case scenarios for individual borrowers or groups, as well as what it would mean for the bank.

“Rather than say, ‘I project that a worst-case scenario is X,’ turn it around and say, ‘If I get this level of losses in my owner-occupied commercial real estate portfolio, then I have a capital problem,’” Turner says. “I’ll have a sense of what actions I need to take after that stress test process.”
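Turner’s framing can be reduced to simple arithmetic: start from the bank’s capital floor, work backward to the loss level it could absorb, then compare that capacity against tiered loss scenarios. The sketch below assumes hypothetical capital, risk-weighted asset and loss figures and ignores earnings, taxes and balance sheet shrinkage:

```python
# Minimal reverse-stress-test arithmetic: instead of projecting losses and
# then checking capital, start from the capital floor and back into the loss
# level the bank could absorb. All inputs are hypothetical.

tier1_capital = 120.0           # $ millions
risk_weighted_assets = 1_000.0  # $ millions
minimum_ratio = 0.08            # internal floor for the Tier 1 ratio

# Capital after losses must stay at or above minimum_ratio * RWA.
loss_absorbing_capacity = tier1_capital - minimum_ratio * risk_weighted_assets
print(f"Losses the bank can absorb before breaching its floor: "
      f"~${loss_absorbing_capacity:.0f}M")

# Tiered "bad / really bad / cataclysmic" scenarios for a single portfolio,
# expressed as cumulative charge-off rates on a hypothetical $500M book.
portfolio_balance = 500.0
for label, rate in [("bad", 0.03), ("really bad", 0.06), ("cataclysmic", 0.12)]:
    losses = portfolio_balance * rate
    breach = losses > loss_absorbing_capacity
    print(f"{label}: ${losses:.0f}M in losses -> capital problem: {breach}")
```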

A key driver of credit problems in the past has been the unemployment rate, Manahan says. Unemployment is at record highs, but banks can still leverage their historical experience of credit performance from June 2009, when unemployment hit 9.5%.

“If you’ve done scenarios that show you that an increase in unemployment from 10% to 15% will have this dollar impact on the balance sheet — that is a hugely useful data point,” he says. “That’s essentially a sensitivity analysis, to say that a 1 basis point increase in unemployment translates into … an increase in losses or a decrease in revenue perspective to the balance sheet.”
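Read as arithmetic, that kind of sensitivity analysis maps a change in the unemployment rate to a dollar impact on losses. A toy sketch, expressed per percentage point of unemployment and using a hypothetical sensitivity a bank would estimate from its own history:

```python
# Toy sensitivity analysis linking a change in the unemployment rate to a
# dollar impact on expected losses. The sensitivity coefficient is a
# hypothetical placeholder, not an estimated figure.

loss_per_point = 4.0          # hypothetical: $4M of added losses per 1-point rise
baseline_unemployment = 10.0  # percent
stressed_unemployment = 15.0  # percent

added_losses = (stressed_unemployment - baseline_unemployment) * loss_per_point
print(f"A move from {baseline_unemployment}% to {stressed_unemployment}% "
      f"unemployment implies ~${added_losses:.0f}M of additional losses")
```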

After identifying the worst-case scenarios, banks should then tackle changing or refining the data or information that will serve as early-warning indicators. That could be a drawdown of deposit accounts, additional requests for deferrals or changes in customer cash flow — anything that may indicate eventual erosion of credit quality. They should then look for those indicators in the borrowers or asset classes that could create the biggest problems for the bank and act accordingly.

Additional Insights

  • Experts and executives report that banks are having stress testing conversations monthly, given the heightened risk environment. In normal times, Turner says they can happen semi-annually.
  • Sophisticated models are useful but have their limits, including a lack of historical data for a pandemic. Young points out that the Fed’s sensitivity analysis discussed how big banks are incorporating detailed management judgment on top of their loss models.
  • Vendors exist to help firms do one-time or sporadic stress tests of loan portfolios against a range of potential economic forecasts and can use publicly available information or internal data. This could be an option for firms that want a formal analysis but don’t have the time or money to implement a system internally.
  • Experts recommend treating disruptions like the pandemic as opportunities to enhance risk management and the processes and procedures around it.

Recalibrating Bank Stress Tests to a New Reality

Any bank that stress tested its loan portfolios prior to the Covid-19 pandemic probably used a worst-case scenario that wasn’t nearly as bad as the economic reality of the last five months.

A stress test is an analysis of a bank’s loans or revenue stream against a variety of adverse, computer-generated scenarios. The results help management teams and their boards of directors gauge whether the bank has adequate reserves and capital to withstand loan losses of various magnitudes. One challenge today for banks that incorporate stress tests into their risk management approach is the lack of relevant historical data. There is little modern precedent for what has befallen the U.S. economy since March, when most of the country went into lockdown to try to flatten the pandemic’s infection rate. The shutdowns tipped the U.S. economy into its steepest decline since the Great Depression.

Does stress testing still have value as a risk management tool, given that we’re navigating in uncharted economic waters?

“I would argue absolutely,” says Jay Gallagher, deputy comptroller for systemic risk identification support and specialty supervision at the Office of the Comptroller of the Currency. “It is not meant to be an exercise in perfection. It’s meant to say within the realm of possibility, these are the scenarios or variables we want to test against. Could we live with what the outcome is?”

The Dodd-Frank Act required banks with assets of $10 billion or greater to run annual stress tests, known as DFAST tests, and report the results to their primary federal regulator. The requirement threshold was raised to $100 billion in 2018, although Gallagher believes that most nationally chartered banks supervised by the OCC still do some form of stress testing.

“They see value in the exercise and not having the regulatory framework around it makes it even more nimble for them to focus on what’s really important to them as opposed to checking all the boxes from a regulatory exercise,” says Gallagher. “We still see a lot of banks that used to have to do DFAST still use a lot of the key tenets in their risk management programs.”

Amalgamated Bank, a $5.8 billion state chartered bank headquartered in New York, has been stress testing its loan portfolios on an individual and macro level for several years even though it sits well below the regulatory threshold. For the first time ever, the bank decided to bring in an outside firm to do its own analysis, including peer comparisons.

President and CEO Keith Mestrich says it is as much a business planning tool as it is a risk mitigation tool. It gives executives insight into the bank’s loan mix and plays an important role in decisions that Amalgamated makes about credit and capital.

“It tells you, are you going to have enough capital to withstand a storm if the worst-case scenario comes true and we see these loss rates,” he says. “And if not, do you need to go out and raise additional capital or take some other measures to get some risk off the balance sheet, even if you take a pretty significant haircut on it?”

Banks that stress test have been forced to recalibrate and update their economic assumptions in the face of the economy’s sharp decline, as well as the government’s response. The unemployment rate spiked to 14.7% in April before dropping to 11.1% in June when the economy began to reopen, according to the Bureau of Labor Statistics. But the number of Covid-19 cases in the U.S. has surged past 3 million and several Western and Southern states are experiencing big increases in their infection rates, raising the possibility that unemployment might spike again if businesses are forced to close for a second time.

“I feel like the unemployment numbers are probably the most important ones, but they’re always set off by how the Covid cases go,” says Rick Childs, a partner at the consulting firm Crowe. “To the extent that we don’t get [the virus] back under control, and it takes longer to develop a vaccine and/or effective treatment options for it, I think they’ll always be in competition with each other.”

Another significant difference between the Great Recession and the current situation is the unparalleled level of fiscal support the U.S. Congress has provided to businesses, local governments and individuals through the $2 trillion CARES Act. It is unclear whether another round of fiscal support will be forthcoming later this year, and its absence could drive up the unemployment rate and lead to more business failures. These and other variables complicate the process of trying to construct a stress test model, since there aren’t clear precedents to rely on in modern economic history.

Stress testing clearly still has value despite these challenges, but Childs says it’s also important that banks stay close to their borrowers. “Knowing what’s happening with your customer base is probably going to be more important in terms of helping you make decisions,” he says.

Why Asset Size Does Not Matter To Regulators In ERM


Conventional wisdom in banking has been that asset size matters in terms of regulatory expectations around enterprise risk management (ERM).

But that traditional school of thought might be changing. A new question has emerged: is it the institution’s asset size that matters, or is the complexity of the risk profile more important?

A common question among peer roundtables: What is a bank expected to do for ERM as it approaches the $10 billion asset threshold of a regional banking organization (RBO)? The Federal Reserve considers an RBO to have total consolidated assets between $10 billion and $50 billion.

The next question typically is whether regulatory expectations around comprehensive capital analysis and review (CCAR) or Dodd-Frank Act Stress Test (DFAST) requirements have lessened because of recent reforms in Congress.

These are hot topics, especially for banks below the $10 billion asset threshold, known as community banking organizations (CBOs) by the Fed, because the cost of ERM implementation remains high.

Specific to CBOs between $2 billion and $5 billion in assets, regulatory agencies have been providing more prescriptive guidance and recommendations to upgrade and enhance ERM and model risk management frameworks consistent with existing regulatory guidance aimed at RBOs.

Examinations are more detailed, covering policies and procedures, personnel, risk appetite, risk assessment activities and board reporting. Examiners are pushing smaller banks to recognize the ERM value proposition because a keen risk awareness will inspire more informed decisions.

An effective ERM program starts with the risk culture necessary for appropriate governance of policies and procedures, risk awareness training, tone from the top and credible challenge. The culture should start with the CEO and the board establishing a proactive risk strategy and aligning the risk appetite of the bank with strategic planning.

Implementing an effective risk management program means understanding your bank’s risk profile, addressing matters proactively and having the discipline to identify emerging risks and mitigate them before they become a risk event or loss.

As banks approach $10 billion in assets, they are expected to increase the rigor around risk identification and assess risks for their likelihood and impact before identifying risk-mitigating controls.
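One common way to add that rigor is a simple likelihood-and-impact scoring of identified risks before controls are considered. A minimal illustration, with hypothetical risks and a 1-to-5 scale:

```python
# Illustrative risk-rating sketch: score identified risks on likelihood and
# impact, then rank them before considering mitigating controls. The risks
# and scales are hypothetical examples.

risks = {
    "CRE concentration":         {"likelihood": 4, "impact": 5},
    "Core system vendor outage": {"likelihood": 2, "impact": 4},
    "BSA/AML process gaps":      {"likelihood": 3, "impact": 3},
}

# Inherent risk = likelihood x impact on a 1-5 scale (maximum score of 25).
ranked = sorted(risks.items(),
                key=lambda item: item[1]["likelihood"] * item[1]["impact"],
                reverse=True)

for name, scores in ranked:
    print(f"{name}: inherent risk score {scores['likelihood'] * scores['impact']}")
```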

A CBO should have a champion to effect change strategically throughout the organization, rather than treating ERM as a regulatory or audit check-the-box exercise. The risk management champion can be compared to an orchestra conductor, who does not need to do everyone else’s job but should be able to hear when someone is out of tune. Breaking down silos is key because risk management should be a continuous, collaborative process involving all stakeholders.

Regulatory expectations are converging as examiners push smaller banks to show a safe and sound risk management framework. This should encompass a separate board risk committee, or, at a minimum, a subcommittee responsible for ERM.

All banks have traditionally been expected to maintain appropriate risk management processes commensurate with their size and complexity and operate in a safe and sound manner.

The degree of formality and documentation now required, however, is a new and evolving trend. Board and senior management oversight is important, as are risk monitoring and information system reporting. Board support is critical to understanding risk areas, developing training programs and establishing accountability among leadership and risk management team members.

Regulatory scrutiny for banks below $10 billion of assets has increased for ERM sub-processes, including model risk management, new products and services and third-party risk management.

We live in a post-CCAR world trending toward deregulation; however, the regulatory burden of risk management expectations for the smaller CBOs is increasing. Essentially, asset size does not matter anymore.

Well Conceived and Executed Bank Acquisitions Drive Shareholder Value


Recent takeovers among U.S.-based banks generally have resulted in above-market returns for acquiring banks, compared to their non-acquiring peers, according to KPMG research. This finding held true for all banks analyzed except those with greater than $10 billion in assets, for which findings were not statistically significant.

Our analysis covered 394 U.S.-domiciled bank transactions announced between January 2012 and October 2016. The study focused on whole-bank acquisitions and excluded thrifts, acquisitions of failed banks and government-assisted transactions. The analysis yielded the following conclusions:

  • The market rewards banks for conducting successful acquisitions, as evidenced by higher market valuations post-announcement.
  • Acquiring banks’ outperformance, where observable, increased linearly throughout our measurement period, from 90 days post-announcement to two years post-announcement.
  • The positive effect was experienced throughout the date range examined.
  • Banks with less than $10 billion in assets experienced a positive market reaction.
  • Among banks with more than $10 billion in assets, acquirers did not demonstrate statistically significant differences in market returns when compared to banks that did not conduct an acquisition.

Factors Driving Value

Bank size. Acquiring banks with total assets of between $5 billion and $10 billion at the time of announcement performed the strongest in comparison with their peers during the period observed. Acquiring banks in this asset range outperformed their non-acquiring peers by 15 percentage points at two years after the transaction announcement date, representing the best improvement when compared to peers of any asset grouping and at any of the timeframes measured post-announcement.

[Chart: post-announcement performance of acquiring banks versus non-acquiring peers]

Acquisitions by banks in the $5 billion to $10 billion asset range tend to result in customer expansion within the acquirer’s market or a contiguous market, without significant increases in operational costs.

We believe this finding is a significant factor driving the value of these acquisitions. Furthermore, banks that acquire and remain in the $10 billion or less asset category do not bear the expense burden associated with Dodd-Frank Act stress testing (DFAST) compliance.

Conversely, banks with nearly $10 billion in assets may decide to exceed the regulatory threshold “with a bang” in anticipation that the increased scale of a larger acquisition may serve to partially offset the higher DFAST compliance costs.

The smaller acquiring banks in our study—less than $1 billion in assets and $1 billion to $5 billion in assets—also outperformed their peers in all periods post-transaction (where statistically meaningful). Banks in these asset ranges benefited from some of the same advantages mentioned above, although they may not have received the benefits of scale and product diversification of larger banks.

As mentioned earlier, acquirers with greater than $10 billion in assets did not yield statistically meaningful results in terms of performance against peers. We believe acquisitions by larger banks were less accretive due to the relatively smaller target size, resulting in a less significant impact.

Additionally, we find that larger bank transactions can be complicated by a number of other factors. Larger banks typically have a more diverse product set, client base and geography than their smaller peers, requiring greater sophistication during due diligence. There is no substitute for thorough planning, detailed due diligence and an early and organized integration approach to mitigate the risks of a transaction. Furthermore, alignment of overall business strategy with a bank’s M&A strategy is a critical first step to executing a successful acquisition (or divestiture, for that matter).

Time since acquisition. All three acquirer groups that yielded statistically significant results demonstrated a trend of increasing returns as time elapsed from transaction announcement date. The increase in acquirers’ values compared to their peers, from the deal announcement date until two years after announcement, suggests that increases in profitability from income uplift, cost reduction and market expansion become even more accretive with time.

Positive performance pre-deal may predict future success. Our research revealed a positive correlation between the acquirer’s history of profitability and excess performance against peers post-acquisition. We noted this trend in banks with assets of less than $1 billion, and between $1 billion and $5 billion, at the time of announcement.

This correlation suggests that banks that were more profitable before a deal were increasingly likely to achieve incremental shareholder value through an acquisition.

Bank executives should feel comfortable pursuing deals knowing that the current marketplace rewards M&A in this sector. However, our experience indicates that in order to be successful, acquirers should approach transactions with a thoughtful alignment of M&A strategy with business strategy, an organized and vigilant approach to due diligence and integration, and trusted advisers to complement internal teams and ensure seamless transaction execution.

Using Big Data to Assess Credit Quality for CECL


The new Financial Accounting Standards Board (FASB) rules for estimating expected credit losses present banks with a wide variety of challenges as they work toward compliance.

New Calculation Methods Require New Data
The new FASB standard replaces the incurred loss model for estimating credit losses with the new current expected credit loss (CECL) model. Although the new model will apply to many types of financial assets that are measured at amortized cost, the largest impact for many lenders will be on the allowance for loan and lease losses (ALLL).

Under the CECL model, reporting organizations will make adjustments to their historical loss picture to highlight differences between the risk characteristics of their current portfolio and the risk characteristics of the assets on which their historical losses are based. The information considered includes prior portfolio composition, past events that affected the historic loss, management’s assessment of current conditions and current portfolio composition, and forecast information that the FASB describes as reasonable and supportable.

To develop and support the expected credit loss estimates and any adjustments to historical loss data, banks will need to access a wider array of forward-looking data than the simpler incurred loss model required.
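As a rough illustration of how those adjustments come together under a loss-rate approach, the sketch below starts from a hypothetical historical loss rate and layers on qualitative adjustments for portfolio mix, current conditions and the forecast period; every figure is a placeholder:

```python
# Highly simplified sketch of a loss-rate method under CECL: start from a
# historical loss rate, then layer on adjustments for current conditions and
# a reasonable-and-supportable forecast. All figures are hypothetical.

pool_balance = 300.0          # $ millions of amortized cost in the pool
historical_loss_rate = 0.010  # 1.0% lifetime loss rate from the bank's own history

# Hypothetical adjustments (in loss-rate terms) for how today's portfolio and
# the forecast differ from the conditions behind the historical rate.
adjustments = {
    "portfolio mix shift toward higher-risk credits": 0.002,
    "current conditions (weaker local economy)":      0.003,
    "forecast period (rising unemployment)":          0.004,
}

expected_loss_rate = historical_loss_rate + sum(adjustments.values())
allowance = pool_balance * expected_loss_rate
print(f"Expected loss rate: {expected_loss_rate:.1%}")
print(f"Estimated allowance for this pool: ~${allowance:.1f}M")
```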

Internal Data Inventory: The Clock is Running
Although most of the data needed to perform these various pooling, disclosure and expected credit loss calculations can be found somewhere, in some form, within most banks’ systems, these disparate systems generally are not well integrated. In addition, many data points such as customer financial ratios and other credit loss characteristics are regularly updated and replaced, which can make it impossible to track the historical data needed for determining trends and calculating adjustments. Other customer-specific credit loss characteristics that may be used in loan origination today might not be updated to enable use in expected credit loss models in the future.

Regardless of the specific deadlines that apply to each type of entity, all organizations should start capturing and retaining certain types of financial asset and credit data. These data fields must be captured and maintained permanently over the life of each asset in order to enable appropriate pooling and disclosure and to establish the historical performance trends and loss patterns that will be needed to perform the new expected loss calculations. Internal data elements should focus on risks identified in the portfolio and modeling techniques the organization finds best suited for measuring the risks.
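What that capture-and-retain discipline looks like in practice will vary by bank, but the sketch below shows one hypothetical loan-level snapshot record; the field list is illustrative, not a prescribed schema:

```python
# Illustrative loan-level snapshot record of the kind a bank might capture
# and retain each period so historical trends remain reconstructable.
# The field list is a hypothetical example, not a required schema.

from dataclasses import dataclass
from datetime import date

@dataclass
class LoanSnapshot:
    loan_id: str
    as_of_date: date            # snapshot date; retained, never overwritten
    segment: str                # e.g., "CRE", "C&I", "residential mortgage"
    origination_date: date
    current_balance: float
    risk_rating: int            # internal grade at the snapshot date
    debt_service_coverage: float
    collateral_value: float
    days_past_due: int
    charge_off_amount: float    # cumulative charge-offs to date
    recovery_amount: float      # cumulative recoveries to date

# Appending a new snapshot each quarter, rather than updating fields in
# place, preserves the history needed for pooling and loss-rate trends.
example = LoanSnapshot("L-1001", date(2017, 3, 31), "CRE", date(2014, 6, 15),
                       1_250_000.0, 4, 1.35, 1_600_000.0, 0, 0.0, 0.0)
print(example)
```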

External Economic Data
In addition to locating, capturing, and retaining internal loan portfolio data, banks also must make adjustments to reflect how current conditions and reasonable and supportable forecasts differ from the conditions that existed when the historical loss information was evaluated.

A variety of external macroeconomic conditions can affect expected portfolio performance. Although a few of the largest national banking organizations engage in sophisticated economic forecasting, the vast majority of banks will need to access reliable information from external sources that meet the definition of “reasonable and supportable.”

A good place to start is by reviewing the baseline domestic macroeconomic variables provided by the Office of the Comptroller of the Currency (OCC) for Comprehensive Capital Analysis and Review (CCAR) and Dodd-Frank stress testing (DFAST) purposes. Because regulators use these variables to develop economic scenarios, these variables would seem to provide a reasonable starting point for obtaining potentially relevant historic economic variables and considerations from the regulatory perspective of baseline future economic conditions.

Broad national metrics—such as disposable income growth, unemployment, and housing prices—need to be augmented by comparable local and regional indexes. Data from sources such as the Federal Deposit Insurance Corporation’s quarterly Consolidated Report of Condition and Income (otherwise known as the call report) and Federal Reserve Economic Data (FRED), maintained by the Federal Reserve Bank of St. Louis, also can be useful.
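A simple way to make such external series usable is to line them up in one table by period so they can be joined to historical loss data. The sketch below uses tiny inline sample values standing in for downloads from sources like FRED:

```python
# Sketch of combining national and regional macro series into one frame for
# loss forecasting. In practice the series might come from FRED or regulator
# downloads; here they are small inline samples with illustrative values.

import pandas as pd

dates = pd.to_datetime(["2016-03-31", "2016-06-30", "2016-09-30"])

national = pd.DataFrame({"date": dates, "us_unemployment": [5.0, 4.9, 4.9]})
regional = pd.DataFrame({"date": dates, "state_unemployment": [5.6, 5.4, 5.3]})
housing = pd.DataFrame({"date": dates, "metro_hpi": [182.0, 184.5, 186.1]})

macro = national.merge(regional, on="date").merge(housing, on="date")

# The combined series can then be joined to historical loss data by period to
# support "reasonable and supportable" forecast adjustments.
print(macro)
```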

Data List for CECL Compliance

[Table: critical internal data elements to capture for CECL]

Looking Beyond Compliance
The new FASB reporting standard for credit losses will require banks to present expected losses in a timelier manner, which in turn will provide investors with better information about expected losses. While this new standard presents organizations of all sizes with some significant initial compliance challenges, it also can be viewed as an opportunity to improve performance and upgrade management capabilities.

By understanding the current availability and limitations of portfolio data and by improving the reliability and accuracy of various data elements, banks can be prepared to manage their portfolios in a way that improves income and maximizes capital efficiency.

Does Your Bank Have the Stress Testing Data You Need?


The next several years will increase the need for better data management at banks. Banks that have experienced the Dodd-Frank Act’s required stress tests (DFAST) already have encountered that need. With the Basel III international accord phasing in and the new current expected credit loss impairment standard (CECL) eventually taking effect, all U.S. financial institutions are facing ever more demanding regulatory requirements driving the need for enhanced data and analytics capabilities.

Credit data is becoming increasingly integral to stress tests, as well as capital planning and management and credit loss forecasts. To meet regulatory expectations in these areas, though, some banks need to improve the quality of their data and the control they have over it. Effective data management can bring valuable support and efficiencies to a range of compliance activities.

Expanding Data Requirements
DFAST, which is required of banks above $10 billion in assets, is highly dependent on data quality. The DFAST process—including scenarios, analytics, and reporting—requires banks to maintain a vast array of reliable and detailed portfolio data, including data related to assets and liabilities; to customers, creditors and counterparties; to collateral; and to customer defaults.

Under Basel III, banks will need to gather even more data. The requirements call for consistent data sourcing and reconciliation, liquidity management and the capture of data for historical purposes, among other things.

The Financial Accounting Standards Board’s new CECL model for GAAP reporting applies to all banks and will bring implications for data management. Banks and financial services companies will need borrower and economic data, exposure level data, historical balances, risk ratings and data on charge-offs and recoveries. Failure to capture quality data in these and other areas could result in tougher examinations, reliance on peer or industry data, questions about safety and soundness and drops in capital and profits.

Data Management Challenges
Small banks generally have a handful of credit data sources, while large banks can have 15 or more—and the number of sources is expected to grow in coming years as new products are released. In addition, the data often is stored in different formats and might not be subject to any governance or control. It’s no wonder that banks can find it difficult to get a handle on their data, let alone produce a “single source of truth” that can withstand examiner scrutiny.

One solution to this dilemma is a credit data warehouse. A data warehouse can provide a vehicle for controlling and governing an immense amount of data. It allows a bank to easily show an examiner the data that was used for its models, the data sources and how the data reconciles with the bank’s financial statements.
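One basic warehouse control is reconciling loan-level balances back to the general ledger before the data feeds a model. A minimal sketch with hypothetical segments, balances and tolerance:

```python
# Minimal sketch of one data-warehouse control: reconcile loan-level warehouse
# balances to general-ledger totals before data feeds a stress-test model.
# Segment names, balances and the tolerance are hypothetical.

import pandas as pd

loans = pd.DataFrame({
    "segment": ["C&I", "C&I", "CRE", "Mortgage"],
    "balance": [12.5, 7.5, 30.0, 18.0],      # $ millions from the warehouse
})
general_ledger = pd.DataFrame({
    "segment": ["C&I", "CRE", "Mortgage"],
    "gl_balance": [20.0, 30.0, 18.2],        # $ millions from the GL
})

recon = (loans.groupby("segment", as_index=False)["balance"].sum()
              .merge(general_ledger, on="segment", how="outer"))
recon["difference"] = recon["balance"] - recon["gl_balance"]

tolerance = 0.1  # $ millions; an example materiality threshold
breaks = recon[recon["difference"].abs() > tolerance]
print(recon)
print("Segments needing follow-up before model use:")
print(breaks)
```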

Banks might encounter some obstacles on their way to effective warehousing, though, including the sheer volume of data to be stored. Quality assurance is another common issue. For example, information might be missing or not in a standardized format. Data availability also can pose problems. A bank might have the required information but not in an accessible format.

Best Practices
Overcoming these problems comes down to data governance—how a bank manages its data over time to establish and maintain the data’s trustworthiness. Data management without active governance that is targeted toward achieving a single source of truth isn’t sustainable.

In the case of DFAST, it’s important to resist the temptation to take a short-term perspective that considers only the data required for stress testing. Banks that take a more global view, bearing in mind that the data is used throughout the organization, will fare much better. Such banks build a framework that can handle the new data requirements (including those related to historical data) that will surely continue to come in the future.

Banks also should remember that data management is not a one-off task. A bank might have clean data today, but that data will degrade over time if not managed on an ongoing basis.

Finally, banks should not overlook the human factor. Success isn’t brought about by a database but by the people who are stewards for the data and the processes put in place to audit, balance, and control the data. The people and processes will, of course, be enabled with technology, but the people and processes will make or break a data management program.

Time to Take Control
Effective data management is an essential component of any stress testing endeavor, but data management also has implications that extend to CECL and Basel III compliance and likely will aid banks in coping with many forthcoming regulations and requirements. Banks that don’t yet have control of their data should take steps now to establish the governance, framework and people and processes necessary to ensure the completeness, accuracy, availability and auditability of a single source of truth for both the short- and long-term.

Lessons Learned From the Stress Tests


In the wake of the implementation of the Dodd-Frank Wall Street Reform and Consumer Protection Act stress test (DFAST) regulations, the term “stress test” has become a familiar part of the banking lexicon. The DFAST regulations require midsize banks—those with assets between $10 billion and $50 billion—to project the expected impact of three economic scenarios—baseline, adverse, and severely adverse—on their financial statements and capital ratios. Midsize financial institutions were required to report this year’s stress test results to their regulators by March 31, 2015, the second round of stress tests required for these banks.
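Mechanically, the exercise boils down to projecting losses and earnings under each scenario and checking the resulting capital ratios. The sketch below is a deliberately simplified version with hypothetical loss rates, pre-provision revenue and balances, not supervisory assumptions:

```python
# Simplified nine-quarter projection of a capital ratio under the three DFAST
# scenario labels. Loss rates, pre-provision earnings and starting balances
# are hypothetical placeholders, not supervisory figures.

scenarios = {                       # cumulative nine-quarter loss rates
    "baseline": 0.02,
    "adverse": 0.05,
    "severely adverse": 0.09,
}

tier1_capital = 1_200.0             # $ millions
risk_weighted_assets = 10_000.0     # $ millions
preprovision_net_revenue = 300.0    # $ millions earned over the horizon
loan_balance = 8_000.0              # $ millions subject to the loss rates

for name, loss_rate in scenarios.items():
    losses = loan_balance * loss_rate
    ending_capital = tier1_capital + preprovision_net_revenue - losses
    ratio = ending_capital / risk_weighted_assets
    print(f"{name}: projected Tier 1 ratio {ratio:.1%}")
```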

Although the submission that was due in March was round two, most banks felt that it demanded just as much effort as the first round of stress tests. Regulators focused more on process than results in round one and clearly stated that what was acceptable in the first submission would be insufficient for subsequent examinations. Little formal feedback has come in so far, but what we have heard indicates that continuous improvement was definitely expected.

Model Mechanics
In the first round, most banks either used simplistic models or projections that did not capture their risks fully. Banks now are expected to develop enhanced models, and more significant portfolios are being modeled using bottom-up rather than top-down approaches. In assessing models, regulators are questioning assumptions and methodologies and looking for well documented, sound conceptual bases for the modeling choices made. Overly manual modeling processes also are being flagged as impractical for ad hoc use. The message is loud and clear: stress testing models are expected to be integrated into risk management practices.

Documentation
One common area for continued attention appears to be documentation. Whether it’s better organizing information to make it easier to follow the bank’s processes, improving validation documentation, writing user procedures, or better documenting the effective challenge process, the feedback received thus far reinforces that DFAST truly is a formal process. The documentation has to be sufficient for banks to manage, monitor and maintain the overall stress testing program. It also needs to be detailed enough to allow other users, including validators and regulators, to clearly understand the process.

Validation
Validation continues to be a big area of focus, and attention is being paid to both the timing and extent of validation activities. Timing is a critical review point, as the models are expected to be validated prior to the final stress test exercise. Validations have been criticized for having incomplete documentation, for failing to assess data lineage and quality, and for not being comprehensive. As modeling systems become more sophisticated, validations need to provide broader coverage. Validators—whether internal or third-party resources—must be experienced and competent, and they must deliver a sound validation in accordance with the agreed scope.

Sustainability
Banks have been encouraged to shore up organizational structures and procedures to keep their stress testing programs up-to-date and intact. With competition for quantitative resources at an all-time high, many are making choices about hiring statistical specialists and using contractors to keep on track. Banks are focusing on more automated processes, broader business participation, and more detailed user procedures to make sure the loss of one or two employees does not cause a program to fall apart completely.

Life in the DFAST Lane
As with most important business processes, effective DFAST risk management requires significant input from business management, risk management, and internal audit. A collaborative relationship among these three lines of defense results in the strongest DFAST processes. With reporting deadlines for the next cycle in 2016 being delayed from March 31 to July 31, banks have a bit of breathing room to assess the effectiveness and efficiency of their DFAST programs. Banks should use this extra time to further develop documentation, address highest priority issues, and continue to integrate stress testing into routine risk management practices.