How APIs Changed Bank Security From Castles to Strip Malls

The application programming interface, or API, has become one of the principal building blocks of the modern digital economy.

APIs are at the center of modern application architectures and system design; by some estimates, more than 90% of the world’s internet traffic passes through them. For banks, APIs are the conduit that connects institutions to customers, partners and each other. Their responsiveness and agility drive innovation while dramatically lowering the cost of application development and integration. Beyond keeping banks competitive and innovative, APIs and the microservices that use them are key to addressing regulatory requirements for open banking, such as PSD2.

But there’s a problem, and it’s a familiar one: with each new generation of technology, adoption leads and security lags a few steps behind. That is exactly what is happening right now with APIs, and banks should be anxious.

But anxiety frequently takes second place to blissful ignorance. Most chief information security officers in financial services don’t understand the full implications of the API economy and are not measuring their institution’s exposure to its risks. It’s a major blind spot: you can’t secure what you can’t see or don’t understand. Within the same organization, the CISO and the DevOps team (a portmanteau of “software development” and “IT operations”) often don’t speak the same language. DevOps teams can usually see the issues, but CISOs are not as alert to them. This gap in understanding means many back-end systems and pieces of critical infrastructure could be exposed to a cyberattack.

Regulators and auditors are catching up on the implications of architectural and attack surface changes and have yet to update their audit or examination methodologies. They are learning what questions to ask, from the basics around API ownership to detailed metrics and matters of governance around operating effectiveness. But some are still hazy about what an API even is, let alone why it might pose a problem.

Complexity Is the Enemy of Security
Part of the difficulty is the complexity of the legacy IT stacks most banks are sitting on. Unfortunately, there’s rarely a financial return on a bank retiring a mainframe or decommissioning legacy systems. This leaves IT and security teams responsible for maintaining and securing several generations of technology, from ancient “Big Iron” mainframes and AS/400s right up to modern, fully API-driven digital banking platforms. Sustaining discipline and focus on system life cycle management is a continuous challenge. Migrations are complex, and you can’t just switch off old systems; it’s like changing a plane’s engine mid-flight.

The complexity of securing and managing these multiple generations is immense, and most banks don’t have the skills or resources in-house to do it. Complexity is the enemy of security, and the complexity of modern computing environments is only increasing. For chief information officers, chief technology officers and CISOs, it’s difficult enough to keep up with the current complexity. Gartner estimates that APIs will become the No. 1 attack vector this year, yet API threats are only just coming onto the radar as an area of focus.

Why Are APIs at Risk?
The benefits of migrating to an API-first, microservice-based architecture are so strong that adoption is inevitable. The advantages are manifest: APIs make it easy to collaborate with other companies, share data and build integrations that weren’t possible before. The problem is that yesterday’s security model wasn’t built for this architecture.

Think of the security model for old monolithic web applications like a castle with a moat: a gate with a drawbridge that means one way in, one way out. Your services sit behind your defenses inside the castle. The microservice and API-first model changes that attack surface to more of a strip mall: external doors on each store. Data is highly distributed, but the castle’s security methods were never designed to monitor or protect this approach.
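To make the contrast concrete, here is a minimal sketch in Python; the route names and policies are hypothetical, not any particular bank’s inventory. The structural point is that the monolith has one guarded entrance, while the API surface multiplies external doors, any one of which can ship without a policy.

```python
# Hypothetical sketch: one guarded entrance vs. many external doors.
MONOLITH_INGRESS = {
    "/": {"auth": "perimeter firewall + WAF"},  # one way in, one way out
}

API_SURFACE = {
    "/accounts/{id}/balance": {"auth": "OAuth2 scope accounts:read"},
    "/payments/transfers":    {"auth": "OAuth2 scope payments:write"},
    "/partners/webhooks":     {"auth": "mTLS + request signing"},
    "/internal/batch-export": {},  # shipped with no policy: an open door
}

def open_doors(surface: dict) -> list[str]:
    """Return routes exposed without an explicit authentication policy."""
    return [route for route, policy in surface.items() if not policy.get("auth")]

print(open_doors(MONOLITH_INGRESS))  # []
print(open_doors(API_SURFACE))       # ['/internal/batch-export']
```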

In many cases, security teams are not even aware that this is happening; this is even more common where there are third parties involved. Third-party dependencies are very common in banking, and it’s extraordinarily difficult to get appropriate visibility into critical supply chain vendors that banks rely on for key operational and control processes.

All this raises the question: if the adoption of APIs is inevitable, how will banks manage them safely? Security programs must evolve to address API challenges directly. Without management and security designed for them, APIs expose your bank’s essential data while the “guards at the front gate” have no idea what’s going on.
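A practical first step toward closing that blind spot, sketched below with purely hypothetical route names, is to diff the endpoints actually observed in gateway or traffic logs against the catalog the security team maintains; anything observed but never documented is a door the guards didn’t know existed.

```python
# Hypothetical sketch: surface "shadow" APIs by comparing observed
# traffic against the documented catalog.
documented_catalog = {"/accounts", "/payments", "/statements"}

observed_in_logs = {"/accounts", "/payments", "/statements",
                    "/legacy/export", "/v1/debug"}

shadow_apis = observed_in_logs - documented_catalog
print(sorted(shadow_apis))  # ['/legacy/export', '/v1/debug']
```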

Banking’s Single Pane of Glass

Imagine looking at all the elements and complexities of a given business through a clear and concise “single pane of glass”: one easily manageable web interface with the horizontal capability to do anything you might need, all in one platform.

It may sound too good to be true, but “single pane of glass” systems could soon become a reality within the mortgage industry. Underwriters, processors, loan originators and others who work at mortgage and banking institutions must manage and maintain a plethora of third-party software solutions on a daily basis.

It’s complex to balance dozens of vendor solutions simultaneously, monitoring services through a different management console, report format and process for each. This cumbersome reality is one of the most significant challenges bankers face.

There are proven solutions and approaches for rationalizing these operational processes and streamlining interactions with customers, clients and new accounts. In the parlance of a technologist, these are called “single panes of glass.” In practice, though, each vendor supplies its own, so banks end up with multiple single panes of glass.

A true single pane does exist if you’re talking about a single product, and herein lies the problem. Heterogeneous users adopt a separate third-party platform for each service they need, with the result one would expect: too many single panes of glass, so many that each becomes its own unique glass of pain.

How can banks fix this problem? Simply put, people need a single view of their operational reality. Every source of information and every environment, however different, needs to feed into a single API. This is more than possible if banks use artificial intelligence and machine learning programs alongside API frameworks built to current, modern standards; together, they can unify everything.
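As a rough sketch of what feeding every source into a single API can look like (the vendor names, fields and mappings below are invented for illustration), each vendor feed gets a small adapter that normalizes its payload into one common record, and the dashboard reads only the unified stream:

```python
# Hypothetical sketch: per-vendor adapters normalize disparate feeds
# into one schema that a single dashboard can consume.
from dataclasses import dataclass

@dataclass
class UnifiedEvent:
    source: str    # which vendor system produced the event
    account: str   # normalized account identifier
    status: str    # normalized status vocabulary

def from_vendor_a(payload: dict) -> UnifiedEvent:
    # Vendor A uses camelCase keys and uppercase statuses.
    return UnifiedEvent("vendor_a", payload["acctNo"], payload["state"].lower())

def from_vendor_b(payload: dict) -> UnifiedEvent:
    # Vendor B already uses snake_case keys and lowercase statuses.
    return UnifiedEvent("vendor_b", payload["account_id"], payload["status"])

feeds = [
    (from_vendor_a, {"acctNo": "12-345", "state": "OPEN"}),
    (from_vendor_b, {"account_id": "67-890", "status": "closed"}),
]

dashboard_stream = [adapter(payload) for adapter, payload in feeds]
for event in dashboard_stream:
    print(event)
```

The design point is that the dashboard never learns any single vendor’s quirks; adding a vendor costs one adapter, not another pane of glass.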

Ideally, a single dashboard would be able to see everything. That dashboard wouldn’t be led by vendors but would be fed by a plethora of APIs. Banks could plug it into an open, more vendor-neutral framework and gain the option to customize and route data as needed.

The next hurdle the industry will need to overcome is that the panes of glass aren’t getting any bigger. Sifting through pie charts across multiple screens and applications can be a real pain; it can feel like there isn’t a monitor in the world big enough to work through some data spreadsheets and dashboards effectively.

With a “single pane of glass” approach, banks don’t have to consolidate all the data they need. Instead, they can line up opportunities and quickly access solutions for better, seamless collaboration.

Focusing on one technology provider, where open-source communication can make integration seamless, might be a good adoption route for bank executives to consider in the short term while the industry adapts to overcome these unique challenges.

The Power of Core Processors and What You Can Do About It


During what I would argue was a defining moment of his presidency, Bill Clinton was asked under oath why he had previously denied that he was in a relationship with Monica Lewinsky. He said his answer depended on the definition of the word “is”: basically, he hadn’t lied because the question had been posed in the present tense and there was no such present relationship. Such dissembling may be maddening when it comes from a president, and it’s equally upsetting when it comes from your business partners.

Few chief information officers have the time to pore through the thousands of pages of core and IT contracts they sign with each and every vendor. What ends up happening, more often than not, is that time passes, management changes, renewals occur, technology fades or is upgraded and products are added without scrutiny, all in the name of efficiently running the institution and ensuring a competitive edge.

Any reasonable bank leader could assume that the most current deal takes precedence over the past, and that ambiguity is trumped by good faith born of long-term loyalty. After all, why would old agreements govern new technology? Well, it depends on what your definition of “is” is.

My company, Paladin fs, was recently retained by a Massachusetts banking client with $400 million in assets and charged with restructuring each of its core and IT vendor agreements. In our initial research, we saw that for nearly a decade this bank had gone to one core processor for account processing as well as ATM and electronic funds transfer needs, but, curiously, had maintained a 13-year relationship with a competing core processor for item processing. It made good business sense to move the bank’s item processing to its primary processor and negotiate improved pricing.

With 12 months remaining on the bank’s existing item processing agreement, we calculated the termination expense to be somewhere in the neighborhood of $130,000, based on the “estimated remaining value” for the previous three months multiplied by 60 percent—a standard termination computation. But to our surprise, the core processor had a very different number in mind: $252,000.

When challenged, the core processor happily provided us chapter and verse from its 2003 agreement with our client. The note was handwritten on paper, but it clearly stated that, based on the company’s definition of “estimated remaining value,” it had the right to go back through the entire 13-year relationship with the bank, find the three highest-charged months including taxes, and multiply that total by 60 percent to calculate the “termination for convenience” penalty.
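A back-of-the-envelope comparison shows how much leverage hides in that one definition. The billing figures below are purely illustrative, not the bank’s actual numbers, and the sketch assumes each reading sets a three-month basis that is averaged, projected over the 12 months remaining and multiplied by 60 percent:

```python
# Illustrative only: two readings of "estimated remaining value,"
# each averaged, projected over the remaining term and multiplied
# by the standard 60 percent rate.
MONTHS_REMAINING = 12
TERMINATION_RATE = 0.60

recent_three_months = [18_000, 18_500, 17_500]   # most recent billing
highest_three_ever  = [34_000, 35_500, 35_000]   # peak months, taxes included

def termination_fee(three_month_basis: list[int]) -> float:
    monthly_basis = sum(three_month_basis) / 3
    return monthly_basis * MONTHS_REMAINING * TERMINATION_RATE

print(f"bank's reading:   ${termination_fee(recent_three_months):,.0f}")  # $129,600
print(f"vendor's reading: ${termination_fee(highest_three_ever):,.0f}")   # $250,800
```

Same multiplier, same remaining term; only the three-month basis changes, and the penalty nearly doubles.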

Though we tried, reasoning with the core processor went nowhere; it had nothing to gain by being either reasonable or fair, since it was losing the business anyway. This illustrates how vendors continually prey on unsuspecting, outgunned and ill-equipped banks and credit unions. Skilled at garnering trust from bankers rather than inviting verification, core and IT vendors know they will always have the upper hand in technology negotiations. They leverage the power of the oligopoly to bilk billions from the community banking industry while delivering substandard products and services that leave institutions wanting. And they do it by choosing their own definitions.

The only way we can combat their cunning is with numbers of our own, both by collecting market data and by coming together as allies. After almost a decade of filling our database with thousands of vendor contract terms and pricing details to help us fight the good fight on behalf of banks, I’ve realized that the oligopoly is too powerful to take down with a one-bank-at-a-time approach. I’ve now teamed up with Pillsbury Winthrop Shaw Pittman LLP, and together we’ve built the Golden Contract Coalition (GCC) to tackle the onerous terms and bad contract deals that our community banks and credit unions fall victim to, time and time again.

An alliance of large groups of community banks, credit unions and key players from within the banking community, the GCC gives us the capacity to leverage our collective influence and untold millions in combined contract value to negotiate fair deals with the unscrupulous core and IT vendors. For the first time in history, the power will be in the hands of institutions, giving us the protection we need to challenge the core and IT vendor oligopoly and end the era of underperforming IT functionality, unfavorable contract terms and one-sided deals.

From here on out, the definitions in our core and IT contracts will be dictated by those affected most.