IoT: Is Your Bank Ready?


What if your fridge could sense the absence of a milk container and automatically reorder milk for delivery? What if your car could sense a deflating tire, alert the driver and order roadside assistance? The internet of things, or IoT, is a sensor-based technology: objects with embedded sensors transmit data over the internet so they can be monitored and can interact with other systems.

IoT is making a lot of this possible. Bank boards should prepare for a future in which many more devices are connected to the internet, exponentially increasing the number of transactions flowing through banks. Many of the security questions raised by an IoT-connected world have yet to be answered.

These sensors send and receive signals, carrying interactions to and from other IoT devices or IoT-enabled systems. An important implication of this technology is the large, continuous volume of data flowing from IoT devices into banking systems.

Some examples of impacts to banking systems include:

  • Banks will improve features and capabilities to support more sophisticated consumer transaction processing, including IoT-based transactions.
  • With new technology integration and infrastructure investment, consumers will gain more detailed information about their most important IoT-based transactions and more options for managing the finances surrounding them.
  • Consumers will see new IoT transaction reporting in their banking consoles.

Also, because IoT is an integrated form of data and information transmission, many new types of devices beyond common ones such as cell phones, tablets and other mobile devices have the potential to tap into banking infrastructure.

Newer devices like refrigerator consoles or onboard computer systems in vehicles have the capability to transmit transactions for purchases that impact today’s banking architecture.

By one estimate, the market for IoT platforms, software, applications and services will grow from $170.57 billion in 2017 to $561.04 billion by 2022, a compound annual growth rate of 26.9 percent.
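The quoted growth rate can be sanity-checked with the standard compound-annual-growth-rate formula; a minimal sketch in Python:

```python
# Verify the quoted compound annual growth rate (CAGR) for the IoT
# market estimate: $170.57 billion in 2017 to $561.04 billion by 2022.
start_value = 170.57   # USD billions, 2017
end_value = 561.04     # USD billions, 2022
years = 2022 - 2017    # five-year horizon

# CAGR = (end / start)^(1 / years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # matches the article's 26.9 percent
```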

Because of this, customers will need additional services on the banking side of IoT transaction processing to understand which types of transactions, and from which devices, appear in their bank accounts. Many of today’s customers are accustomed to real-time account information and easy portal access for viewing transactions, so it is very likely they will expect this new IoT capability to arrive at the same level of service across all forms of consumer banking.

Understanding how banking systems and infrastructure will be adjusted and upgraded to accommodate the influx of IoT-enabled transactions will play a crucial role in supporting customers and clients globally. Consumers will be most affected by changes in retail and consumer markets, but business use of IoT for financial transaction flows is also growing. The combination of business and consumer sensor-driven transaction flows is an exciting area of convergence between banking and computing, one that holds great potential for new and emerging global markets.

Defining, Adopting and Executing on Fintech


Fintech has become a convenient (and amorphous) term applied to virtually any technology or technology-enabled process that is, or might be, applied within financial services. While the technologies are complex, the vast array of the current wave of fintech boils down to three simple dynamics: (1) leveraging technology to measure or predict customer need or behavior; (2) meeting customer need through the best customer experience possible; and (3) the ability to execute more nimbly to evolve products and services and how they are delivered.

Every reasonably well-versed person in fintech knows that the ability to predict customer need or behavior is achieved through a strong data infrastructure combined with a high-quality analytics function. But what defines the quality of the customer experience? At Fundation, we believe the quality of the customer experience within financial services is determined by the convenience, simplicity, transparency, intuitiveness and security of the process by which a product or service is delivered. The challenge for many financial services companies in developing the optimal customer experience lies in the rigidity of their legacy systems. They lack the flexibility to continually innovate products and services and how they are delivered.

The distinct advantage that fintech firms like Fundation have over traditional financial services companies is the flexibility gained from building their technology infrastructures from scratch on modern technology. With in-house application development and data operations capabilities, fintechs can rapidly engineer and, more importantly, reengineer the customer experience and their business processes. The capacity to reengineer user interface (UI), user experience (UX) and back-end processes is a major factor in the ability of financial services companies to maintain a competitive edge in the digital era where customers are accustomed to engaging with the likes of Google, Amazon, Facebook and Apple in their digital lives.

Banks Remain Well Positioned to Win With Fintech
Armed with these capabilities, we, like so many fintechs, could be thumping our chests about how we are going to transform banking. But at Fundation, we see the future differently. We believe that the biggest disruption to banking is not going to come from outside of the banking industry—it’s going to come from the inside. A handful of banks (and maybe more) will reengineer their technology and data infrastructure using modern systems and processes, developed internally and augmented through highly integrated partnerships with fintechs. As a result, these banks will generate superior financial returns and take market share as customers migrate to firms that provide the experiences they expect.

In addition to enjoying a lower cost of capital advantage versus fintechs, we believe banks are well positioned for three other reasons. First, banks will remain the dominant choice of customers for financial products given their brand strength and existing market share. Second, banks have far more data than the average fintech that can be used to develop predictive analytics to determine customer need or behavior. Third, and perhaps most important, banks have what we at Fundation call the “trust asset”: their customers trust that they will protect their information and privacy and that they will recommend products best suited to their needs.

Be the Manufacturer or the General Contractor
Banks are in a strong position to win the fintech revolution, but what remains is the complexity of execution. There are a few basic strategies:

  1. Do nothing
  2. Manufacture your own capabilities
  3. Operate as the general contractor, aligning your institution with third parties that can do the manufacturing
  4. Some combination of manufacturing and general contracting

For banks that are predominantly in relationship-driven lines of business rather than transactional lines of business, doing nothing is viable for now. The pressures on your business are not as severe, and a wait-and-see approach may enable you to make more informed decisions when the time is right.

For others, doing nothing is fraught with peril. Assuming that you choose one of the remaining three options, the implementation process will be hard, but what may be even harder is the change in organizational psychology necessary to execute on your decision. Resistance to change is natural.

That is why fintech initiatives should be driven top-down. Executive leadership should command these initiatives and set the vision. More important, executive leaders should explain why the institution is pursuing a fintech initiative and why it has decided to build, partner or outsource. Explaining why can reduce the natural resistance to, and fear of, change.

Manufacturing your own capabilities is hard work but has advantages. It provides maximum control over the project and limits your vendor management risk. The downside is that the skill sets required to execute are wide-ranging. That said, building in-house doesn’t mean that everything needs to be proprietary technology. Most fintech platforms are a combination of proprietary technology along with third-party customized components. Should you elect to build off of third-party software, you must ensure that the platform is highly configurable and customizable. If you don’t have significant influence over customization, you will lose the opportunity to reengineer the processes necessary to rapidly innovate and evolve.

Being the general contractor isn’t easy, either, but banks are very adept at it. You could make the argument that most banks are an amalgamation of business lines, each of which employs a different (mostly third-party) system, and that they are already operating as general contractors. The business line leaders we have come to know have significant experience managing critical third-party vendors and therefore have the skill set and knowledge to manage even the most innovative financial technology partners. What’s more, they often know what they would want their operating platforms to do, as opposed to what those platforms are built to do today.

Should your institution decide to outsource services to a fintech firm, it is paramount to align interests. Banks should embrace their fintech counterparty as a partner, not simply a vendor. Welcome the flexibility that they offer, and allow them to empower your institution to innovate and evolve.

Don’t Squander the “Trust Asset”
In a world where Amazon, Google, Facebook and Apple dominate the digital landscape, deliver ideal customer experiences, and may possess a “trust asset” of their own, the status quo is not an option, no matter how painful change can be. If your financial institution intends to compete over the long term, it is vital to execute on a fintech road map and move toward infrastructure built on a foundation of flexibility. Over the next decade, that flexibility will allow financial services companies to compete more effectively by delivering the products, services and experiences that customers will demand, and it is what will allow your institution to maintain its competitive position over the long term.

How Big Data is Helping Live Oak Bank Prevent Hacking



While there are myriad technologies and companies on the market trying to make banking data more secure and prevent hacking, knowing which technologies and partners to choose can be a daunting task. With cyber criminals looking for any conceivable way into banking systems, monitoring for potential threats can seem almost impossible. Think of cyber security as a house with multiple “points of access” for potential burglars, like windows or doors. The problem is that each digital access point, from branch networks to remote data centers, presents a distinct set of cyber security problems. This often leads banks to adopt multiple software products, services and partners. The result can be a disjointed cyber security strategy in which banks are spread thin dealing with multiple vendors and systems.

That’s precisely the issue that Live Oak Bank—a North Carolina institution specializing in small business loans—faced as it looked to improve its cybersecurity. Part of Live Oak’s promise to its customers is top-notch cyber security, but with its existing systems, the bank struggled to gain visibility into every potential point of access that cyber criminals might seek to exploit.

“We really wanted to protect ourselves across the board,” recalls Thomas Hill, chief technology officer at Live Oak. “But we had to address each potential security issue with separate technologies, which quickly became overwhelming. You’ve got to monitor all these devices and systems all the time, and be on top of them if—and when—a hacker comes in.”

Hill and Live Oak began evaluating options that would let the bank respond quickly to breaches when they happen, and possibly detect them ahead of time. They decided to partner with Seattle-based cybersecurity company DefenseStorm.

“Live Oak needed visibility into all areas of its network to support company-wide security and operational activities,” explains DefenseStorm chief technology officer Sean Cassidy. “With branches, staff and data centers located across the U.S., [employees] had multiple systems to monitor each point of access. They needed a way to consolidate visibility into each system, while still allowing the systems to continue operating as intended.”

DefenseStorm’s stack of cybersecurity capabilities includes real-time incident reporting, automated initial threat response and—most importantly—a proprietary big data engine built specifically for banks to analyze metadata patterns that could indicate a hack. Live Oak was able to aggregate all of its cyber security logs and event data into one analysis engine, with the goal of increasing visibility into security threats and speeding up reaction time to potential breaches. It did this by implementing software that aggregates data from existing systems and places it all into a single, easy-to-monitor dashboard. Incident tracking for compliance purposes also became more efficient, allowing the bank to report cyber incidents to state and federal regulatory agencies sooner than before.
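The log-consolidation pattern described here can be sketched in a few lines. This is a hypothetical toy, not DefenseStorm’s actual engine; the event feeds, field names and alert threshold are all invented for illustration:

```python
# Toy sketch: consolidate security events from multiple systems into one
# stream and flag IPs with repeated suspicious activity. All feeds, field
# names and thresholds below are invented for illustration.
from collections import Counter

firewall_events = [
    {"source": "firewall", "ip": "203.0.113.9", "event": "blocked"},
    {"source": "firewall", "ip": "203.0.113.9", "event": "blocked"},
]
auth_events = [
    {"source": "auth", "ip": "203.0.113.9", "event": "failed_login"},
    {"source": "auth", "ip": "198.51.100.7", "event": "login"},
]

# Aggregate all feeds into one stream, as a unified dashboard would.
all_events = firewall_events + auth_events

# Count suspicious events per IP; repeated blocks or failed logins
# across *different* systems may indicate a probe that no single
# system would flag on its own.
suspicious = Counter(
    e["ip"] for e in all_events if e["event"] in ("blocked", "failed_login")
)
alerts = [ip for ip, count in suspicious.items() if count >= 3]
print(alerts)  # 203.0.113.9 accumulates three suspicious events
```

The point of the aggregation is visible in the example: neither feed alone crosses the threshold, but the combined stream does.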

“DefenseStorm’s incident response system allows me to not only easily see data indicating a potential hack, it allows me to immediately assign it to one of our engineers,” says Hill of Live Oak. “It really empowers them to focus in-depth on potential threats and dig deep to see if there’s a hack underway.”

DefenseStorm also continues to provide Live Oak with 24-hour monitoring and support through its so-called Guardian team, which is also responsible for helping investigate—and uncover—potential threats. The Guardian team provides advice and recommendations to Live Oak on how to better secure its network in the future. This underscores the trend of “threat hunting,” as businesses and organizations seek to be more proactive in how they monitor systems for potential hackers.

Live Oak’s previous security system was unable to perform accurate and timely security analyses, mainly due to the increasingly large amount of data traffic on the bank’s networks. With the new system, reaction time to security incidents has been greatly reduced, says Cassidy.

Finally, one of the most distinctive parts of this partnership is that Live Oak has chosen to participate as a proof-of-concept customer for features and capabilities of DefenseStorm’s software that are in the final stages of development. “With all the tools and support they provide, DefenseStorm is really turning out to be a Swiss Army knife for us—and potentially the entire banking industry,” says Hill. “This partnership has been a huge win.”

When It Comes to Core Conversions, Look Before You Leap


Changing your bank’s core technology provider is one of the most important decisions that a bank board and management team can make, and even when things go smoothly it can be the source of great disruption. The undertaking can be particularly challenging for small banks that are already resource constrained since the conversion requires that all of the bank’s data be transferred from one vendor’s system to another’s, and even for a small institution that can add up to a lot of bits and bytes. Also, changing to another vendor’s core technology platform typically means adopting several of its ancillary products like branch teller and online and mobile banking systems, which further complicates the conversion process.

“It isn’t something to be taken lightly,” Quintin Sykes, a managing director at Scottsdale, Arizona-based consulting firm Cornerstone Advisors, says of the decision to switch core providers. “It is not something that should be driven by a single executive or the IT team or the operations team. Everybody has got to be on board as to why that change is occurring and what the benefits are…”

The Bank of Bennington, a $400 million asset mutual bank located in Bennington, Vermont, recently switched its core technology platform from Fiserv to Fidelity National Information Services, or FIS. President and Chief Executive Officer James Brown says that even successful conversions put an enormous strain on a bank’s staff.

“It’s not fun,” says Brown. “I have the advantage of having gone through two previous conversions in my career, one that was horrendous and one that was just horrible. [The core providers have] gotten better at it, but there’s no way to avoid the pain. There are going to be hiccups, things that no matter how you prepare are going to impact customers. There’s this turmoil, if you will, once you flip the switch, where everybody is trying to figure out how to do things and put out fires, but I will say [the conversation to FIS], in terms of how bad it could have been, was not bad at all.”

But even that conversion, while it went more smoothly than Brown’s previous experiences, put a lot of stress on the bank’s 60 employees. “There was a lot of overtime and a lot of management working different jobs to make sure our customers were taken care of,” he says.

Banks typically change their core providers for a couple of different reasons. If the bank has been executing an aggressive growth strategy, either organically or through an acquisition plan, it may simply have outgrown its current system. A lot of core providers can handle growth, particularly in the retail side of the bank, so that’s not usually the problem, Sykes says. Instead, the growth issue often comes down to the breadth of the bank’s product line and whether staying with its current core provider will allow it to expand its product set. When banks embark on a growth strategy, they don’t always consider whether their core data system can expand accordingly. “Usually they’re unable or just haven’t looked far enough ahead to realize they need it before they do,” Sykes explains. “The pain has set in by the time they reach a decision that they need to explore [switching to a new] core.”

Banks will also switch their core providers over price, especially if they have been with the same vendor through consecutive contracts and didn’t negotiate a lower price at renewal. “If any banker says price doesn’t have an impact on their decision, they’re not being honest,” says Stephen Heckard, a senior consultant at Louisville, Kentucky-based ProBank Austin.

Although the major core providers would no doubt argue differently, Heckard—who sold core systems for Fiserv for 12 years before becoming a consultant—says that each vendor has a platform that should meet any institution’s needs, and the deciding factor can be the difference in their respective cultures. And this speaks to a third common reason why banks will leave their core provider: unresolved service issues that leave the bank’s management team frustrated, angry and wanting to make a change.

“The smaller the bank, the more important the relationship is,” says Heckard. “When I talk about relationships, I’m also talking about emotions. They get played up in this. For a community bank of $500 million in assets, quite often if the vendor has stopped performing, there’s an emotional impact on the staff. And if the vendor is not servicing the customer’s needs in a holistic manner, and the relationship begins to degrade, then I do feel that eventually the technology that’s in place, while it may be solid, begins to break.”

Heckard says that core providers should understand their clients’ strategic objectives and business plans and be able to provide them with a roadmap on how their products and services can support their needs. “I don’t see that happening near enough,” he says. And if the service issues go unresolved long enough, the client may begin pulling back from the provider, almost like a disillusioned spouse in a failing marriage. “They may not be as actively attending user groups, national conferences and so forth,” Heckard says. “They don’t take advantage of all the training that’s available, so they become part of the problem too.”

Brown says that when Bank of Bennington’s service contract was coming up on its expiration date, his management team started working with Heckard to evaluate possible alternatives. “We needed to implement some technology upgrades,” he says. “We felt we were behind the curve. Something as simple as mobile banking, we didn’t have yet.” The management team ultimately chose FIS, with Brown citing customer service and cybersecurity as principal factors in the decision. The decision was less clear cut when it came to the actual technology, since each of the systems under consideration had their strengths and weaknesses. “I’m sure [the vendors] wouldn’t like to hear this but in a lot of ways a core is a core,” Brown says.

Heckard, who managed the request for proposal (RFP) process for Bennington, says that bank management teams should ask themselves three questions when choosing a new core provider. “The first one would be, have you exhausted every opportunity to remain with the present vendor?” he says. As a general rule, Heckard always includes the incumbent provider in the RFP process, and sometimes having the contract put out to bid can help resolve long-standing customer service issues. The second question would be, why was the new vendor selected? And the third question would be, how will the conversion restrict our activities over the next 18 months? For example, if the bank is considering an acquisition, or is pursuing an organic growth strategy, to what extent will the conversion interfere with those initiatives?

Heckard also covers the conversion process in every RFP “so that by the time the bank’s selection committee reads that document they know what’s ahead of them, they know the training requirements…they understand the impact on the bank.”

And sometimes a bank will decide at the 11th hour that a core conversion would place too much strain on its staff, and it ends up staying with its incumbent provider. Heckard recalls one bank he worked with that recently decided at the last moment not to switch, even though another vendor had put a very attractive financial offer on the table. “The president of the holding company told me, ‘Steve, we can’t do it. It’s just too much of an impact on our bank. We’ve got a main office remodel going on,’ and he went through about four other items,” Heckard says. “I thought, all of these were present before you started this. But sometimes they don’t realize that until they get involved in the process and understand the impact on their staff.”

Banking on the Cloud: Why Banks Should Embrace Cloud Technology



Cloud adoption has reached critical mass, with roughly 90 percent of businesses employing cloud technology in some facet of their organizations. The cloud presents opportunities for enhanced efficiencies and flexibility—without any security trade-offs—so it’s no surprise that we’re seeing more organizations shift to the software as a service (SaaS) model. But while we’ve seen the healthcare, legal and insurance industries evolve, banks have been more reluctant to adopt new technologies built outside of their own walls.

Why Banks Lag at Cloud Adoption
The banking industry is not known for being nimble. As one of the oldest, largest and most vital industries in the U.S. economy, banking has, in some ways, fallen victim to inertia—relying on traditional technologies and internal networks to disseminate its services. This is in large part due to the widely held belief that on-premise solutions are inherently more secure than the cloud because data lives in proprietary servers and systems, rather than a service provider’s environment. However, research shows that cyber attacks affect both environments, with on-premise users experiencing over twice as many web application attacks as service provider customers, on average.

Still, for many banks, the perceived risks of the cloud outweigh its forecasted benefits. In one survey, 73 percent of banks identified security concerns as the main reason for avoiding the cloud, while 63 percent listed privacy issues as their top worry. That perception is beginning to change, as the cloud’s business advantages have become too significant to ignore. A recent study found that big banks are expected to grow from as little as zero percent public cloud adoption to 30 percent by 2019—a dizzying adoption rate for an industry that still relies on legacy systems from the 1960s.

For those still wary of making the switch, here are three of the biggest benefits of moving to the cloud:

Security
Cloud technologies can boost your security in ways that on-premise systems cannot. Traditionally, to use a new offering, you install an on-premise server in your data center, then configure the network, firewall and secure access to the server. This stretches resources by increasing training requirements, which ultimately detracts from the goal of the offering. Thanks to economies of scale, cloud companies can own the server, the networks and the processes, making the entire offering more complete and secure.

With strict protocols and security certifications like SOC 2 and ISO 27001 built into many services, banks can ensure that the cloud is accessed and enabled securely with any solution provider they work with.

Understanding the value of security and the benefits that cloud technology brings to banks, a handful of institutions are leading the shift and others are expected to follow. Capital One Financial Corp., an early adopter of Amazon Web Services (AWS), has steadily built its infrastructure in the cloud over the past two years. The company continues to work closely with AWS on specific security and data protocols, allowing the company to operate more securely in the public cloud than it could have in its own data centers, according to Capital One CIO Rob Alexander.

Efficiency and Scalability
The cloud enables teams to be more agile than ever. The SaaS model gives teams the flexibility to deliver new iterations on demand. This access to real-time feedback empowers teams to ship updates more quickly and frequently, and to push the envelope so they are constantly improving products to align with what customers are looking for.

By leveraging the cloud to store complex data, organizations can meet ever-evolving regulatory compliance and governance rules mandating data protection. A recent example would be financial institutions working to comply with the EU’s General Data Protection Regulation. The ability to meet regulations can be sped up by a number of the cloud’s features, including built-in auditability for more clarity around your compliance status, and virtual infrastructure that reduces room for error.

On top of addressing infrastructure models, the cloud allows businesses to be elastic. For instance, a bank can scale up to handle the mass of credit card purchases on Cyber Monday and then scale back down, rather than buying new servers to cover that one-day-per-year peak.
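The elasticity point can be made concrete with a simple capacity calculation; the transaction volumes and per-server capacity below are invented for illustration, not real bank data:

```python
# Hypothetical sketch of elastic provisioning: size capacity to each
# day's demand instead of owning peak-day hardware year-round.
# All figures are invented for illustration.
import math

daily_tx_volume = {"typical_day": 120_000, "cyber_monday": 900_000}
capacity_per_server = 100_000  # transactions/day one server can handle

# An elastic cloud deployment provisions only what each day requires.
servers_needed = {
    day: math.ceil(volume / capacity_per_server)
    for day, volume in daily_tx_volume.items()
}
print(servers_needed)  # a typical day needs 2 servers, Cyber Monday 9
```

Under a fixed on-premise model, the bank would own peak capacity (nine servers in this toy example) all year; with elastic provisioning it pays for the extra capacity only on the peak day.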

Overhead Cost Savings
Switching from on-premise to cloud can mean significant savings on overhead costs.

When you work with a SaaS provider, you no longer need to invest in proprietary infrastructure. Instead, you’re able to access and maintain your data through your partner’s established environment. This cuts down on both the up-front capital costs associated with hardware and the continuous costs that eat up budget to keep hardware and software optimized and refreshed.

Rather than pay a flat fee to keep systems up and running, cloud customers can choose from a variety of metered, pay-per-use options: Salesforce and Microsoft Office 365 charge per seat, while AWS prices its infrastructure as a service (IaaS) by the hour. Pricing models vary, however, and some providers, such as Oracle, also carry high integration fees.
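The pay-per-use point can be illustrated with a back-of-the-envelope comparison; every figure below is an assumption for illustration, not a real quote from any provider:

```python
# Hypothetical comparison of a flat annual licensing cost versus a
# metered pay-per-hour cloud model. All figures are invented.
flat_annual_cost = 50_000.0        # assumed flat license + maintenance

hourly_rate = 1.50                 # assumed IaaS pay-per-hour rate
hours_used_per_month = 2_000       # actual metered consumption

metered_annual_cost = hourly_rate * hours_used_per_month * 12
savings = flat_annual_cost - metered_annual_cost

print(f"Metered annual cost: ${metered_annual_cost:,.0f}")
print(f"Annual savings vs. flat fee: ${savings:,.0f}")
```

The metered model only wins when actual consumption stays below the flat-fee break-even point, which is why usage forecasting matters when comparing contracts.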

By outsourcing services to the data center, you can also realize savings on staffing. On-premise technologies can require a team varying in size from one to dozens, depending on the bank’s size. Because your cloud provider takes on the computing, your internal team no longer has to worry about hardware refreshes or server and software updates, freeing up their time to focus on what matters most: your business. Cost savings can also be reinvested into the business to increase headcount, boost wages and drive product innovation.

Cloud technology has already been embraced by businesses in numerous industries, but banks have been slower to acknowledge its benefits. Now, as cloud’s positive impact on security, efficiency and cost come to the forefront, it’s becoming harder for banks to ignore the advantages. Already, we’re seeing early adopters reap the benefits, from a financial standpoint and innovation perspective, and in the coming years, we can expect to see banking in the cloud transition from a “nice-to-have” to a business-critical approach to moving up in the market.

Scotiabank Partners with Sensibill to Digitize and Track Receipts



It’s tax time again, and for many people across the U.S. and Canada that entails one major headache—organizing and managing receipts. Whether it’s an individual or business, keeping, organizing and categorizing receipts is critical to maximizing tax deductions, not to mention for good general fiscal management purposes.

However, one Canadian bank is partnering with a fintech innovator to make receipt management much more of a breeze for its customers. Just last year, Toronto-based Scotiabank announced a partnership with Canadian fintech company Sensibill to offer a mobile receipt management solution called eReceipts that makes it easier for Scotiabank customers to manage their finances. The eReceipts app serves as an extension to Scotiabank’s mobile banking application and digital wallet.

Scotiabank is one of Canada’s largest banks, serving more than 23 million customers in Canada and 50 other countries. At 184 years old, Scotiabank is older than Canada itself. With over $1 trillion in total assets, the bank invests more than $2 billion per year in technology initiatives. Partnering with Sensibill to create eReceipts was a natural fit: Sensibill is a Toronto-based startup incubated through Ryerson University’s Digital Media Zone initiative, and it has grown into a white-label provider of software solutions that help banking customers better manage receipts from both desktop and mobile.

While technology to aid receipt management has been available, it is still incredibly difficult to categorize and drill down into the detail of specific receipts, especially on a mobile device. What makes eReceipts so unique is that it is the first app to automatically match specific credit and debit card transactions to the right receipt. After making a purchase, customers take a photo of the receipt directly from the Scotiabank banking app. Then, through a combination of optical character recognition (OCR) and machine learning software, the receipt is matched to the proper transaction in the user’s account history. When users drill down into a transaction, the information from the receipt has already been extracted, structured and presented in a clear, easy-to-navigate format. Scotiabank customers can see all the receipt information they need without ever having to look at a piece of paper.
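The matching step can be illustrated with a deliberately simplified sketch. Real systems like eReceipts combine OCR output with machine learning; this toy version (all data invented) matches on exact amount within a small date window:

```python
# Toy sketch of matching an OCR-extracted receipt to a card transaction.
# All transactions, amounts and dates are invented for illustration;
# this is not Sensibill's actual matching algorithm.
from datetime import date

transactions = [
    {"id": "t1", "amount": 42.17, "date": date(2018, 3, 1), "merchant": "GROCERY 114"},
    {"id": "t2", "amount": 9.99,  "date": date(2018, 3, 2), "merchant": "COFFEE BAR"},
]

def match_receipt(receipt, transactions, window_days=2):
    """Return the first transaction whose amount equals the receipt total
    and whose posting date falls within the allowed window."""
    for txn in transactions:
        same_amount = abs(txn["amount"] - receipt["total"]) < 0.01
        close_date = abs((txn["date"] - receipt["date"]).days) <= window_days
        if same_amount and close_date:
            return txn
    return None

# Fields a receipt scan might yield; posting often lags purchase by a day.
receipt = {"total": 42.17, "date": date(2018, 2, 28)}
print(match_receipt(receipt, transactions)["id"])  # matches transaction t1
```

A production matcher has to cope with tips, currency conversion and split payments, which is where the machine learning the article mentions earns its keep.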

Scotiabank customers have been interacting with eReceipts an average of 38 times per month to track both personal and business expenses. So in addition to making their customers’ lives easier, eReceipts is increasing engagement with Scotiabank’s mobile application—and with it the potential to reduce overall customer attrition rates as users continue to rely on it. Receipts can also be categorized as business or personal, and can be annotated, tagged and stored in folders. In fact, around 48 percent of users utilize folders to organize expenses. Hashtags can also be assigned to receipts for ease of search purposes, along with receipt text itself being searchable. And when tax time rolls around, all receipts can be exported in PDF format, along with a matching Excel or CSV file to make preparation easier.

Scotiabank is the first of Canada’s five largest banks to roll out an application like eReceipts that can automatically match paper receipts to the corresponding transaction. Although there are solutions on the market that can capture receipts, eReceipts is the first to extract and contextualize data on such a granular level. Sensibill’s unique deep machine learning, combined with a powerful receipt processing engine, can even associate product names and SKUs with transactions. The result is that otherwise vague transactions become extremely clear when users begin to drill down. Usage of eReceipts has exceeded initial targets by upwards of 300 percent, with positive reviews and shares springing up organically.

In the future, Scotiabank may be able to leverage this additional data to improve customer experience and enhance revenue. Having access to consumer purchase history at the item level could help Scotiabank better understand, and anticipate, their customers’ needs and preferences. The goal is to better personalize the banking experience, and offer targeted banking products or services based on an analysis of receipt and purchasing history. For example, if Scotiabank notices that a couple is purchasing items like cribs, baby formula and diapers, it might assume there’s a baby on the way and begin marketing an education savings plan. In fact, Sensibill is already working to add an “insights” component for partners like Scotiabank, so that customer data generated by eReceipts can be more effectively extracted, organized and analyzed.

The partnership between Scotiabank and Sensibill is noteworthy because it tackles a problem that everyone seems to face in the physical world. With eReceipts, the two companies are taking a huge step towards helping people stay organized, maximize their tax benefits and know exactly how they’re spending their money.

And perhaps most importantly, eReceipts points to a world where we can finally toss that musty old receipt-filled shoebox in the closet.

This is one of 10 case studies that focus on examples of successful innovation between banks and financial technology companies working in partnership. The participants featured in this article were finalists at the 2017 Best of FinXTech Awards.

The Time Is Now for Artificial Intelligence in Commercial Banking


Years ago, I had the good fortune to work for a bank with pristine credit quality. This squeaky-clean portfolio was fiercely protected by Ed, one of those classic, old-school credit guys. Ed had minimal formal credit training, and the bank had no sophisticated modeling or algorithms for monitoring risk. Instead, we relied on Ed’s gut instincts.

Ed had a way of sniffing out bad deals, quickly spotting flaws that our analysts had missed after hours of work. He couldn’t always put his finger on why a deal was bad, but Ed had learned to trust himself when something felt “off.” We passed on a lot of deals based on those feelings, and our competitors gladly jumped on them. A lot of them ended up defaulting.

Obviously, Ed wasn’t some kind of Nostradamus of banking. Instead, he was spotting patterns and correlations, even if he was doing it subconsciously. He knew he’d seen similar situations before, and they had ended badly. Most banks used to be run this way. It was one of those approaches that worked well—until it didn’t.

When Ed’s Not Enough
Why? Because some banks didn’t have as good a version of Ed. And some banks outgrew their Ed, and got big enough that they couldn’t give the personal smell test to every deal. Much of the industry simply ran out of Eds who had cut their teeth in the bad times. A lot of banks were using an Ed who had never seen a true credit correction.

It also turns out that humans are actually pretty bad at spotting and acting on patterns; the lizard brain leads us astray far more often than we realize. It was true even for us; Ed kept our portfolio safe, but he did so at a huge opportunity cost. The growth we eked out was slow and painful, and being a stickler on quality meant we passed on a lot of profitable business.

The surprising thing isn’t that banks still handle credit risk this way; the surprise is how many other kinds of decisions use the same approach. Most banks have an Ed for credit, pricing, investments, security, and every other significant function they handle. And almost all of them are, when you get right down to it, flying by the seat of their pants.

Bankers have spent decades building ever more sophisticated tools for measuring, monitoring, and pricing risk, but eventually, in every meaningful transaction, a human makes the final decision. Like my old colleague, Ed, they base their choices on how many deals like this they have seen, and what the outcomes of those deals were.

These bankers are limited by two things. First, how many experiences do they have that fit the exact same criteria? Usually it numbers in the dozens or low hundreds, and it’s not enough to be statistically significant. Second, are they pulling off the Herculean task of avoiding all the cruel tricks our minds play on us? The lizard brain—that part of the brain that reacts based on instinct—is a powerful foe to overcome.

Artificial Intelligence’s Time Has Come
This shortcoming, in a nutshell, is why artificial intelligence (AI) and machine learning have become the latest craze in technology. Digital assistants like Siri, Cortana and Alexa are popping up in new places every day, and they are actually learning as we interact with them. Applications are performing automated tasks for us. Our photo software is learning to recognize family members, our calendars get automatically updated by things that land in our email, and heck, even our cars are learning to drive.

The proliferation of the cloud and the ever-falling costs of both data storage and computing power mean that AI is now commercially viable for all kinds of exciting applications. And that includes commercial banking.

Banking & AI = Peanut Butter & Jelly
In fact, we think banking might just be the perfect use case for AI. All those human decisions, influenced up to now by gut feel and scattered data, can be augmented by machines. AI can combine those disparate data sources and glean new insights that have been beyond the grasp of humans. Those insights can then be presented to humans with real context, so decisions are better, faster and more informed.

The result will be banks that are more profitable, have less risk, and can provide customized service to their customers exactly how they need it, when they need it most.

Using Big Data to Assess Credit Quality for CECL


The new Financial Accounting Standards Board (FASB) rules for estimating expected credit losses present banks with a wide variety of challenges as they work toward compliance.

New Calculation Methods Require New Data
The new FASB standard replaces the incurred loss model for estimating credit losses with the new current expected credit loss (CECL) model. Although the new model will apply to many types of financial assets that are measured at amortized cost, the largest impact for many lenders will be on the allowance for loan and lease losses (ALLL).

Under the CECL model, reporting organizations will make adjustments to their historical loss picture to highlight differences between the risk characteristics of their current portfolio and the risk characteristics of the assets on which their historical losses are based. The information considered includes prior portfolio composition, past events that affected the historic loss, management’s assessment of current conditions and current portfolio composition, and forecast information that the FASB describes as reasonable and supportable.

To develop and support the expected credit losses and any adjustments to historical loss data, banks will need to access a wider array of data that is more forward-looking than the simpler incurred loss model.
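Conceptually, the loss-rate version of the calculation looks something like the sketch below. The pool, rates and adjustment factors are invented for illustration, and actual CECL methodologies (vintage analysis, discounted cash flow, PD/LGD and others) vary widely; the sketch only shows the basic mechanic of a historical rate plus signed adjustments for current conditions and forecasts.

```python
def expected_credit_loss(amortized_cost: float,
                         historical_loss_rate: float,
                         qualitative_adjustments: dict) -> float:
    """Loss-rate method sketch: the historical rate is adjusted, up or down,
    for differences between today's portfolio and conditions and the
    conditions underlying the historical losses."""
    adjusted_rate = historical_loss_rate + sum(qualitative_adjustments.values())
    return amortized_cost * max(adjusted_rate, 0.0)

# A hypothetical commercial real estate pool: 1.2% historical loss rate,
# adjusted for portfolio composition, current conditions and a
# "reasonable and supportable" forecast.
allowance = expected_credit_loss(
    amortized_cost=50_000_000,
    historical_loss_rate=0.012,
    qualitative_adjustments={
        "portfolio_mix_shift": 0.002,    # riskier mix than the historical pool
        "current_conditions": -0.001,    # underwriting is tighter today
        "forecast_deterioration": 0.003, # forecast macro weakening
    },
)
# adjusted rate = 1.6%, allowance = $800,000 on a $50M pool
```

The point of the data-inventory work described above is precisely to make the historical rate and each adjustment defensible: every factor needs portfolio data behind it.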

Internal Data Inventory: The Clock is Running
Although most of the data needed to perform these various pooling, disclosure and expected credit loss calculations can be found somewhere, in some form, within most banks’ systems, these disparate systems generally are not well integrated. In addition, many data points such as customer financial ratios and other credit loss characteristics are regularly updated and replaced, which can make it impossible to track the historical data needed for determining trends and calculating adjustments. Other customer-specific credit loss characteristics that may be used in loan origination today might not be updated to enable use in expected credit loss models in the future.

Regardless of the specific deadlines that apply to each type of entity, all organizations should start capturing and retaining certain types of financial asset and credit data. These data fields must be captured and maintained permanently over the life of each asset in order to enable appropriate pooling and disclosure and to establish the historical performance trends and loss patterns that will be needed to perform the new expected loss calculations. Internal data elements should focus on risks identified in the portfolio and modeling techniques the organization finds best suited for measuring the risks.

External Economic Data
In addition to locating, capturing, and retaining internal loan portfolio data, banks also must make adjustments to reflect how current conditions and reasonable and supportable forecasts differ from the conditions that existed when the historical loss information was evaluated.

A variety of external macroeconomic conditions can affect expected portfolio performance. Although a few of the largest national banking organizations engage in sophisticated economic forecasting, the vast majority of banks will need to access reliable information from external sources that meet the definition of “reasonable and supportable.”

A good place to start is by reviewing the baseline domestic macroeconomic variables provided by the Office of the Comptroller of the Currency (OCC) for Comprehensive Capital Analysis and Review (CCAR) and Dodd-Frank stress testing (DFAST) purposes. Because regulators use these variables to develop economic scenarios, they provide a reasonable starting point for identifying relevant historical economic variables and for understanding the regulatory perspective on baseline future economic conditions.

Broad national metrics—such as disposable income growth, unemployment, and housing prices—need to be augmented by comparable local and regional indexes. Data from sources such as the Federal Deposit Insurance Corporation’s quarterly Consolidated Report of Condition and Income (otherwise known as the call report) and Federal Reserve Economic Data (FRED), maintained by the Federal Reserve Bank of St. Louis, also can be useful.
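As a sketch of how an external series might feed the forecast adjustment, the fragment below fits a simple linear relationship between a pool’s historical annual loss rates and a macro driver such as regional unemployment, then applies a forecast value of that driver. All numbers are invented, and real models are far more involved, but the mechanics are the same: internal loss history joined to external economic data.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b          # intercept, slope

# Invented history: regional unemployment rate (%) vs. the pool's annual loss rate.
unemployment = [4.5, 5.0, 6.5, 9.0, 7.5]
loss_rate    = [0.008, 0.009, 0.012, 0.021, 0.015]

a, b = fit_line(unemployment, loss_rate)

# Apply a forecast unemployment figure from a "reasonable and supportable" source.
forecast_unemployment = 6.0
forecast_loss_rate = a + b * forecast_unemployment
```

Documenting the source series and the fitted relationship is what makes such an adjustment supportable when examiners ask how the forecast number was derived.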

Data List for CECL Compliance

[Image: critical internal data elements for CECL compliance]

Looking Beyond Compliance
The new FASB reporting standard for credit losses will require banks to present expected losses in a timelier manner, which in turn will provide investors with better information about expected losses. While this new standard presents organizations of all sizes with some significant initial compliance challenges, it also can be viewed as an opportunity to improve performance and upgrade management capabilities.

By understanding the current availability and limitations of portfolio data and by improving the reliability and accuracy of various data elements, banks can be prepared to manage their portfolios in a way that improves income and maximizes capital efficiency.

Growing the Loan Book Through Automation


There are a million reasons for leveraging fintech to enhance a financial institution’s small business lending experience. To name a few, there’s better efficiency, customer convenience, profitability, speed to decision, speed to capital, cost reductions and a much-improved overall customer experience. However, one that often gets lost in the fray is the impact technology will have on the day-to-day productivity, motivation and morale of the bankers who work so hard to source and sell small business loans. This “banker experience,” as it is known, plays a huge role in sales performance, retention, revenue generation and employee satisfaction.

The reality of a day in the life of a small business lender is that a surprisingly small amount of time is spent on sourcing new opportunities or even cross-solving to sell deeper into an existing relationship. Because they are shackled with the responsibility of shepherding deals through the multiple steps in the lending process, the more loan deals a banker has, the less time he or she is able to spend growing the book of business. So how are they spending their time?

  • As many as 80 percent of applications come in either incomplete or with an error on them, delaying the decisioning process and requiring the banker to go back to the client again and again.
  • Unique borrowing situations prompt the back office to request additional information requiring the banker to reach out and coordinate the collection of the information.
  • The collection of documents in the “docs and due diligence” phase of the approval process is tedious and time consuming. Bankers spend a great deal of time reaching out to applicants asking for things like: entity docs, insurance certificates, tax returns and so on.
  • Multiple teams and individuals touch each deal and as a result, things get lost, forcing the banker to invest a great deal of time and energy babysitting deals and checking on their progress from application to closing.
  • Much of the processing time is dependent upon the borrower’s promptness in getting requested information back to the bank. Bankers spend countless hours making multiple calls to collect information from clients.

I ran small business sales for a $150 billion asset institution, and our data proved that whenever a banker had as little as two loan deals in the workflow process, their new business acquisition productivity was reduced by 50 percent. Bankers with five deals in the process had their acquisition productivity diminished by 75 to 80 percent. That’s because they expend all their time and energy shepherding deals through the various stages of the process, gathering additional documentation, or monitoring the progress of each deal.

All of this is challenging for one person to do—but simple for technology to handle automatically, accurately and consistently. Technology can ensure an application is complete before it is submitted. It can ping the client for any and all documentation or data required. It can communicate progress and monitor a deal at every step in the lending process. Technology can also facilitate the collection of more and better data and translate that data into information that enables the banker to add value by asking great questions that help solve more problems for the customer.

When technology is used end-to-end, from application to closing, bankers are able to focus on the important things like:

  • Sourcing new opportunities.
  • Cross-solving for existing customers.
  • Preparing for sales calls and follow-up activities to advance the sales process.
  • Providing clients and prospects the value that earns trust and feeds future revenue.
  • Growing their loan book, and their portfolio revenue.

Technology makes the banker’s life simpler. When bankers are able to do what they do best, which is sell, job satisfaction, performance, job retention and morale go through the roof. And that positivity translates into improvements in the customer experience, and increases in revenues for the institution.

Core Provider Ranking: FIS Satisfies More Bank Executives


Bank executives don’t exactly give their core providers a ringing endorsement in Bank Director’s Core Provider Ranking, conducted in September and October 2016, particularly when it comes to these companies’ willingness to integrate with third-party applications and their ability to offer innovative solutions.

Eighty-six executives, including chief executive officers, chief information officers and chief technology officers, rated the overall performance of their bank’s current core provider, and within individual categories that explored aspects of the provider’s service to the client bank, on a scale of 1 to 10, with 10 indicating the highest level of satisfaction. An average score was then calculated based on the individual ratings. Participants were not asked to rate other core providers. The executives surveyed represent banks between $100 million and $20 billion in assets. Forty percent of respondents indicate that Fiserv is their bank’s core provider, while 26 percent use FIS and 19 percent Jack Henry. Sixteen percent indicate that they use another provider.

While respondents express some disappointment in what is likely their biggest vendor relationship, one core provider does come out on top.

#1 FIS

Average overall score: 7.18

FIS, headquartered in Jacksonville, Florida, is the only core provider that a majority of its customers—67 percent—say keeps pace with innovations in the marketplace.

FIS has been the most active acquirer of the big three core providers. David Albertazzi, a senior analyst at Aite Group, says FIS has a great track record of acquiring and integrating innovative companies into the firm’s suite of products. Since 2012, FIS has acquired six firms, according to Crunchbase, a data firm that tracks the technology sector. These include two compliance solutions, a payments technology company and a mobile banking solution. Its most recent acquisition was the software firm SunGard, in 2015.

FIS features nine different core systems in the U.S. The company came out on top within all individual categories but one, rating highest for being a cost effective solution, communicating with clients about new products and updates, providing high quality support, offering innovative solutions and for the company’s willingness to integrate with third-party applications.

#2 Jack Henry & Associates

Average overall score: 6.63

Jack Henry, based in Monett, Missouri, came in just behind FIS in many of the individual categories, but rated highest of the three when it comes to being easy to contact and responsive when issues and problems arise. Albertazzi says customer service is a core tenet for the company, and Jack Henry regularly measures how well its IT and support staff are performing. Those efforts are clearly recognized in the industry.

Jack Henry offers a more streamlined product selection compared to FIS and Fiserv—according to Aite, just six core systems. Recent acquisitions include Banno, in 2014, a mobile account platform, and Bayside Business Solutions in 2015, which expanded the provider’s commercial lending suite.

#3 Fiserv

Average overall score: 4.97

Brookfield, Wisconsin-based Fiserv features 18 core systems, according to Aite—the most of the three core providers. That variety, along with its ubiquity in the banking space—Fiserv serves one-third of all U.S. banks and credit unions—may account in part for its low rating.

Client perceptions of their core provider’s performance can be muddied by several factors, including the age of their core system, says Albertazzi. The client bank may be loath to take on a conversion, and instead remain on an old system that the provider is no longer fully supporting. Bank Director did not rank individual systems, but rather the companies’ performance overall. A client running an outdated, basic core would be more apt to criticize a vendor than one using a shiny new system tailored to integrate with the latest-and-greatest fintech solution on the market.

If acquisitions have the potential to jumpstart innovation for legacy core companies, Fiserv could see a boost soon. Fiserv has been a significantly less active acquirer in recent years, compared to Jack Henry and FIS, with just one acquisition in 2013. But Fiserv recently acquired Online Banking Solutions, an Atlanta, Georgia-based provider of business banking technology, which promises to deepen Fiserv’s relationships with commercial banks.

As a group, other providers averaged a score of 6.07, just above the overall average for all providers of 6.02. One-quarter of retail banks could end up opting for startup providers for their online and mobile banking solutions by 2019, predicts Stessa Cohen, research director at Gartner, a research and advisory firm. Currently, 96 percent of banks rely on their core provider for services outside of core banking, according to Bank Director’s 2016 Technology Survey. As banks open up to other technology vendors, it’s possible they’ll lessen their dependency on the legacy core providers, and even open up to newer core solutions.