Realign Your Bank’s Operating Model Before It’s Too Late

The banking industry and its underlying operating model are facing pressure from multiple angles. The advent of new technologies, including blockchain and artificial intelligence, has started to impact the business models of banks and will continue to do so.

Meanwhile, new market entrants with disruptive business models, including fintech startups and large tech companies, have put pressure on incumbent banks and their strategies. A loss of customer trust has also left traditional banks vulnerable, forcing them to focus on retaining existing clients and acquiring new ones.

In response to looming industry challenges, banks have begun to review and adapt their business models. Many banks have already adjusted to the influence of technology, or are in the process of doing so. Unfortunately, corresponding changes to the underlying operating models often lag behind technology changes, creating a strong need to re-align this part of the bank’s core functions.

So what does “re-align” mean from an IT architecture point of view?

Impact on Systems
To keep up with fast-paced digital innovation, investments have largely focused on end-user applications. This has helped banks appear innovative and digitally friendly. In many cases, however, these actions have led to operational inefficiencies, for several reasons.

One is a lack of integration between applications, resulting in siloed data flow. More often, though, the reason is the legacy core, which does not allow seamless integration of tools from front to back of an organization. Further, M&A activity has led many banks to have several core legacy systems, and often these systems don’t integrate well or exist with multiple back-end systems that cater to a specific set of products. This complicates the creation of a holistic view of information for both the client and financial advisor.

There are two ways of addressing the above-mentioned challenges to remain successful in the long-run:

  1. Microservice driven architecture
  2. Core Banking System modernization

Microservice-driven architecture
Establishing an ecosystem of software partners is important to excel amid rapid innovation. Banks can't do all application development in house as they did in the past. They therefore need a microservice-driven architecture: a set of independent yet cohesive applications, each performing a single business function for the bank.

The innovation cycles of core banking systems are less frequent than those of client- and advisor-facing applications. To guarantee seamless integration of the two, build up your architecture so it fully supports APIs (application programming interfaces). The API concept is nothing new, but standardized interfaces are what make integration truly seamless, saving both time and money. This can be done through an integration layer that accommodates new solutions and complies with recent market directives such as PSD2 in Europe.
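The adapter idea behind such an integration layer can be sketched in a few lines of Python. Everything here is an illustrative assumption (the class names, the fake legacy record format, the single balance call); a real layer would expose REST endpoints and cover the full product catalog:

```python
from abc import ABC, abstractmethod


class CoreBankingAdapter(ABC):
    """Standardized interface every core (legacy or modern) must expose."""

    @abstractmethod
    def get_balance(self, account_id: str) -> float:
        ...


class LegacyCoreAdapter(CoreBankingAdapter):
    """Wraps a legacy system's proprietary record format behind the standard interface."""

    def __init__(self, legacy_records: dict):
        self._records = legacy_records  # stand-in for the legacy back end

    def get_balance(self, account_id: str) -> float:
        # Translate the standardized call into the legacy system's own key format
        return self._records[f"ACCT-{account_id}"]["bal"]


class ApiLayer:
    """The integration layer that client- and advisor-facing apps talk to."""

    def __init__(self, adapter: CoreBankingAdapter):
        self._adapter = adapter

    def handle_balance_request(self, account_id: str) -> dict:
        return {"account": account_id, "balance": self._adapter.get_balance(account_id)}


legacy = LegacyCoreAdapter({"ACCT-1001": {"bal": 2500.0}})
api = ApiLayer(legacy)
print(api.handle_balance_request("1001"))  # {'account': '1001', 'balance': 2500.0}
```

Because every core, old or new, is wrapped behind the same `CoreBankingAdapter` contract, front-end applications never need to know which back end they are talking to, which is the point of the standardized interface.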


Core Banking System Modernization
Banks spend a significant share of their IT budget simply running existing IT systems, which leaves only a small portion for modernization.

A simple upgrade of your core banking system version most likely won’t have the desired impact in truly digitizing processes from front to back. Thus, banks should consider replacing their legacy core banking system(s) to build the base layer of future innovation. This can offer new opportunities to consolidate multiple legacy systems, which can reduce operational expenditures while mitigating operational risks. In addition, a core banking replacement allows for the business to scale much easier as it grows.

A modern core banking system is designed and built in a modular way, giving the bank the flexibility to decide whether a specific module will be part of the core itself or whether an external solution will be interfaced instead. The result is a hybrid model: best-of-breed applications combined with an all-in-one core banking system.
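One way to picture that modular, hybrid design is a registry that maps each banking capability to whichever implementation the bank chooses, in-core or external. This is a minimal Python sketch under assumed names; real core banking modules are far richer than single functions:

```python
class ModuleRegistry:
    """Maps each banking capability to an in-core or external implementation."""

    def __init__(self):
        self._modules = {}

    def register(self, capability: str, provider):
        # Swapping a module means re-registering the capability; callers are unaffected.
        self._modules[capability] = provider

    def run(self, capability: str, *args):
        return self._modules[capability](*args)


def in_core_payments(amount):
    # Module shipped with the core banking system
    return f"core processed payment of {amount}"


def external_fx_engine(amount, ccy):
    # Best-of-breed vendor solution, interfaced in from outside the core
    return f"vendor converted {amount} to {ccy}"


registry = ModuleRegistry()
registry.register("payments", in_core_payments)
registry.register("fx", external_fx_engine)

print(registry.run("payments", 100))
print(registry.run("fx", 100, "EUR"))
```

The design choice is that the core only dictates the interface for each capability, not its implementation, so a bank can keep payments in-core while buying its FX engine from a specialist vendor.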

Investing In Your Core Can Save You
Core banking system modernization and adoption of the microservice-driven architecture are major investments in re-aligning a bank’s operating model. However, given the rapid technological innovation cycles, investments will pay off in improved operational efficiency and lower costs.

Most importantly, re-aligning the operating model will increase the innovation capabilities, ultimately resulting in a positive influence on the top line through better client experiences.

Regulatory Issues to Watch In 2018

As 2018 unfolds, all eyes in the financial services industry remain on Washington, D.C. In addition to monitoring legislative moves toward regulatory reform and leadership changes at federal regulatory agencies, bank executives are also looking for indications of expected areas of regulatory focus in the near term.

Regulatory Relief and Leadership Changes
Both the U.S. House of Representatives and the Senate began 2018 with a renewed focus on regulatory reform, which includes rollbacks of some of the more controversial provisions of the Dodd-Frank Wall Street Reform and Consumer Protection Act, the sweeping reform passed after the 2008 financial crisis. These legislative actions are ongoing, and the final outcomes remain uncertain. Moreover, even after a final bill is signed, regulatory agencies will need time to incorporate the results into their supervisory efforts and exam processes.

Meanwhile, the federal financial institution regulatory agencies are adjusting to recent leadership changes. The Federal Reserve (Fed), Office of the Comptroller of the Currency (OCC), Federal Deposit Insurance Corporation (FDIC), National Credit Union Administration (NCUA), and Consumer Financial Protection Bureau (CFPB) have new leaders in place or forthcoming, some of whom have been vocal supporters of a more “common sense” approach to financial regulation and who generally are supportive of regulatory relief. In the case of the CFPB, the ultimate direction of the agency could remain uncertain until a permanent director is appointed later in 2018.

Regulators’ Priorities in 2018
Notwithstanding the regulatory reform efforts, following are some areas likely to draw the most intense scrutiny from regulatory agencies during 2018 examination cycles:

Credit-related issues. While asset quality continues to be generally sound industrywide, concerns over deteriorating underwriting standards and credit concentrations continue to attract significant regulatory attention, accounting for the largest share of matters requiring attention (MRAs) and matters requiring board attention (MRBAs).

The federal banking regulators have encouraged banks in recent months to maintain sound credit standards within risk tolerances, understand the potential credit risks that might be exposed if the economy weakens, and generally strengthen their credit risk management systems by incorporating forward-looking risk indicators and establishing a sound governance framework. At the portfolio level, regulators are particularly alert to high concentrations in commercial real estate, commercial and industrial, agriculture, and auto loans, according to the FDIC.

Information technology and cybersecurity risk. The Federal Financial Institutions Examination Council (FFIEC) updated its Cybersecurity Assessment Tool in May 2017. Although its use is voluntary, federal and state banking regulators typically consider a bank’s use of the FFIEC tool or some other recognized assessment or framework as part of their assessment of an organization’s cybersecurity risk management, controls, and resilience.

On a broader scale, in February 2018, the Department of Justice announced a new cybersecurity task force. Although the task force is not directed specifically at the financial services industry, its first report, expected to be released this summer, could provide useful insight into the scope of the task force’s activities and potential guidance into what types of regulatory actions and controls to expect in the coming years.

Bank Secrecy Act and anti-money laundering (BSA/AML) compliance. The industry has seen a steady increase in enforcement actions, some of which have included severe sanctions, when regulators perceived that banks had pared back resources in this area too severely. Compliance with Office of Foreign Assets Control (OFAC) requirements and efforts to prevent terrorist financing also continue to draw regulatory scrutiny.

Consumer lending practices. Regulatory priorities in this area are likely to remain somewhat fluid given the leadership changes occurring at the CFPB, where a permanent director is to be appointed by September. Additionally, legislative efforts that could affect the structure and authority of the bureau also are underway.

Third-party and vendor risk management. It has been nearly five years since the OCC released OCC Bulletin 2013-29, which expanded the scope of banks’ third-party risk management responsibilities and established the expectation for a formal, enterprise-wide third-party risk management effort. Since then, regulatory agencies have issued several follow-up publications, such as OCC Bulletin 2017-7, which spells out supplemental exam procedures. Also in 2017, the FDIC’s Office of Inspector General issued a report with guidance regarding third-party contract terms, business continuity planning, and incident response provisions, and the Fed published an article, “The Importance of Third-Party Vendor Risk Management Programs,” which includes a useful overview of third-party risk issues.

Despite the industry’s hopes for regulatory relief in some areas, all financial services organizations should continue to focus on maintaining sound risk management policies and practices that reflect today’s environment of continuing change and growing competitive pressures.

The Advantages of Nearshoring


For tech companies, the main allure of outsourcing lies in the promise of improved cost efficiency. Outsourcing’s popular cousin, nearshoring, has been a solid solution for IT companies over the years and still represents a viable staffing option. But the relevance of nearshoring will most likely increase in the months to come because of the Trump administration’s plans for the H-1B visa program, which is vital to the high tech industry.

Technology enterprises are always transforming, evolving and researching to improve themselves, which is why these companies spend generous amounts of money on recruiting the best teams.

That’s where the outsourcing concept comes to life: a company engages another organization to do some of its work rather than using its own in-house employees. Teams built across the border can get the same job done less expensively while addressing issues that IT companies face in the current environment.

The tech world is known for pushing past what’s previously been established to pursue a different norm. That’s why IT is so keen to look beyond borders to find the most qualified talent, without regard to geographical limitations. Having the entire world as your contracting pool is the best way to face one massive issue in the tech world: the shortage of talent relative to available vacancies. IT companies have many seats to fill, and talent is getting harder to find, especially when time-worn, traditional methods of hiring are used.

What nearshoring offers is a large selection of high-quality candidates with deep knowledge, a high level of self-motivation and the diligence necessary to work within a multinational company. How can you resist enthusiasm and quality? We can’t ignore that immigrants in Silicon Valley created more than half (44 of 87) of America’s startup companies valued at $1 billion or more, according to a 2016 National Foundation for American Policy brief.

Cost Advantage
It’s the holy grail of arguments for outsourcing, and the reason that companies get interested in hiring people abroad in the first place. To assemble a team of great value simply costs less in Singapore or Argentina than it does in the U.S.

Another consideration for nearshoring companies is the expense of building a business in some cities as opposed to others. Just compare two of the biggest IT destinations: It would cost roughly 66 percent more to maintain the same lifestyle in San Francisco—considered by many to be sacred territory for tech companies—than in Buenos Aires. These cost savings would have a significant effect on a company’s bottom line.

The Importance of Proximity
Hiring teams outside the U.S. has become a must for developing growth and building a fully functioning company. With more work and additional responsibilities, there needs to be increased communication with improved methods of delivery. A lack of time overlap due to widely differing time zones can result in high costs and reduced efficiency.

When teams share time zones and are relatively close, geographically speaking, they can more easily coordinate meetings, book flights and share project progress. This translates into better work efficiency because time management goes hand in hand with budgets and deadlines.

Culture Matters
As stated before, effective communication is the foundation of workplace productivity, and its importance increases significantly when you have employees in distant places. The presiding culture where your remote teams are based also matters, however. Nearshoring gives businesses the ability to hire people who share similar values, work habits and sense of urgency, which is key to building trust in how team members abroad approach a task or face a challenge.

Even with all the advantages that nearshoring specific tasks and projects can provide, many companies have yet to try it. This year represents an opportunity to invest in nearshoring teams, specifically in the tech industry, which is currently experiencing widespread uncertainty in response to President Trump’s clear intentions of reforming the H-1B visa program.

Hiring a Chief Technology Officer

Bob DiCosola, executive vice president of Old Second Bancorp, a $2.2 billion asset holding company in Aurora, Illinois, talks with Bank Director digital magazine Editor Naomi Snyder about hiring a chief technology officer with a business background and what the bank will need going forward.

DiCosola briefly touches on the following:

  • What types of information technology people the bank needs
  • Using an in-house advisory team of millennials
  • The bank’s new IT business plan

This video was first published in Bank Director digital magazine’s Tech Issue in December.

Successful Tech Implementations Are About People, Not Platforms


To set the stage for a successful technology implementation, it takes more than just training your staff or setting up a platform. You also need to understand your firm’s strategy, culture and people—and know how the new technology will enhance all three.

Increased competition, shifting customer expectations and more diligent enforcement of expanding regulation are making the commercial lending space a tougher place to do business.

It’s true that commercial loan growth is expected to hit 11 percent in 2016, but that impressive growth belies a challenging market environment: lower interest rates and higher costs of doing business are squeezing lenders’ margins. To paraphrase an old joke, financial institutions need to avoid losing money on every loan while “making it up in volume.”

Financial institutions that want to avoid becoming a punchline are responding strategically, investing in core technologies and leveraging the new analytical capabilities of solutions to drive product-level profitability.

Due to the increased cost of doing business and the need to operate in a leaner environment, most financial institutions do not possess the internal expertise needed to properly plan and execute enterprise-wide process change. This lack of experience leads to longer implementation timelines and reduced buy-in from end users and management.

According to the technology and consulting firm CEB, a vast portion of business-led technology adoption happens without the input of information technology, with nearly half of originators saying they are willing to forgo quality for speed. This lack of planning is showing up in the final output. Eighty-five percent of the “stall points” in the adoption of new technology result from a lack of planning, with just 15 percent related to issues around the implementation itself.

So, what are the steps that your financial institution needs to take to ensure the success of your implementation?

Identify What Change is Needed: The institution needs to have a clear idea of what needs to change, and the metrics by which the success of the change will be measured. To make this happen, it’s important to have an executive steering committee, to get senior-level buy-in around a common goal.

Organize the Core Team: Next, it’s important for the institution to assemble a wide variety of cross-functional teams to allow free-thought around how paradigms at the firm could change. Effective tech implementations mean new processes, not just swapping one platform for another. To ensure that those paradigms work across the organization, everyone who will interact with the new system needs representation. For a commercial implementation, this means including representatives from the front line, such as relationship managers, all the way to the back-office, such as loan processors.

Choose a Champion: Financial institutions have to make a careful choice of the person who will be the face of change. The final candidate should be a strong communicator who can explain the benefits of the change to everyone in the organization.

Thoroughly Scope Business Requirements: Conduct a full scoping and planning session with your technology provider to understand business rules, as well as the existing systems and processes that need to change as a result of implementation.

Identify Key Implementation Resources: The financial institution needs to allocate the correct resources to maximize productivity and efficiency throughout the implementation. The creation of an implementation steering committee (separate from the executive steering committee we discussed above) can also help minimize design changes that need to occur after the implementation begins. This steering committee should also ensure that the changes have no unintended consequences for the organization.

Create Program/Project Plans: Will the project be an enterprise-wide implementation or focused on a specific area? Is the organization better suited to a waterfall or agile implementation methodology? (A waterfall implementation tends to be linear and sequential, while an agile approach is incremental and iterative, with processes occurring in parallel.) Will the rollout of the new technology be a “big bang” or phased? These issues need to be addressed in advance, along with the standard project timeline.

Most importantly, financial institutions need to work deliberately and efficiently, and avoid hurrying the process. Rushing into a mistake will delay the outcome longer than working slowly and carefully from the start. As they say in the Navy SEALs: “Slow is smooth, and smooth is fast.”

Careful planning can significantly reduce the amount of time that a financial institution requires for the implementation, and speed up the time to realize the benefits. The key is planning right—from the beginning.

Don’t Crash and Burn: Choosing a Data Center Wisely

We live in an always-on world where transactions are made around the clock. When it comes to online and mobile banking, consumers expect their information to be available where and when they need it. Financial institutions must strike a balance between providing consistent, innovative services and maintaining highly secure systems.

According to a Celent 2013 report on IT spending, the business of maintaining existing systems to simply “keep the lights on” consumes 77 percent of the information technology (IT) budget. This doesn’t leave much room for IT staff to be proactive, and when downtime occurs, the immediate negative impact is significant.

Downtime costs include loss of business, potential maintenance fees, and additional costs incurred even after service has been restored. This doesn’t factor in the harder-to-measure costs to brand and customer loyalty, including lost revenue, reputational damage and lost employee productivity.

A growing number of banks and credit unions have been plagued by service outages. According to Keynote Systems, Inc., a California company that tracks service disruptions, more than 70 percent of banking outages are caused by computer changes, traffic overloads, upgrades gone awry, and other technical issues.

How Do You Measure the Impact of an Outage?
A study conducted by CA Technologies of 200 North American companies provides a broad indication of the impact of IT downtime on community banks and credit unions. The company found that financial institutions are hit harder than other businesses by outages—and that smaller companies lose a higher percentage of their revenue during downtime and recovery than larger enterprises.

As banks and credit unions encourage customers to rely on online and mobile banking to meet their financial needs, these outages become much more than just a big inconvenience for customers. To more accurately understand the total impact of the loss, consider what your brand is worth. The long-term effect of a damaged reputation can have a significant impact on revenue and profitability.

How Can You Address the Issue?
How should financial institutions assess and address the potential costs that downtime and outages pose to the business?

Many credit unions and banks have turned to third-party vendors to host their self-service banking solutions such as mobile or online banking. The attractions of this arrangement are many: they include lower cost of ownership, access to full-featured online/mobile solutions, and the ability to redirect resources to growing the business.

The risk is that these financial institutions are placing their hard-earned consumer relationships at the mercy of their vendor’s expertise and facility infrastructure.

Choosing a Data Center You Can Trust
When financial institutions are dissatisfied with their own solution or the performance of their third-party vendors, the first order of business is to identify a vendor with a data center that comes closest to being technologically impregnable. There are three main criteria that should be used to evaluate and understand the data center’s ability to protect the consumer relationship.

  1. Availability
  2. Security and compliance
  3. Service response time/continuity

What questions should a bank or credit union be asking to ensure these criteria are met?

Availability
  • What physical characteristics of the data center could impact uptime?
  • Does the system have the reserve capacity to accommodate sudden surges in demand?
  • Does the data center have a single point of control?
  • Does the vendor use a stringent software management and change control process (a process to manage and document all changes for quality control purposes)?
  • What monitoring and planning tools are in place?

Security and Compliance

  • Is the data center compliant with the latest industry standards and requirements?
  • What security measures does the vendor use to protect hosted data?
  • Does the vendor use third parties for additional security monitoring?

Service Response Time/Continuity
  • What architectural features protect customer data and minimize downtime?
  • What business continuity and disaster recovery plans are in place?

The Balancing Act
Customers want financial institutions to be trustworthy, secure and reliable, and they expect a certain level of service with few disruptions. To innovate and compete effectively, financial institutions must be able to offer a highly secure, reliable online banking solution with redundant capabilities that ensure a consistent and positive user experience.

In the final analysis, an attractive interface, a broad feature set, and even ease of integration with core processors are meaningless without the reliability of a robust, secure, redundant data center.

Breaking Barriers: A Global Information Security Study

With increasing business demands and evolving regulatory frameworks, information security is a top priority for financial services industry (FSI) organizations. This year’s security survey conducted by Deloitte finds that many FSI organizations have become more proactive in implementing innovative security measures and creating greater awareness of information security within their businesses. However, most organizations in the survey are challenged with balancing the cost of information security initiatives against the perceived risks of sophisticated threats and emerging technologies.

The following summary highlights the responses from over 250 financial services organizations from 39 countries:

Stronger Together: Silos and Barriers Retreat

  • Almost two-thirds of respondents believed that their information security function and business are engaged.
  • Over 50 percent of respondents indicated that they have a strong working relationship with operational risk management. Close to half of respondents indicated that they have strong relationships and coordinated activities with enterprise risk management.
  • Information security governance; identity and access management; and information security strategy and roadmap are cited as the top security initiatives for this year.

Adapting to New Technologies: Security Innovation

  • As the use of social media increases, 37 percent of respondents are revising organizational policies; and 33 percent are educating users on social networking to address the security risks.
  • Many surveyed organizations have explored cloud computing options. However, 40 percent of the respondents indicated they still do not use cloud computing. The reasons cited include technology prematurity, security risks, and adoption capabilities of the organization.
  • As a part of their mobility program, many organizations have already deployed, or plan to deploy, mobile VPN, central device management, and mobile device management software. However, more than 50 percent of respondents have not yet planned for deployment of anti-phishing software, employee and customer-facing applications, and data loss prevention for mobile devices.

Policing Cyber Threats: Safeguarding Data Assets

  • Three out of four respondents have dedicated privacy resources; organizations are increasingly focusing on protecting their sensitive information and formalizing the privacy function.
  • Forty-nine percent of surveyed organizations claim to actively manage vulnerabilities, 82 percent of which are also actively researching new threats to proactively protect their environment from emerging threats.
  • Most surveyed organizations use a security operations center (SOC) to monitor traffic and data and to actively respond to incidents and breaches.
  • More than half of the respondents indicated that their organizations manage the SOC internally to get a better understanding of information security issues and gain more control over their operations.
  • Consistent with prior years, respondents cited a lack of sufficient budget (44 percent) and the increasing sophistication of threats (28 percent) as the primary barriers to implementing an effective information security program.

Sector Highlights: Banking

As banks adapt to increased financial regulatory pressure and adopt new technologies to stay competitive, they are challenged with managing myriad vulnerabilities and business expectations.

The following highlights the responses from 158 banking organizations, making up 62 percent of respondents:

Maturity Paradox: How To Keep The Information Security (IS) Program Effective

  • With increasing regulatory pressure, banking respondents continue to enhance their security programs. Close to 80 percent of respondents believe that their information security programs have reached Level 3 maturity (a set of defined and documented standard processes that improve over time) or higher.
  • Even as security practices mature and advance, nearly 25 percent of the banking respondents indicated they experienced security breaches in the past 12 months.
  • Excessive access rights, security policies and standards that have not been operationalized, and lack of sufficient segregation of duties are cited as the top three external audit findings by banking respondents.

Balancing Act: Security and Cost Containment

  • Even though more than 70 percent of banking respondents dedicate at least 1 to 3 percent of their IT budget to information security, lack of sufficient budget and/or resources is cited as the top barrier for an effective information security program.
  • Nearly half of banking respondents have already implemented or purchased cloud computing services. Of those who have not implemented cloud computing services, close to 90 percent of the respondents believe the benefits outweigh the security risks.
  • Vulnerability scanning and penetration testing (72 percent) is the top information security function outsourced to a third party. This is followed by threat management and monitoring services, at 24 percent.

Security Innovation: New Technologies and Their Risks Have Arrived

  • Nearly 75 percent of the banking respondents are making use of social media; 20 percent of the banking respondents have deployed technical controls to block or limit organizational usage.
  • When it comes to adoption of mobile devices, banking respondents indicated that the top three security controls are enhancing the consumer acceptable use policy, integrating consumer device security into awareness campaigns and enforcing complex passwords.

To view more results, please download the full study.

The Board’s IT Check-Up

At the conclusion of Bank Director’s recent board compensation survey, co-sponsored with Meyer-Chatfield Compensation Advisors, we followed up with some of our respondents who reported being overwhelmed by information technology concerns. Directors have the responsibility of ensuring their banks are keeping up with IT threats and safeguards, but for some, staying on top of IT to the satisfaction of regulators is becoming increasingly frustrating and time-consuming.

Paul Schaus, president of CCG Catalyst, a consulting firm that works with banks in regulatory compliance and technology planning, spoke with Bank Director about what directors should be considering when handling IT at their bank.

BD: What is changing about the board’s IT responsibility?

Boards are in some respects in a transition phase. Now the regulators want them to have more oversight and know more about what’s going on, because they are legally responsible. Just having a community member on the board isn’t the only requirement anymore. Having somebody with expertise to bring to the table is becoming more of a factor in banking.

So what you are seeing is more diversification of knowledge because directors are responsible for that oversight. You’ve seen the change in the larger banks.  It’s slowly working its way down.

The board has to do what is reasonable based on its size, location, and infrastructure. The problem is that the regulations are written in something of a vacuum. Regulators come under pressure like anybody else.

BD: What are some steps boards can take to address this change?

It’s healthy for a board to evaluate itself, to say, ‘Do we have the right people and do we need to bring some more people on the board?’  If a director can’t add anything to the board, and you can’t train him because he’s not a finance guy or a tech guy, a regulator could look at that as the board having poor judgment. 

So do your due diligence, listen to the experts, and when you don’t know, go get outside advice.  There is nothing wrong with saying, ‘we don’t know and we need outside help.’  Make sure what you are doing is not putting too much stress or risk on the bank itself, including the directors personally.

If I were sitting on the board of a bank, I would look at my personal risk. That’s how you have to look at things. If a board member doesn’t feel comfortable about something, his view should be voiced. The last thing you want is a regulator coming in to talk to your board, making a comment like, ‘You do understand?’ and someone saying, ‘No, I don’t.’ Then the regulator knows you didn’t know what you were doing when you approved something in the first place.

BD: What should boards be cautious of when taking a more proactive role in IT?

Some boards really go beyond what the rules require and create technology-oriented subcommittees. The [chief information officer] will work heavily with that subcommittee. There’s nothing wrong with banks getting more involved; it’s just that it can lead to micromanagement issues. There is a line. If the directors are going to start micromanaging the bankers, then do they have the right people in the right positions?

The board has to rely upon the expertise of the people that are working at the bank, and if that expertise is not there, then they have to question if they have the right people.  That’s the board’s responsibility. 

BD: Could you leave us with some questions directors need to be asking about IT?

Yes. Here they are:

  1. Are we confident we have a clear and viable IT strategy that supports our business strategy?
  2. Are we making capital investment decisions about technology proactively or reactively?
  3. Is our technology strategy customer-centric?
  4. Are we making measurable and sustainable progress toward integrating our IT at the enterprise level, or are we still predominantly a silo-focused organization?
  5. Is our technology usage moving us measurably and sustainably toward greater operating efficiency?

Consumer Adoption and Usage of Banking Technology

Today’s consumers, especially Millennials (also known as Gen Y), are used to having technology integrated into most aspects of their work and personal lives. Banking is no exception. To respond to changing customer expectations, banks, credit unions and other financial institutions have incorporated online and mobile technology into consumers’ banking experiences. However, financial institutions still need to answer several questions pertaining to banking technology:

  • How well are financial institutions meeting the needs of consumers when it comes to offering high-tech products and services?
  • Whom do consumers view as the trusted provider of the mobile wallet?
  • How does adoption of banking technology vary for different consumer groups?

This white paper answers these and other questions that are critical to the ongoing success of financial institutions in a rapidly evolving marketplace. The paper is based upon the findings of a recent online research study of 2,000 U.S. consumers conducted jointly by First Data and Market Strategies International. The “New Consumer and Financial Behavior” study assessed consumers’ attitudes, behaviors, desires and technology adoption. This white paper is the third in a series of four based on results of the study and focuses on consumers’ attitudes and behaviors related to technology in banking.

Topics include:

  • Consumers’ attitudes about, and adoption of, banking-related technology.
  • Usage of mobile banking.
  • Perceptions of the mobile wallet by different consumer groups.
  • Usage of online banking and bill payment.
  • Steps that financial institutions can take to appeal to various types of consumers.