In recent posts, I’ve mentioned spending the last 5+ years working in the technology space. While I’m thrilled to be back (yes, I cut my teeth in the late ’90s with Bank Director), I am far better versed in the nuances of the various technologies that support growth plans than I was in my first go-round. I’m also better able to understand how companies create data-inspired decision-making cultures to better capture market share. So as I find myself on the road nearly every week, I made a point to jot down some notes for today’s piece from a Digital Publishing Summit I recently attended.
Before you click away: the connection between the media/publishing industry and financial services is a lot tighter than you might imagine. At our collective cores, managing data is a business fundamental. So I was amused (and interested) to see a tweet from the host company’s CEO following the conference that said, “Myth: only 3 companies have #bigdata. Facebook, Google, and the Govt.” It strikes me that our recent economic crisis has reinforced the need to update practices and systems for risk management, customer management, fraud detection and compliance. So I guess you know which industry I’m naturally adding to that list!
Now I admit, I’m a real fan of Dave Kellogg’s writings; relevant to today’s piece, his company, MarkLogic, facilitates the use of deep data analytics and predictive marketing. As I begin to better understand how the financial community consumes massive amounts of consumer data, I thought I’d offer a primer of sorts on something called “Big Data.” Maybe you’ve heard the term; if not, consider how you and your staff devised your customer retention and new customer acquisition programs. If you are a mid-sized to large institution, the answer to that rhetorical question has to be data, data, data…
A bit of background:
In simple terms, the phrase “Big Data” refers to the tools, processes and procedures that enable an organization to create, manipulate, and manage very large data sets and storage facilities. If you spend any time at all around your CTO, CIO or lead quant, you know how partial they are to talking about “new” ways to minimize the time between when data is generated and when information is available for analysis. For them, being asked how to make sense of your data as volumes grow and the sources of data generation become disparate and uncoordinated is their favorite question. So today’s entry allows me to share a few questions I’ve heard non-banking executives ponder that might also be on your mind; namely:
- Why is it so difficult to just get all the data you can possibly get your hands on and make it available for immediate analysis?
- What exactly is “Analytics?”
- Do the terms “Statistical Inference” and “Data Mining” completely and accurately describe this field?
As this all returns us to business intelligence, a former colleague once told me that the promise of Big Data is “the ability to do deep, exploratory, quantitative, ad-hoc analysis to understand what’s going on with your business.” From a technologist’s perspective, what’s cool about where we are today relates to new solutions and innovative trends in managing and analyzing this influx of data. From an executive’s vantage point? A report recently issued by McKinsey opines “that deploying these technologies to create networked organizations that foster innovative collaboration among employees, customers, and business partners is highly correlated with market share gains.”
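To make that “ad-hoc analysis” idea a little more concrete for the non-technical reader, here’s a minimal sketch in Python. The transaction records and field names below are invented purely for illustration; a real institution would pull this from its core banking, web, and card systems at vastly larger scale. The point is simply how short the distance can be from a raw pile of events to an answerable business question:

```python
from collections import defaultdict

# Hypothetical sample of transaction events (illustrative only).
transactions = [
    {"channel": "online", "amount": 120.00},
    {"channel": "branch", "amount": 540.50},
    {"channel": "online", "amount": 75.25},
    {"channel": "atm",    "amount": 60.00},
    {"channel": "online", "amount": 310.10},
]

# Ad-hoc question: which channel carries the most dollar volume?
totals = defaultdict(float)
for txn in transactions:
    totals[txn["channel"]] += txn["amount"]

# Report channels from highest to lowest total volume.
for channel, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{channel}: ${total:,.2f}")
```

Of course, the hard part at enterprise scale isn’t the arithmetic — it’s collecting, cleaning, and joining the data fast enough that questions like this can be asked and answered on demand.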
So what does this mean for me?
One of the main players in this space says it best:
…As the financial services industry continues to embrace web and network-based computing, the level of transactional event data being generated has grown exponentially. From algorithmic trading to electronic personal banking, the volume of data that financial institutions must cope with for the purposes of compliance, risk management, and customer relationship management has become a significant challenge.
OK, we can all agree that it’s become relatively easy to collect data. However, organizations of all sizes continue to wrestle with a fundamental challenge: making sense of what they’ve collected, given the speed with which new types of data are generated and the sheer volume of that data. Personally, I see this expansion of information as a great source of competitive advantage. And if you’re looking to increase revenue and profitability through organic growth, you’ll find a number of exceptional companies (e.g. EMC-owned Greenplum, IBM-owned Netezza, venture-backed Aster Data, etc.) focused on solving this one great problem. So my advice today? Spend a few minutes thinking about your company’s data management strategy. Then, go grab whichever techie you’re most comfortable with, and ask him/her to tell you what it actually is.