In the days of the Wild West, there was gold in the hills. Today, it’s in the databases. And so to another discussion of Big Data.
This time it was over dinner and I began by outlining the case for the relevance of Big Data for Banks.
Much of this was covered in my last blog entry on said subject, although there was a little extra froth to the mix.
We began by talking about how banks have moved from a physical space to a remote relationship. With this change, banks, which were originally safe protectors of cash in branches, as that’s where the money was, today have to be safe protectors of data online, as that’s where the money is.
But it’s not just about the safe protection of data; it’s about using data in three forms: for revenue, for risk and for advice.
Data for revenue is all about the marketing of Big Data in small contexts; data for risk is all about using Big Data to look for fraud and money laundering exposures, or for customers who might be entering domains they should not; and data for advice is all about looking at a customer’s lifestyle,
location and choices and providing augmented services around their lives.
This means that Big Data is far more than data analysis. It is data leverage, data provision, data sourcing and data mining.
It is contextual, location-based, predictive, proactive and personal.
Data is all about taking the mass down to the micro, and then using that to compete.
It is the new battleground but, if it is a battleground, how come we are not seeing many of the incumbent warriors fighting?
Much of the data is there, but the incumbents are too concerned about internal structures to fight the external battles effectively. That is why we are seeing the emergence of front-end fighters like Moven and Simple. These are technology plays that remove the friction of the data structures and provide proactive and personalised service.
When a bank takes up that challenge, there will be real innovation, assuming it has not already given that space to others.
It’s about the back-end data flows and analytics too.
If a bank could take its back-end risk and transaction engines and integrate them with data-leveraged front-end tools, then it really would be on to a winner.
But then, as one attendee put it, there are no Big Data projects, just projects for business problems.
And there’s part of the rub: the technology is only as good as how it is programmed, and if it is programmed to answer the wrong problem, then it is nigh on useless.
Again, that goes back to last week’s blog: Big Data is all about asking the right questions.
But it goes further than this, as much of a bank’s data, and most company data for that matter, is not leveraged at all.
According to Forrester, companies either throw out or don’t bother analysing 88% of the data they hold. In other words, only 12% of data is mined effectively. What would happen if you could mine the whole internal data bank or, even better, combine the internal data flows with external flows of social media and news? What would happen if every customer movement could be detected and analysed in real time for AML or fraud, or if every customer footfall could be identified for a potential sale or service?
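To make the idea of real-time screening a little more concrete, here is a toy sketch of what per-transaction AML checks might look like in Python. The rules, thresholds and field names below are invented purely for illustration; real AML engines use far richer rules, scoring models and case management than this.

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

# Illustrative thresholds only; not any bank's actual AML logic.
LARGE_AMOUNT = 10_000              # flag any single large transfer
VELOCITY_WINDOW = timedelta(hours=1)
VELOCITY_LIMIT = 5                 # flag more than 5 transactions per hour

history = defaultdict(deque)       # customer_id -> recent transaction timestamps

def screen(txn):
    """Score one transaction (a dict) as it streams in; return alert reasons."""
    alerts = []
    if txn["amount"] >= LARGE_AMOUNT:
        alerts.append("large-amount")
    q = history[txn["customer_id"]]
    q.append(txn["timestamp"])
    # Drop timestamps that have fallen outside the sliding window.
    while q and txn["timestamp"] - q[0] > VELOCITY_WINDOW:
        q.popleft()
    if len(q) > VELOCITY_LIMIT:
        alerts.append("high-velocity")
    return alerts

now = datetime(2013, 1, 1, 12, 0)
print(screen({"customer_id": "c1", "amount": 12_500, "timestamp": now}))
```

The point of the sketch is that each event is checked as it arrives, rather than in an overnight batch, which is the shift the dinner conversation was pointing at.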
It’s a dream for some, a reality for few, and much of this is due to internal battles.
There’s a battle between those who want to digitise the bank and those who built the bank. Many of the latter are concerned that the digitisers are trying to destroy what they have built, and this internal friction holds them back.
It holds the customer back too, as many customers are also equally concerned about everything online and would prefer offline or overnight.
The question is not only whether the bank is ready for everything digital with a Big Data backbone, but is the customer?
This is well illustrated by the fact that many customers do not read the T’s & C’s of contracts they accept online. What happens when you use that as a shelter to explain why their identity theft losses are their own stupid fault? It doesn’t wash.
We then got into a whole debate around analysis paralysis, and whether you should really bother harvesting all this data and, if you did, how to sort out what’s relevant versus redundant.
This led us to a lengthy discussion of the Big Data needs of the regulators and that regulatory demands will force us towards more real-time reporting of trades, transactions and flows.
But then the regulator does not have the ability, ammunition or deep enough pockets to do the data drilling demanded of such flows, so this does not wash either.
The conclusion of the evening was fairly open, with all agreeing that data is critical but the challenge is to know what data is critical to whom.
Regardless, the core of the debate from where I sat is that the bank used to keep money in branches, and offered safe deposits and guarantees to keep the cash secure. Today, banks keep data in systems, and should be offering secure deposits with online guarantees to keep data secure.
Yep, I’m starting to sound like a parrot.