I often wonder if I’m actually telling the truth when I have a go at banks for being slow to change and poor with technology. They can’t be that bad can they?
But then I meet a banker who tells me I’m spot on, and that they’re just as fed up with the state of the bank’s technology as I am.
By way of example, none of us would use the same mobile phone we owned ten years ago, would we? And yet, that's effectively what banks are doing.
The last time most banks refreshed or reviewed their core systems was in 1995, when they were worried about the Y2K millennium bug. Since then, most have simply entrenched their core processing further, ensuring it is unfit for the future.
How can I say this?
I can say it with three examples that were shared with me in the last week.
First, the UK faster payments system. A revelation and innovation? Or a system based upon card processing standards that were generated over three decades ago? Both, as it happens.
Second, a national payments infrastructure that processes almost $50 billion a day using messaging standards developed for the movement of magnetic tapes in the 1970s.
Third, a national payments infrastructure that limits the message field sizes to 12 characters, as that was the maximum length of a field that could be held on punch cards when the system was developed.
The net result of all this is that we move money around the world with the most limited data possible. That limitation comes from generations of development dating back to the 1960s, much of it from a time when mag tapes and punch cards were how you programmed systems.
Rather than break away from those structures, banks have maintained the essence of them, because it is more difficult to replace an infrastructure than to maintain and evolve it.
This means that messages store only the most basic information – name, account number, amount, date – rather than the functionally rich big data world we live in today.
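To make that concrete, here is a minimal sketch of what a punch-card-era payment message looks like in practice. The field names and widths below are invented for illustration (real schemes such as Bacs or ACH differ in detail), but the pattern is the same: every field is a fixed number of characters, and anything longer is simply truncated.

```python
# Hypothetical legacy fixed-width payment message.
# Field widths are illustrative, not any real scheme's layout.
LEGACY_FIELDS = [
    ("name", 18),          # beneficiary name, truncated to fit
    ("account_number", 8),
    ("sort_code", 6),
    ("amount_pence", 11),  # integer minor units
    ("reference", 18),     # the only free text the payer gets
]

def pack(record: dict) -> str:
    """Serialise a payment into one fixed-width line, truncating each field."""
    return "".join(str(record.get(name, "")).ljust(width)[:width]
                   for name, width in LEGACY_FIELDS)

payment = {
    "name": "ACME INTERNATIONAL TRADING LTD",   # 30 chars: will be cut
    "account_number": "12345678",
    "sort_code": "112233",
    "amount_pence": "1999",
    "reference": "INVOICE 2024-00017 FINAL SETTLEMENT",
}

line = pack(payment)
print(len(line))  # every message is exactly 61 characters, no matter what
print(line)
```

Everything beyond the fixed widths, the full company name, the full invoice reference, is silently thrown away. That is the sense in which the network is data-poor by design.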
For example, one country runs its whole payment processing system on a data set that totals less than a gigabyte. In fact, that country's service carries only about 200 megabytes of useful customer data a day. That's not much.
Again, the reason is that the system was developed a generation ago, for the magnetic tape and punch card era, rather than for today.
Today, we generate 2.5 exabytes of big data every day through the social networks, and less than a few gigabytes through the bank network.
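The gap is worth putting in numbers. A quick back-of-envelope calculation using the figures above (taking "a few gigabytes" as 5 GB, an assumed round figure, since the article only gives a rough estimate):

```python
# Back-of-envelope scale comparison using the figures quoted above.
social_bytes_per_day = 2.5 * 10**18  # 2.5 exabytes across social networks
bank_bytes_per_day = 5 * 10**9       # assumed ~5 gigabytes on the bank network

ratio = social_bytes_per_day / bank_bytes_per_day
print(f"{ratio:,.0f}")  # 500,000,000
```

On that rough estimate, the social web moves half a billion times more data every day than the banking network does.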
No wonder we have a problem.
Think of it another way.
When man landed on the moon in 1969, the total computing power of Apollo 11 was about what you get in a USB stick today (or less).
Would you want to fly to the moon on a USB stick when you could be flying on SpaceX or Virgin Galactic?
Why are we running our systems on a USB stick?
Purely because we have locked ourselves into a structure designed for the Apollo age rather than the internet age.
What this really means is that if the likes of an Elon Musk turned their attention to the core transactional infrastructure of the financial system, then they might find ways to allow exabytes of data to enrich that system and refresh it.
But then if that happened, what would the existing infrastructure be there for, except to communicate with the bygone-era banks that still use it? Meanwhile, Elon's service would create a new infrastructure and, in so doing, a new market.
Much as Bitcoin and PayPal had to reinvent digital exchange, someone will reinvent the infrastructure for transactions and transaction banking.
Admittedly, building a payments infrastructure from scratch will not be easy, but surely it's better than running such infrastructure on a system designed for punch cards and magnetic tapes?
This is the first in a three-part series on why our technology and infrastructure is unfit for purpose: