I wrote about data as a currency, like salt, on Monday, and thought I’d continue that theme by extrapolating it further.
I’ve talked a lot already about data as a currency and information wars:
- money being meaningless, as data is now key;
- the battle over information, and how information warfare is the new game;
- the fact that banks should move from being safekeepers of money to being safekeepers of data;
- why banks should worry about Google, Apple, Facebook, and how these guys do data management better than banks;
- the fact that banks need to radically shake up and wake up to the data challenge; and
- how Amazon would organise itself as a bank, and how bank silos inhibit banks’ competitiveness.
All of these blog entries are really about BIG DATA.
Big Data is talked about a lot these days, and can be found all over the press. Here’s a recent discussion from Computer Weekly:
Along with the increasing ubiquity of technology comes the increase in the amount of electronic data. Just a few years ago, corporate databases tended to be measured in the range of tens to hundreds of gigabytes. Now, multi-terabyte (TB) or even petabyte (PB) databases are quite normal. The World Data Center for Climate (WDCC) stores over 6PB of data overall (although all but around 220TB of this is held in archive on tape) and the National Energy Research Scientific Computing Center (NERSC) has over 2.8PB of available data around atomic energy research, physics projects and so on.
In other words, we are drowning in data today.
In the first version of Karl Fisch’s Shift Happens presentation, he picked out a few facts about Big Data:
- There are over 2.7 billion searches performed on Google each month.
- The number of text messages sent and received every day exceeds the population of the planet.
- There are about 540,000 words in the English language . . .
- About 5 times as many as during Shakespeare’s time.
- More than 3,000 new books are published . . . daily.
- It’s estimated that a week’s worth of the New York Times contains more information than a person was likely to come across in a lifetime in the 18th century.
- It’s estimated that 40 exabytes (that’s 4.0 x 10^19 bytes) of unique new information will be generated worldwide this year.
- That’s estimated to be more than in the previous 5,000 years.
- The amount of new technical information is doubling every 2 years.
- It’s predicted to double every 72 hours by 2010.
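The gap between those two doubling rates is easy to underestimate, so here is a quick back-of-the-envelope sketch. The arithmetic is my own, not from the presentation, and it assumes simple repeated doubling over an 8,760-hour year:

```python
# Rough growth arithmetic for the two doubling rates quoted above.
HOURS_PER_YEAR = 365 * 24  # 8,760

def growth_factor(doubling_period_hours: float, horizon_hours: float) -> float:
    """How many times bigger something gets over `horizon_hours`
    if it doubles once every `doubling_period_hours`."""
    return 2 ** (horizon_hours / doubling_period_hours)

# Doubling every 2 years: only about 1.41x after one year.
print(round(growth_factor(2 * HOURS_PER_YEAR, HOURS_PER_YEAR), 2))  # 1.41

# Doubling every 72 hours: roughly 2^122x after one year.
print(growth_factor(72, HOURS_PER_YEAR))
```

In other words, the move from a two-year doubling to a 72-hour doubling is not a modest speed-up; it is a change of regime.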
A later version from February 2010 is in video form …
… and makes the point that there are over 31 billion searches performed on Google each month, compared to 2.7 billion just four years earlier.
Now we have Twitter, with over one billion searches every DAY on its website alone … something that didn’t even figure in the 2010 Shift Happens video.
- In 2010, Google revealed that it has more than a billion searches a day and averages a billion searchers a week. Today, it also has a billion unique visitors per month.
- Microsoft’s Bing search engine gets 905 million unique visitors per month, and Facebook 714 million
- Facebook logged 250 billion minutes of usage worldwide in May 2011, up 66 percent from May 2010.
- Facebook’s average US visitor engagement has grown from 4.6 hours to 6.3 hours per month
These latter stats on user engagement are the reason why, in a recent version of Shift Happens (pdf download), we find a lot about social media, mobile and more:
- There are 7 billion people on the planet, and 5 billion have mobile phones
- Today, more people have access to a mobile phone than a clean toilet
- There are 2 billion people on the internet and 750 million on Facebook
- 92% of American children have an online presence by the time they are two years old …
And the Big Data piece arises again:
- The amount of digital information worldwide will increase 44x between now and 2020
- 247 billion emails are sent every day
- 6.1 trillion text messages were sent last year
- The average teenager sends 3,339 text messages per month
Now I blog these sorts of stats regularly, but it struck me that this drowning-in-data conundrum is something we are not tackling well.
For example, fifteen years ago, we talked about 1:1 marketing, data mining, propensity modelling and more using data warehousing systems that would trawl through a terabyte of data in days.
Now, we need to trawl through exabytes of data in seconds. Soon, we’ll need to analyse yottabytes in nanoseconds.
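To put those units side by side, here is a minimal sketch of the scale involved. The arithmetic is mine, assuming decimal (SI) units of 10^12, 10^18 and 10^24 bytes:

```python
# Decimal (SI) storage units, in bytes.
TERABYTE  = 10 ** 12
EXABYTE   = 10 ** 18
YOTTABYTE = 10 ** 24

# One exabyte holds a million of the terabyte warehouses of the mid-1990s ...
print(EXABYTE // TERABYTE)    # 1000000

# ... and one yottabyte holds a million exabytes on top of that.
print(YOTTABYTE // EXABYTE)   # 1000000
```

So each step in that sentence is not one order of magnitude but six, while the time allowed for the analysis shrinks from days to nanoseconds.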
As the data challenges grow, solutions will emerge … but solutions always trail the problem and are too slow to arrive. We have to get better at this, because back in 1996 we were already talking about a vision of infomediaries.
The concept of the infomediary was that individuals would control their own data destiny, sharing their data in return for value received back.
That’s data as a currency.
But data is not a currency today.
Data is an abused asset.
Technology firms such as Apple, Facebook, Google and Amazon use data liberally and, yes, offer some value in return, nudging us to buy more or serving us more adverts based upon our digital footprints.
Meanwhile, banks have all this data yet offer little in the way of information leverage at the point of transaction … something that will change as data analytics becomes the competitive differentiator.
As mentioned, when data is a currency, it is not the data itself that is important but the knowledge the data miners can extract from it.
Just as, when salt was a currency, it was not the salt itself that was important but the quality of the salt the salt miners extracted from different seas and rocks.
Data is a currency … we just haven’t realised yet how much value this currency offers.