
Unimaginable and unmanageable risks, or total transparency and control?

Just sat through a day of academic debate about the financial crisis and how much technology was to blame.

We’ve had these blame games many times in the past, usually to point a finger at an individual like Greenspan or Brown, so turning the finger towards an inanimate pile of metal processors was, I thought, going to prove interesting.

But it wasn’t, as this was run by the London School of Economics, which pretty much made the day a disinfected affair, with professors emeritus pondering and pontificating.


As one of my friends said: “an academic is someone who looks at something working in practice and wonders what it would be like in theory”, and this was the case here.

However, there were a few brighter spots, including addresses by CIOs from the European Central Bank and Royal Bank of Scotland.

The real point of the whole day was to trace the origins of the crisis – the rich and diverse world of derivatives – and to argue that the quant analytics that drove us down the spiral of debt were handled by our systems in such a way that it looked like risk was managed … but it wasn’t.

In other words, the computers messed up.

One speaker pointed out that risk was hidden because regulators focused upon individual financial institutions instead of systemic risks across the industry.

Another talked about the origins of the Black-Scholes model, and said that “it wasn’t technologists who caused the crisis, but physicists”.

Another mooted the scale of computing, and how complex analytics had moved us into grids, data centres and clouds, that provided unlimited processing and scalability. Hence, what could never have been mapped, simulated or contemplated before could now just be modelled and deployed overnight.

Whatever your view, systems are a contributor to this crisis, but it’s not the systems that caused it but the people who programmed them. So here’s my potted view of how technology exacerbated the crisis and what will happen next …

Back in the 1960s, markets were inefficient and ran on open outcry systems where lots of men stood in pits shouting at each other.

These men were “chaps” and “blokes”, with the chaps being the ones who wore top hats and the blokes bowler hats.

The top hat brigade didn’t like the bowler hat lot – mainly because blokes spoke with common accents shouting out things like “cor blimey guv’nor, catch a load of this stock at five and ten” – and so they found a new-fangled thing called a computer and wondered what they could do with it.

Luckily, two men called Fischer Black and Myron Scholes came up with an exceedingly complex formula which, being very imaginative, they called the Black-Scholes formula.

The Black-Scholes call option price: C = S·N(d1) − K·e^(−rT)·N(d2), where N is the cumulative standard normal distribution.

The formula meant that if you were buying or selling stocks, you could break the buy or sell order into pieces and manage the risk by placing the purchase with other related instruments in a derivative.

Luckily, this formula was perfect for the computer age and allowed the chaps who could afford such technologies to trade bits of equities.
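For the technically curious, the closed form is simple enough to sketch in a few lines of Python – a minimal sketch, and the illustrative parameters below are mine, not anything from the markets:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.

    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: annualised volatility.
    """
    N = NormalDist().cdf  # standard normal CDF
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# A one-year at-the-money call on a 100p stock, 5% rates, 20% volatility
price = black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
# price comes out at roughly 10.45
```

The point is not the arithmetic but that this is exactly the kind of closed-form recipe a computer could grind through all day.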

This was no big deal as the processors back then were not very sophisticated – a mainframe would have been the equivalent of your Nokia mobile of five years ago – but it did allow some complex analysis to begin.

In particular, it allowed the age of leverage to start, and introduced new disciplines in market and credit risk.

This bubbling area of derivatives and risk didn’t really take off until the 1990s, when systems had become more and more distributed, powerful and capable. Such systems enabled the chaps to do more hedging and complex investment strategies began. This was further supported by electronic trading, which had also increased in prevalence after the automation of the main markets in New York and London known, over here, as the Big Bang.

Now things were getting a little more efficient, and markets started referring to exotics and “the Greeks”. Risk became more destructive as a result: Nick Leeson destroyed Barings Bank; Long-Term Capital Management (LTCM) almost blew up the financial world; Frank Quattrone was indicted over the internet boom for misrepresenting IPOs; and Henry Blodget got into trouble for mixing securities research with securities trading.

What was really happening was that the mixture of market greed and gaps in regulation allowed many to create more and more complex risk.

Risk management was evolving and trying to keep up, as were the lawmakers, but creative and innovative masters of the universe were seeing the opportunity to combine processing power and automated trading with arbitrage and exotic instruments to create ever increasing returns at the expense of those who did not have the ability to make these combinations work.

And yes, there were some big blow-ups like LTCM, but the markets coped.

That was until David Li’s formula cropped up.

David Li’s formula is the one that almost killed Wall Street, as Wired Magazine so eloquently put it, and it goes like this:

Pr[T_A < 1, T_B < 1] = Φ2(Φ^−1(F_A(1)), Φ^−1(F_B(1)), γ)

No big deal.

But it is a big deal as it created a number of false assumptions and operations.

First, it made traders believe credit risk was managed and covered, when it wasn’t.

Second, it ignored real-world assets in favour of simulated models, and hence separated two key areas that were mutually inclusive and made them mutually exclusive.

Third, it didn’t incorporate the new forms of market risk we now look towards, namely liquidity risk and systemic risk.

And the real issue that occurred is that the trading systems leveraged the formula to death because, just as this formula was released, electronic trading moved a step forward into algorithmic trading and high frequency trading.
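The trap in Li’s formula is easy to demonstrate: it collapses the entire joint behaviour of two credits into a single correlation number. Here is a toy Monte Carlo sketch of the one-factor Gaussian copula idea behind it – all default probabilities and correlations are invented for illustration:

```python
from math import sqrt
from random import gauss, seed
from statistics import NormalDist

def joint_default_prob(p_a, p_b, rho, n=200_000):
    """Monte Carlo estimate of two credits defaulting together
    under a one-factor Gaussian copula (rho plays the role of Li's gamma)."""
    inv = NormalDist().inv_cdf
    t_a, t_b = inv(p_a), inv(p_b)  # default thresholds on the normal scale
    hits = 0
    for _ in range(n):
        m = gauss(0, 1)  # common "market" factor shared by both credits
        x_a = sqrt(rho) * m + sqrt(1 - rho) * gauss(0, 1)
        x_b = sqrt(rho) * m + sqrt(1 - rho) * gauss(0, 1)
        hits += (x_a < t_a) and (x_b < t_b)
    return hits / n

seed(1)
independent = joint_default_prob(0.05, 0.05, rho=0.0)  # close to 0.05 * 0.05
correlated = joint_default_prob(0.05, 0.05, rho=0.6)   # several times higher
```

Feed it a correlation estimated from a benign decade and the joint default risk looks tiny; the real world then supplies the correlation of a panic, and the single gamma gives you no warning.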

This is why the FSA’s Prudential Risk Outlook 2011 shows that credit went through the roof over the last forty years.


Similarly, worldwide, OTC Derivatives exploded from a market worth $100 trillion in 2000 to $300 trillion in 2005, growing at thirty percent year on year. It then gathered momentum to grow at forty percent per annum from 2005 through 2008 reaching a peak of almost $700 trillion when the crisis hit.
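Those notional figures imply compound annual growth rates that a few lines of arithmetic can back out – the thirty and forty percent above are the loose year-on-year approximations quoted at the time:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by two market sizes."""
    return (end / start) ** (1 / years) - 1

# OTC derivatives notional outstanding, in $ trillions (figures as above)
early = cagr(100, 300, years=5)  # 2000 -> 2005: about 25% a year compounded
late = cagr(300, 700, years=3)   # 2005 -> 2008: about 33% a year compounded
```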

This debt and credit explosion was a result of the dangerous concoction of leverage, OTC derivatives, unregulated markets, complex analytics and unlimited processing capacity.

This heady mixture had stepped up the financial game into markets that were unmanageable.

From a systems viewpoint, the technology enabled and supported this explosion but was not the cause. The cause is the humans who program the systems.

But the systems’ capabilities are illustrated well by the fact that, in 2000, the New York markets were processing around 5,000 electronic trade movements per second. This has now risen to over ten million per second.

The financial markets have scaled up server processing to unprecedented levels.

For example, RBS Global Banking & Markets are using over 20,000 server blades for core market processing today, compared to a single processor two decades ago.

And, as the BBC reported last week, those processors are now operating at near the speed of light: “Trades now travel at nearly 90% of the ultimate speed limit set by physics, the speed of light in the cables”.

How fast is that?

Well, light travels at 299,792,458 metres per second, or near enough 300,000 kilometres per second, so systems are moving trades around at about 270,000 kilometres a second.

Pretty darned fast if you ask me.

In real life, Professor Roman Beck of the E-Finance Lab at Goethe University Frankfurt says that we’re moving electronic orders between London and Frankfurt in 5.8 milliseconds. That’s about 52 trips between here and Frankfurt in the time it takes you to blink (a blink being around 300 milliseconds).
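The arithmetic behind those numbers is simple enough to check for yourself:

```python
C_KM_PER_S = 299_792.458  # speed of light in a vacuum, km/s

# Trades travelling at ~90% of c in the cables, per the BBC's figure
trade_speed_km_s = 0.9 * C_KM_PER_S  # about 270,000 km/s

# London-Frankfurt order latency versus a human blink
one_way_ms = 5.8   # Professor Beck's figure
blink_ms = 300     # a typical blink
trips_per_blink = blink_ms / one_way_ms  # just under 52 trips
```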


Not bad.

So, we have these completely automated systems processing everything at lightning-fast speeds globally, with all the opportunities for a bit of a byte of a stock or commodity being built into complex arbitrage systems with unlimited scale and processing power.

Sounds like a recipe for a disaster if you ask me.

And it has been.

But it’s also been a recipe to allow some firms, such as Goldman Sachs, to generate $100 million profit every day that they’re open for business. Consistently.

So where does this leave us?

In a bit of a bind I guess.

We’re not going to get rid of these systems, processors and capabilities are we?

But equally, can we effectively control and regulate them?

I think not.

As the general counsel of Salomon Smith Barney is quoted in Liar’s Poker: “My role is to find the chinks in the regulator’s armour”, and that attitude prevails.

So whilst systems look for chinks, the fragmentation, complexity and geographic spread of systems and regulations allow for arbitrage … and that makes money.

This is why there is no way to demand and force transparency on the markets. For all the FSA’s calls for real-time liquidity reporting, that reporting is meaningless if the Shanghai or São Paulo operation of a financial firm is leveraged to the hilt.

And the idea of a global response to this is also unlikely.

As one speaker said here: “I don’t trust any solutions that claim to be global as most global projects fail”.

Maybe he was involved in the GSTPA or similar ventures.

So the solution: to continue to try to regulate in hindsight and hope that the banks, in foresight, don’t create unsustainable or unmanageable risks.

Or just pray!

 

About Chris M Skinner

Chris Skinner is best known as an independent commentator on the financial markets through his blog, the Finanser.com, as author of the bestselling book Digital Bank, and Chair of the European networking forum the Financial Services Club. He has been voted one of the most influential people in banking by The Financial Brand (as well as one of the best blogs), a FinTech Titan (Next Bank), one of the Fintech Leaders you need to follow (City AM, Deluxe and Jax Finance), as well as one of the Top 40 most influential people in financial technology by the Wall Street Journal’s Financial News. To learn more click here...


  • Fritz Thomas Klein

    Just a word on global market infrastructures: There is the success story of CLS. It did help a lot in the rough days after the demise of Lehman’s!

  • Jacques Bayens

    New regulation: any Buy or Sell operation will take effect after N hours only, where N = 24, or maybe 12.
    Anything faster would be driven by speculative purposes, and mankind or society does not benefit from speculation. Speculation generates huge wealth for the biggest robber barons while putting all the systemic risk on every other human. It’s just wrong.
    How would Smith Barney counter that?

  • Bob Giffords

    I welcome your observations. Quite clearly extreme connectivity – technical, financial and regulatory (though the last is not referenced in your blog) – was a key driver in the crisis, as Professor Mainelli and I argued in “The Road to Long Finance” (CSFI, 2009). However, merely noting the speed and complexity of modern markets does not really help, except to frighten people into more banker-bashing or other knee-jerk reactions. You correctly express doubt at the end that we can do anything about it. I believe we can, but first we must recognise that this hyper-speed and complexity is part of the evolution to machine-to-machine (M2M) e-commerce, also known as the semantic web. While these speeds are fast for us poor humans, they are normal for machines. The reason we cannot turn back the clock is that we desperately need the benefits of these robotics to support our fast-growing global population on our rather limited (but still abundant) planet. Once we recognise this fact, we can then set about trying to work out how to manage it, which I am currently doing (hopefully). I would encourage others to join me in lighting candles, rather than just cursing the darkness. However, recognising the problem is indeed the first step.

  • Jacques Bayens

    Bob, can you point out any evidence that near-light-speed trading is being used to somehow help with population growth? I was cynically assuming that the only goal was making huge profits for private entities, but I’d be glad to learn that it’s mostly or all about saving mankind.

  • What happens if someone manages to exploit quantum entanglement communication, which is at least 50,000 times faster than the speed of light? Isn’t retrocausal trade illegal?
    🙂
    But what’s the idea? The Li formula did it? It looked like a hammer so everything needed a pounding?
    Like most statistics, the formula looks like it needs continuous recalibration. I thought financial trading would be much further along with self-organized criticality or resonant scattering from signal processing, or other models from more practical areas? I’m a bit surprised that both the Li function and Black-Scholes are more akin to 19th century thermodynamics.. that is more than fine, but it was a human act that overvalued their importance.
    I guess praying remains the superior option for the short-term future..

  • Jacques Bayens

    Paul, no worry: quantum entanglement does not allow faster than speed of light communication, and never will. There are enough *real* causes for worry, no need to jump into a Philip Dick universe. But I agree, the maths invented by our quant “geniuses” are unlikely to justify any Fields medal.

  • (Greed + Leverage)^Hubris = Danger Will Robinson
    One of the keys is the principal/agent problem in banks. When you have people taking massive risks in complex, high-velocity markets it helps a lot if they have all (or at least most) of their skin in the game. It doesn’t mean some folks still won’t commit hara-kiri, but overall it is very sobering – just the cooling rods such a system needs. Very few hedge funds blew up. And the ones that did didn’t tip over the system. I believe this is because, proportionally, the leaders of these organisations have 2-3 orders of magnitude more of their own personal capital on the line and no (or limited, if you count management fees) free put… Given their relative size (even if you take the mega-funds as examples), senior bank executives would need to have literally billions of personal capital at risk to create a similar alignment. Clearly this is impossible.
    I know it’s heretical, but ring-fencing the systemically important, utility aspects of retail and commercial banking from the deployment of risk capital is crucial. Don’t get me wrong, I’m not bashing the risk capital side of things – it’s super important – but to work properly the safety net needs to be a lot smaller and further away to discourage folks from doing back flips on the high wire…
    ps Chris, you really should get Disqus on your blog…

  • Jacques, true but there is a caveat to it.
    “Speed of light communication” concerns the classical causal exchange of physical information, but it is possible to exchange quantum information, such as the polarization of a photon, exceeding the speed of light.
    What you’re referring to seems to be the fact that the two physical models do not conflict within the domain they are applicable to. Like Penrose, I remain a solid fan of general relativity and think it is a ‘stronger’ framework than quantum field theory.
    If you have a physics background, do check the papers by Joy Christian, who has reformulated Bell’s theorem to fit in a world dominated by Clifford algebra instead of arithmetic.
    Anyway, that’s not the point here; it seems we’re both surprised about the mathematical tools used. Hardware performance may be growing fast, but human ingenuity remains the magic sauce and appears to be responsible for efficiency doubling every year (according to a recent PCAST report), and with cross-fertilization among different knowledge domains there is no reason to assume a slowdown in this accelerated pace.
    I am much more interested in R&D for science and technology than in financial instrumentation, so I don’t know the ins and outs of the latter, but I do continue to be surprised about how crude and slightly outdated the tools seem to be. I have no insight there, and I would assume the more appropriate maths are for internal use and focused on data mining instead of trade activities.. but then, why not create a continuous feedback loop so that the probabilities are adjusted in near-real time?
    I can understand that Finance as an industry is exploiting only a few possibilities of a whole range
    (http://en.wikipedia.org/wiki/Twelve_leverage_points), but any senior computing scientist that I know within the financial world sees the falsity of these models, unless of course they have some kind of personal or professional bias, such as career investments in advocating the illusion of infallibility… There are some great lectures by Gregory Chaitin on YouTube on where mathematical logic breaks down.
    That’s not going to stop things though.. and why would it, because it is rather clear these formulas are rather self-fulfilling; they become more true the more often they are applied.
    Just like Nash equilibria, which are so widely used in evolutionary modelling with game theory, while Nash himself publicly denounced his early work due to its self-fulfilling effect, and attributed it to the psychotic episode he suffered from at the time. Nevertheless many people continue to swear by it, while artificial life models or the zoological research of Frans de Waal clearly contradict the ‘selfish gene’ myth Dawkins and others so cherish.

  • Jacques Bayens

    Paul, please let me just point out that these papers by Joy Christian are highly speculative and controversial. See e.g. http://www.physicsforums.com/showthread.php?t=482657 for the sort of discussion they have triggered. Can you point me to any real experiment (not a thought experiment) that has successfully exchanged any quantum information faster than the speed of light in a vacuum? Thanks.

  • Jacques,
    does controversy make it less viable? Did you read the actual papers? Did you read Bell’s actual papers?
    The reference was just for demonstration purposes, to show that there are quite some interesting, progressing insights into the nature of local reality and global reality. For example, Gerard ’t Hooft’s work addresses related issues, but even though classical and quantum reality seem to intermingle, causality is not violated.
    Alain Aspect’s work has been around since 1982, and i recall reading some in the late 80’s, early 90’s from the Max Planck Institute where polarization was transmitted. At that time they still thought it was instantaneous. A recent experiment is available on http://arxiv.org/abs/0808.3316
    Some more is available on http://www.quantumphil.org/history.htm
    The fact that it happens has passed the ‘trough of controversy’ about a decade ago..
    Next you’re going to tell me that quantum mirages aren’t true and that quantum teleportation can never happen.. 🙂

  • Jacques Bayens

    Paul: the controversy in this case illustrates that we are not talking about a widely accepted scientific ‘fact’. The experiment you mention does not *at all* prove that information can indeed be communicated faster than the speed of light. Here is a quote from the article about what it ‘proves’ (assuming nothing can be faulted about the experiment and its interpretation!):
    “Taking advantage of the Earth’s rotation, the configuration of our experiment allowed us to determine, for any hypothetically privileged frame, a lower bound for the speed of this spooky influence. For instance, if such a privileged reference frame exists and is such that the Earth’s speed in this frame is less than 10e-3 that of the speed of light, then the speed of this spooky influence would have to exceed that of light by at least 4 orders of magnitude.”
    I assure you that I am not out of touch by a decade with the research results. No breakthrough has been made recently. Nielsen&Chuang (“Quantum computation and quantum information”) state very clearly that “faster than light” anything is just not possible. I have seen nothing else than theoretical fluff ‘proving’ otherwise. So, the possibility that finance would someday make trades in the past is still as ruled out by physics as it ever was. We can focus on more practical issues. Sadly, no one has commented on my suggestion to just place a delay on the trades, although it is much much simpler to comprehend than quantum entanglement 🙂
    Cheers.

  • Jacques,
    Get a grip.
    The backwards-in-time comment was a *joke*, and as I tried to make clear, the results from quantum information theory and relativity physics are NOT contradicting each other.
    First you come with a physics forum which is largely populated with hobbyists and people advocating their own theories. Then you refer to a 675-page book that allegedly proves your claim, which does so in chapter 2.4.3 with a “theoretical proof” concerning the intermingling of the classical channel and quantum channel involved in “quantum teleportation” alongside a classical communications channel. This has indeed been proven several years ago, where the quantum information flow acts as a sort of controller for the classical information flow.
    Does that match up with or contradict the textbook? Is it OK, is it fluff, and does it prove your stance in a reductio ad absurdum way? I’m working on other things at the moment and don’t have much time to look into scientific articles which prove or disprove your stance. Do check out http://arxiv.org/find/all/1/all:+superluminal/0/1/0/all/0/1
    There are 889 papers using the word ‘superluminal’; I am terribly sorry that I don’t know them by heart. Either way, general relativity doesn’t say there is a maximum speed to events in the universe.
    Anyway, it seems to be a big issue for you, not only being right but being acknowledged for being so. I really cannot say that you are, even if you insist on wasting more time on a futile discussion which unfortunately reminds me of Oliver Wendell Holmes’ idea of the “hydrostatic paradox of controversy”:
    “You know that, if you had a bent tube, one arm of which was of the size of a pipe-stem, and the other big enough to hold the ocean, water would stand at the same height in one as in the other. Controversy equalizes fools and wise men in the same way. And the fools know it.”
    Have fun
    Paul

  • Jacques Bayens

    Paul,
    I wonder why you are responding in this flippant tone. Did I hit a nerve? Let us stick to the issue under discussion. The only claim I am making is that practical faster-than-light information communication has never been achieved and never will be *unless* a revolution in theoretical physics takes place. The number of hits on ‘superluminal’ is irrelevant and I’m afraid not a single one of these papers contains an experimental proof that can be repeated.
    You stated that “it is possible to exchange quantum information such as the polarization of a photon exceeding the speed of light”. Even that is not true in a general sense, as per the no-cloning theorem.
    Anyway, you don’t have to acknowledge I was right 🙂 Thanks for sharing some of your wisdom with your humble servant and self-professed fool.

  • Jacques,
    See how it works?
    Actually, it greatly annoys me, and maybe you don’t get told that often enough? Your manner of discussing is closer to rhetorical play than scientific exploration, and I think that is meaningless unless one is into power play.
    Science is an ongoing progression of lessons and learning events; the speed of entanglement has only been determined three years ago, and has nothing to do with violating the speed of light, as indicated in the article and research paper.
    If I have learned anything, it is that human ingenuity stops at nothing, and neither does the universe. This is a subject matter to which there is no clear answer, and many teams are working on creating an answer and making things work in manners unforeseen. I haven’t bothered to take stock of what the state of the art may be, or whether it is accepted as a ‘fact’.
    Maybe there is a no-cloning theorem, but what is happening with multiple exciton generation doesn’t seem to need it. No-cloning does allow for imperfect cloning, and that may be enough in some cases.
    Another area to look into is quantum corrals (http://arxiv.org/abs/cond-mat/0211607), with which IBM researchers have been able to ‘scratch’ circuits on a wavefunction AND retrieve information from it. Usually that would involve a collapse of the wave function, but with some ingenuity they managed to avoid a reduction to classicality, similar to how chlorophyll appears to stretch out the collapse phase to a duration several orders of magnitude longer.
    Also ‘t Hooft has been making great groundwork with alternative interpretations bypassing some of the limitations http://arxiv.org/abs/0909.3426
    But of course that is all theoretical fluff..

  • Jacques Bayens

    Paul,
    I have not implied that there wasn’t plenty of fascinating research on quantum computing, such as the areas you mention (thanks, will look into them). But as of today, I don’t think a single bit of information has been communicated at a speed faster than light in a non-controversial, repeatable experiment. Am I wrong on that? I’m all for human ingenuity, but I believe we should avoid hubris: man cannot actually change the “laws of nature”, and the hard limits of today cannot necessarily be overcome tomorrow; for example, traveling backwards in time is impossible and always will be, no matter how much effort we put into it; same thing about avoiding death. So – and I apologize if that offended you – I tend to give a lot of pushback whenever somebody seems to place too much trust in future developments that are still highly speculative.

  • Most of the factors suggested have been around before. Faster trading has been around since the year dot, as amply demonstrated when Nathan Rothschild bought the English economy in 1815.
    Formulas don’t make a jot of difference to the big picture of how it works, they simply add a little bit more efficiency. Sure, that efficiency can change the face of a market (e.g., Black-Scholes, Li), but it doesn’t make the market. People make the market, and blaming Li’s formula for killing Wall Street is like blaming the machine gun for WWI.
    The question is not how the CDOs were priced, but why wasn’t the expected standard of care taken?
    The one factor that is different is this: securitization. This innovation is far-reaching, far more influential on the structure of finance than say Black-Scholes, which is sometimes cited as the 20th century’s biggest innovation. And that’s for one and only one reason: securitization removes the necessity of banking in today’s financial world. Now, money can be safely and profitably moved across from individual savers to individual borrowers through markets, not banks. For the first time in history!
    The crux of the difference is that banks, in participating in the market for CDOs, no longer need to do banking. In the past, they were responsible for every loan they made until it was paid off. Now they are not; they don’t need to care about the quality of the loan because they can sell it.
    This is not a bad thing. It’s actually a great thing. It’s one of the most important developments in finance ever, it ranks up there with the inventions of double entry bookkeeping and the gold coin.
    But, while regulators, savers, and society at large fail to grasp the structural change in finance that securitization has wrought – that banks no longer need to care about banking – we will see problems.
    Such as the GFC. Where the banks piled into crap loans as if they didn’t care. Because they didn’t need to, they could sell it all off on a market. All the rest follows…

  • Jacques,
    I think we agree there, because I don’t go for any backwards-in-time scenarios either. It started to feel like the EU ministers denying the rise of prices with “it is not inflation”.. No, it was a whole bunch of opportunists raising their prices because, simultaneously with the introduction of the euro, most agreements on ceiling prices were let go.
    Anyway, I read R&D reports every day, and have done since discovering email, but don’t keep track of them. When mentioning the speed of quantum entanglement, the speed of light is only used as a yardstick to signify the order of magnitude, and assigning a large yet finite speed avoids mathematical/physical oddities such as “instantaneous”.
    As I tried to explain, entanglement seems to move faster, but it doesn’t violate causality. So, if we take an isolated situation, entanglement may be moving at a greater speed, but when it is ‘reprocessed’ again at the target all differences are evened out. So the quantum channel and classical channel appear to hold each other in balance. And this has recently been used to improve the accuracy of classical communication from some 80+% to 90+%; entanglement acts here as a sort of controller mechanism of guided decoherence. This is roughly the realm discussed in most textbooks, but it is more or less classical to quantum to classical. Which is great for teleportation purposes.
    The space for fiddling is about the size of a photon.. but then we get into strange setups such as Bose-Einstein condensates, which can compose a ‘super photon’, and other pairing and coupling phenomena which aren’t widely understood yet. The idea is to smear out an entangled photon and use that to pick up any changes.
    But even when using a wiretap – which was shown several years ago in the UK to use the resonance of an adjacent wire to copy about 40% of the information flowing through a quantum cryptographic network – what will happen if ten wiretaps are used and enough information is gathered to recompose a message? In most textbook samples the situations are highly simplified, and a communication channel is assumed to be like tickertape.. but that is not how interpretation works, as we read word by word and the “rset can be a toatl mses and you can sitll raed it wouthit porbelm.” (which parallels Gödel’s critique of Turing’s solution)
    Weirder things have happened at sea, but I don’t know how many years ago – I guess around 2005 – an experiment was reported concerning a laser beam directed at a supercooled gas, argon or xenon, and the light beam came out the other end just before actually hitting the near end. And then of course the light beam went in and out and rejoined with the ‘mirage’.
    Those are quite dramatic events though; I know of little research looking into Cherenkov radiation or even ways to look inside a black hole.
    Then again… I know several senior scientists who worked at Lockheed, IBM or others, and much of the current research isn’t really public. Too much money involved.. But this is where most of the fun is at.
    Quite some work is going on concerning re-coherence and group behaviour – the threshold in combinatorics in a spin network where the collective starts behaving like a manifold / continuous surface. Luckily some of it is public, such as the research from Austrian scientist Simon Gröblacher, which has some highly intriguing consequences.
    Whether any such breakthroughs have any significance for trade networks I don’t know; I would imagine not, considering the conservation of causality within the local realism domain.
    I don’t think it matters that much.
    Anyway, maybe yes, maybe no; science and technology are caught in a positive feedback loop and many predictions from the 80s and 90s are actually being pulled closer to the now. It makes ‘laws of nature’ seem more like ‘habits of nature’, and what were once thought of as eternal transcendental rules are now being reconsidered as to whether they might be better described as immanent evolutionary programs.
    Everything together, technology is moving so fast and set standards change so rapidly that, along with the usual market dynamics, it will ‘disintermediate’ banks within the foreseeable future, unless banks change their model.
    Take care
    Paul

  • Dean Procter

    I simply find it easier to know the future. By inventing it. Wrap your algo around that.

  • Hmmm, so you’re behind Tony Blair being called to trial as one of Berluscommedia’s witnesses to ‘prove’ bunga bunga parties are really old-fashioned decent fun?
    I would wrap it in a self-sustaining attractor, but despite that I must say it’s a very good stance. I like it.