
The Birth of Computing and Development of the Web

The last chapter of ValueWeb talks about what comes after the third generation internet: the internet of value.  This is the internet being built today, based upon shared ledgers, cloud, apps, APIs and analytics.  Of course, the next generation is the internet of things.  But what comes after that?

There is an answer, but you need to understand the shape-shifting of technology to really absorb the state of today and tomorrow.  Therefore, this week I thought I’d talk about the past, present and future of the internet, beginning with Internet 1.0: Building the Web.

The origins of the web start with the beginnings of computing itself.  I’m not going to linger on that too long, as hopefully you’ve seen The Imitation Game, with Benedict Cumberbatch in the role of Alan Turing, cracking the Enigma code during the Second World War (although Polish cryptanalysts had first broken Enigma almost a decade earlier).  Wars often stimulate progress – just look at aircraft design and development in the two world wars to get an idea of how fast technology develops during wartime – and the Second World War gave birth to electronic computing.

This was the ENIAC – the Electronic Numerical Integrator And Computer – developed in America to calculate artillery firing tables for the US Army (it was later put to work on everything from hydrogen bomb calculations to the first computerised weather forecast).  It was delivered in 1946 – as always with large computer projects in their early days, over time and over budget – and formed the origins of the first commercial computer company, EMCC.

After building the ENIAC at the University of Pennsylvania the inventors, J. Presper Eckert and John Mauchly, formed EMCC to build new computer designs for commercial and military applications. The company was initially called the Electronic Control Company, changing its name to Eckert–Mauchly Computer Corporation by the time it launched.

Eventually, their firm offered the UNIVAC, the UNIVersal Automatic Computer, the first of which was delivered to the US Census Bureau in 1951; its successors were workhorses of government and business computing through the 1960s.  Bearing in mind Moore’s Law – the number of transistors on a chip doubles roughly every two years, so computer power rises whilst cost falls – those systems were pretty basic.  In fact, you have more compute power in your Apple Watch today than the Apollo Guidance Computer that carried men to the moon, which is why we’re now talking about colonizing Mars as a real possibility.

It was during this period that computer power in private companies began to take off, with a spray of other firms entering the fray.  IBM became the biggest of these firms – having purchased the magnetic core memory patents of an incredible inventor, Dr. An Wang, who went on to build Wang Laboratories with the money IBM paid him – and became the safe company to buy from.  By the 1980s, the saying was that nobody ever got fired for buying IBM, and the result was that many of those competitors – DEC, Wang, ICL, Burroughs and Univac – eventually went by the wayside.  Interesting for a firm whose President supposedly dismissed computing as limited to a worldwide market of just five systems (although it is doubtful Thomas Watson actually said this).

However, IBM did dismiss another rising technology as irrelevant – the Personal Computer (PC) operating system – even though its own machine came to define that market.  In fact, a little-known fact is that it was Bill Gates’ mother, Mary, who made Microsoft what it is today.

Mary Gates was one of the first women to serve as a Director of a bank – the First Interstate Bank – and was later appointed to the Board of the United Way of America, where she became the first woman to lead it in 1983.

Her tenure on the National Board’s Executive Committee helped her son’s company at a crucial time.  In 1980, she discussed her son’s company with John Opel, a fellow United Way of America committee member.  John was then IBM’s President – soon to become its CEO and Chairman – and Mary told him that her son’s firm might be able to help with the new business IBM was developing.  A few weeks later, IBM took a chance and hired Microsoft to develop an operating system for its first personal computer.

The success of the IBM PC gave Microsoft a lift that made it the world’s largest software company.  Funny how things turn out.

Anyway, another giant of the technology of the time, Ken Olsen, who founded Digital Equipment Corporation (DEC), dismissed the PC, declaring in 1977 that “there is no reason for any individual to have a computer in his home”.  This is the same guy who dismissed Unix as “snake oil”, even though, back then, he was running one of the largest computer companies of the 1980s (Olsen was forced out of DEC in 1992 and the firm was acquired by Compaq in 1998).  No wonder his firm went by the wayside, as mentioned earlier.

Meantime, other technologies were developing around the PC, including the Transmission Control Protocol (TCP), the Internet Protocol (IP), and modulator-demodulators, otherwise known as modems.

Meanwhile, several leading figures were key to developing the modern internet, including Ivan Sutherland’s and Robert Taylor’s work on the ARPANET, the Advanced Research Projects Agency Network, and Kevin Kelly’s work on The WELL, which fed into the founding of Wired magazine.  But the stand-out figure has to be Sir Tim Berners-Lee (yes, I’m British and biased).

Tim is viewed by many as the founding father of the modern web, due to his development of the foundations that we use today: HTML, URIs and HTTP.

  • HTML: HyperText Markup Language. The markup (formatting) language for the web.
  • URI: Uniform Resource Identifier. A kind of “address” that is unique and used to identify each resource on the web. It is also commonly called a URL.
  • HTTP: Hypertext Transfer Protocol. Allows for the retrieval of linked resources from across the web.
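To make those three building blocks concrete, here is a small illustrative sketch in Python (my addition, not from the original post): it splits the address of the first web page ever published at CERN into its URI components, then assembles the kind of plain-text HTTP request an early browser would have sent to retrieve the HTML document behind it.

```python
from urllib.parse import urlparse

# The address of the first web page ever published, at CERN.
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"

# A URI/URL breaks down into named parts: the protocol (scheme),
# the host serving the resource, and the path identifying it.
parts = urlparse(url)
print(parts.scheme)   # http
print(parts.netloc)   # info.cern.ch
print(parts.path)     # /hypertext/WWW/TheProject.html

# HTTP itself is just lines of text: a browser opens a connection
# to the host and sends a request like this for the HTML document.
request = f"GET {parts.path} HTTP/1.0\r\nHost: {parts.netloc}\r\n\r\n"
print(request)
```

The point is how little machinery there is: an address that names a resource, a text protocol that asks for it, and a markup language that describes what comes back.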

Tim formally proposed these in a 1990 paper at CERN, the European particle physics laboratory in Geneva, where he had first worked in 1980 before returning in 1984.  The 1990 paper was an extension of what many consider the founding document of today’s web: “Information Management: A Proposal”, presented to CERN in March 1989.  Believe it or not, Tim’s initial proposal was not immediately accepted.  In fact, his boss at the time, Mike Sendall, wrote the words “vague but exciting” on the cover.  The web was never an official CERN project, but Mike managed to give Tim time to work on it, and that led to the breakthrough in 1990.

So the first generation of the modern internet was born in 1990, forty-five years after the birth of computing.  Since then, each generation of the internet has lasted about ten years.  In the 2000s, we saw Web 2.0; now we are developing Web 3.0, the internet of value; soon we will be entering the era of the internet of things, Web 4.0; and then, in the 2030s, we will be immersed in the Semantic Web, Web 5.0.

These are the five areas I’ll investigate in more depth over the course of this week, starting with Web 1.0: the first generation internet.

 

About Chris M Skinner

Chris Skinner is best known as an independent commentator on the financial markets through his blog, the Finanser.com, as author of the bestselling book Digital Bank, and Chair of the European networking forum the Financial Services Club. He has been voted one of the most influential people in banking by The Financial Brand (as well as one of the best blogs), a FinTech Titan (Next Bank), one of the Fintech Leaders you need to follow (City AM, Deluxe and Jax Finance), and one of the Top 40 most influential people in financial technology by the Wall Street Journal’s Financial News.
