History of the Cloud

This post was first created on 2011-Sep-01. 

Cloud is an evolving paradigm shift that will fundamentally change the way software and computing hardware technology is built, managed, delivered and consumed.

This paradigm shift is in an advanced stage of evolution. Over the next few years, computing and software technologies will become ubiquitous, reliable and available almost like electricity, at the click of a button. This opens up significant opportunities for entrepreneurs and consumers to create and consume technologies in ways that could only have been imagined until a few years ago.

To understand the evolution of the Cloud, it helps to first understand how any business or technology evolves.

The cycle of business activity starts with innovation, followed by customization and product development, before it turns into a utility-based service.

Business evolution is therefore directed towards commoditization, whereby products take the shape of services and the impact of the product becomes wide and convincing. [12]


The history of the evolution of the Cloud can be best understood by looking at how computing and software have evolved over the past few decades:

 

Evolution of Computing:

 

Charles Babbage

"Another age must be the judge" - Charles Babbage, 1837

Credit: ComputerHistory.org - Charles Babbage - 1837

Charles Babbage is generally credited with the invention of the first automatic computing engine, the Difference Engine No. 2.

Credit: ComputerHistory.org - Difference Engine No. 2

The first working model of the Difference Engine No. 2 was not built until 2002, and it is now housed at the Computer History Museum in Mountain View, California.[13] You can also see a working version on the ComputerHistory.org website: Working Babbage Engine.

The ENIAC and the Atanasoff-Berry Computer of the 1940s can well be considered the first computers in the world, under a narrow definition of early computing that covered any machine capable of electronically processing steps like conditional branching and nested subroutines. [14]

Due to the ENIAC and ABC patent dispute, computers as such were made un-patentable.

 

IBM mainframes[15] with OS/360, and minicomputers like the DEC PDP-8, were surely the next most significant computing advancements, arriving in the 1960s.

IBM OS/360 was a batch-processing operating system developed by IBM in 1964.

Credit: lbdsoftware.com - OS/360 Chart

Did you know you can download OS/360 for free, and that it can run on your home Windows, Mac or Linux computer![16]

The Cray-1, widely regarded as the first commercially successful supercomputer, was designed by Seymour Cray[17] in the mid-1970s, at almost the same time as the birth of 'Personal Computing' with the Apple II, followed shortly thereafter by the IBM PC in the early 1980s.

 

The 90s saw significant advancements in personal mobile computing (e.g. the Palm Pilot) and in the provision of large-scale computing services that powered email (e.g. Hotmail and Rocketmail) and search engines (e.g. AltaVista and Yahoo), which were fuelled by, and in turn fuelled, the growth of the internet.

 

Internet Diagram

Credit: Internet - Made by Bob Hinden of Bolt, Beranek and Newman, for the US government.

 

The ability to continually miniaturize and pack more transistors onto silicon[18] took us from one transistor on a chip in the 1950s to about six hundred million transistors on a 64-bit microprocessor by the early 2000s.[19]

Moore's Law

Credit: http://www.ausairpower.net

 

Moore’s law describes a long-term trend in the history of computing hardware: the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years. This trend has continued for more than half a century and is expected to continue until at least 2015 or 2020.
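Moore’s law boils down to simple arithmetic. A minimal sketch (the 1971 baseline of roughly 2,300 transistors, the Intel 4004, is used here only as an illustrative starting point):

```python
def transistor_estimate(year, base_year=1971, base_count=2300):
    """Estimate transistor count, assuming a doubling every two years."""
    doublings = (year - base_year) / 2
    return int(base_count * 2 ** doublings)

print(transistor_estimate(1975))          # two doublings: 9200
print(f"{transistor_estimate(2004):,}")   # on the order of hundreds of millions
```

Even from a few thousand transistors, sixteen or so doublings reach the hundreds of millions seen on 64-bit microprocessors in the early 2000s.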

 

Evolution of Software:

The earliest form of programming in the modern world can be traced to Jacquard’s loom, which was operated using punched cards, the same technology later adapted by IBM for its first computers.

Jacquard Loom

Credit: Picture of such a programmable Jacquard loom by George P. Landow

John von Neumann, in the 1940s, came up with the concepts of the shared program technique and conditional control transfer, which led to the EDVAC and can be considered the first example of modern programming. [20]

Conditional Control Transfer is a simple concept that says you can use simple subroutines instead of one large program. This allows the programmer to use the same subroutine to perform multiple functions depending on the machine, context or inputs. This cuts down the number of lines in a program and makes it more efficient. The concept of subroutines finally evolved into what we commonly know as Software Libraries.
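The idea above can be sketched in a few lines of Python (all names and the scaling factors are illustrative, not from any historical program): one shared subroutine serves several callers, and which path runs is chosen conditionally based on the input.

```python
# One shared subroutine replaces several copies of the same logic.
def scaled_total(values, factor):
    """Sum the values, then scale the result."""
    return sum(values) * factor

# Conditional control transfer: which variant runs depends on the input,
# but all variants reuse the same subroutine.
OPERATIONS = {
    "net": lambda vals: scaled_total(vals, 1.0),
    "with_tax": lambda vals: scaled_total(vals, 1.08),  # illustrative rate
    "half_price": lambda vals: scaled_total(vals, 0.5),
}

def process(kind, values):
    return OPERATIONS[kind](values)

print(process("net", [10, 20]))         # 30.0
print(process("half_price", [10, 20]))  # 15.0
```

A library is just this pattern at scale: a collection of shared subroutines that many programs transfer control into and back out of.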

The term GL stands for ‘Generation Language’ and is generally used to denote the generation of the programming language.

Software programming has evolved over the past five decades through five generations of programming languages, each time moving to a higher level of abstraction so as to reduce the programming effort and time required to create software.
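The jump in abstraction between generations is easiest to see side by side. A rough sketch (the assembly listing is illustrative pseudo-assembly, not any specific instruction set):

```python
# Adding two numbers at different generations of abstraction.

# 2GL (illustrative pseudo-assembly): the programmer manages registers.
#   LOAD  R1, a
#   ADD   R1, b
#   STORE R1, result

# 3GL and above (Python): one line expresses the same intent.
def add(a, b):
    return a + b

print(add(2, 3))  # 5
```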

GLs

Credit: Watalon.com Generation of Languages GL1 to GL5

 

GL Examples

Credit: Watalon.com Examples of Languages from each Generation G1 to G5

 

James Gosling invented Java, released in 1995, which enabled software to be truly free of the underlying hardware through the use of an intermediate layer called the Java Virtual Machine.

 

Credit: geekinzer.com blog post by Tarandeep Singh

 

Evolution of the Cloud:

For those of us who are familiar with Watal’s Innovation Curve, we have used a derivative of that curve to explain the innovation curve that led to the formation of the Cloud.

Watals Innovation Curve for the Cloud

Credit: Watalon.com - Innovation Curve of the Cloud

Phase 1 Innovation: Computing as a field emerges

The way to read this curve is to start at the bottom. Computing can be thought of as the first innovation, e.g. the Difference Engine No. 2, a single-purpose machine used for computation.

Phase 2 Customization: Software and Hardware become distinct layers

The base concept of the innovation, i.e. the computing machine, then broke down into more discrete layers of computing software and computing hardware: the Jacquard Loom, whose punched cards were a mechanism to program (software) separate from the loom itself (hardware), or the OS/360 mainframes and the programming languages used with them.

Phase 3 Productization: Packaged Software and Hardware

Over time the hardware became more discrete. With the advent of personal computers, the need for bundling and packaging led to what were commonly known as software and hardware products. For example, a product called a Personal Computer was made up of components like the CPU, RAM, hard drives and floppy drives. Software similarly became a packaged set of libraries, e.g. the Windows operating system is a set of packaged DLLs (Dynamic-Link Libraries).

Phase 4 Utilitization: Computing becomes a daily necessity and is now available on tap, like electricity

Computing and software have evolved into finer, more discrete layers over the past two decades, with almost all the layers allowing for some level of virtualization, a mechanism that allows resources to be shared rather than dedicated.

While the Cloud can be looked at from many perspectives, including that of the end user who consumes services like Web, Mobile, Video, Messaging and Documents through a massively scalable service, for the purposes of our discussion we will look at it purely from a technological evolution point of view.

Those of us who saw the evolution of the internet from its early days till today will be very familiar with the chart below, which captures a more recent set of steps that led to the evolution of the Cloud.

Cloud_evolution_near_term

Credit: Watalon.com - 1995 to 2011 Evolution of the Cloud

The rapid growth of computing in the 1990s, coupled with the Dot Com boom and the Y2K effect, led to a significant growth in computing hardware resources, which almost suddenly became underutilized due to the economic downturn at the turn of the millennium. This created a need for server consolidation, which led to the advent of virtualization.

The largest virtualization trend is operating system virtualization. In the early days, OS virtualization meant hosted (Type 2) virtualization, which involved installing a hypervisor[21] on top of a host operating system to partition it, and then installing multiple guest operating systems in those partitions.

These multiple software layers took away a significant portion of the economic benefit of server consolidation. This led to the advent of Type 1 hypervisors, like Hyper-V from Microsoft, which run directly on the hardware.

Types of Hypervisors Type1 and Type 2

Credit: Types of Hypervisors - Author: Scsami

 

Operating system virtualization is particularly valuable if your main objective is server consolidation.

Virtualization as a trend has proliferated across higher layers of the software stack including Databases and Applications.

 

Virtualization

Credit: Watalon.com - Virtualization is no longer limited to the Operating System

 

Furthermore, virtualization has enabled the decoupling of the different software and computing layers. This loose coupling, along with interoperability, led to the ease of packaging and bundling of services using different combinations of these layers, thereby providing solutions like Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS), which can be thought of as the packaging of the various fragmented layers of computing and software.

The past few years have seen significant improvements in data storage.

NoSQL data stores like Cassandra are designed to scale out with no practical limit on data size. Today there are NoSQL data stores like MongoDB that allow entire files to be stored inside the database.

Cassandra is open-source software created at Facebook by Avinash Lakshman and Prashant Malik. Below, Avinash Lakshman, former Facebook employee, presents NOSQL – Cassandra (from martind on Vimeo).

Reliability has improved, with service providers willing to offer more than four nines (99.99%) availability owing to technologies like auto-provisioning and on-demand compute and storage.
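"Four nines" translates into a concrete downtime budget, which a quick calculation makes vivid:

```python
def downtime_minutes_per_year(availability):
    """Minutes of allowed downtime per year at a given availability level."""
    minutes_per_year = 365.25 * 24 * 60  # 525,960 minutes
    return minutes_per_year * (1 - availability)

print(round(downtime_minutes_per_year(0.9999), 1))  # four nines: ~52.6 min/year
print(round(downtime_minutes_per_year(0.999), 1))   # three nines: ~526.0 min/year
```

Each extra nine cuts the allowed downtime by a factor of ten, which is why four-nines service levels demand automated provisioning rather than manual recovery.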

Computing and software are now close to an epochal point where they can become a true utility.

Connectivity and bandwidth also grew over the past decade. Wireless and mobile devices supporting email, messaging and other web-based applications became prolific, enhancing network utilization. The World Wide Web transformed from a simple communication mechanism that allowed us to retrieve data (Web 1.0) into a medium providing rich, interactive data sharing and collaboration (Web 2.0).

This also signifies a complete shift in users' mindshare. People have accepted web applications in their day-to-day lives, and mobile communication has become a vital need for all, thereby creating the need for the Cloud as a utility.

This acceptance by people fuels demand for more and more computing and software. Maturity in computing and software technologies further fuels supply, driving them towards total 'ubiquity and certainty', in the words of Simon Wardley. The Cloud is the paradigm shift being created as computing and software pass through this vortex of rapidly growing supply and demand into a mature utility.

Origin of the term Cloud

It is important to understand the use of the Cloud metaphor. The relevance of the term can be best understood by looking at it as an interface between technology and its users. Users work on their computers, desktops, laptops or other mobile devices and connect to a single network which connects all applications and data. This network is called a Cloud, based on the cloud drawing used in the past to represent a telephone network.[22]

 

  [12] Video excerpts from the 2010 OSCON presentation Situation Normal, Everything Must Change by Simon Wardley, Researcher at the Leading Edge Forum of CSC.
  [13] Babbage Engine tour at the Computer History Museum.
  [14] Due to the ENIAC and ABC patent dispute, computers as such were made un-patentable. The ENIAC website (http://the-eniac.com/first/) discusses whether ENIAC or the Atanasoff-Berry Computer (ABC) was the first computer.
  [15] The IBM Dictionary of Computing defines "mainframe" as "a large computer, in particular one to which other computers can be connected so that they can share facilities the mainframe provides (for example, a System/370 computing system to which personal computers are attached so that they can upload and download programs and data). The term usually refers to hardware only, namely, main storage, execution circuitry and peripheral units."
  [16] OS/360 is in the public domain and can be downloaded freely.
  [17] Seymour Cray, the father of supercomputing (video).
  [18] Transistor density on integrated circuits doubles every two years, as per Moore's law.
  [19] Moore's Law: Raising the Bar – Intel Corporation, 2005.
  [20] Read here to learn more about how conditional control transfer eventually evolved into what we commonly know today as software libraries.
  [21] Software one level higher than the supervisor (the part of the OS that directly controls the hardware).
  [22] http://www.sellsbrothers.com/writing/intro2tapi/default.aspx?content=pstn.htm
