Jan 5, 2010

Cloud (net) computing

The author: Professor Yasser Metwally

http://yassermetwally.com


INTRODUCTION

Cloud computing is an emerging technology that uses the Internet and central remote servers to maintain data and applications. It allows consumers and businesses to use applications without installation and to access their personal files from any computer with Internet access. The technology enables much more efficient computing by centralizing storage, memory, processing, and bandwidth. Cloud computing is broken down into three segments: "applications," "platforms," and "infrastructure." Each segment serves a different purpose and offers different products for businesses and individuals around the world. In June 2009, a study conducted by Version One found that 41% of senior IT professionals do not know what cloud computing is, and that two-thirds of senior finance professionals are confused by the concept, highlighting how young the technology still is. In September 2009, an Aberdeen Group study found that disciplined companies achieved, on average, an 18% reduction in their IT budget from cloud computing and a 16% reduction in data center power costs.

Video 1. Understanding Cloud computing

Cloud computing is Internet-based ("cloud"-based) development and use of computer technology ("computing"). In concept, it is a paradigm shift whereby details are abstracted from users, who no longer need knowledge of, expertise in, or control over the technology infrastructure "in the cloud" that supports them. Cloud computing describes a new supplement, consumption, and delivery model for IT services based on the Internet, and it typically involves the provision of dynamically scalable and often virtualized resources as a service over the Internet.

The term cloud is used as a metaphor for the Internet, based on the cloud drawing used to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents. Typical cloud computing providers deliver common business applications online which are accessed from a web browser, while the software and data are stored on servers.

Video 2. Understanding Cloud computing

These applications are broadly divided into the following categories: Software as a Service (SaaS), Utility Computing, Web Services, Platform as a Service (PaaS), Managed Service Providers (MSP), Service Commerce, and Internet Integration.

  • Comparisons

Cloud computing can be confused with:

1. Grid computing — "a form of distributed computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks";

2. Utility computing — the "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity";

3. Autonomic computing — "computer systems capable of self-management".

Indeed, many cloud computing deployments depend on grids, have autonomic characteristics, and bill like utilities, but cloud computing tends to expand what is provided by grids and utilities. Some successful cloud architectures have little or no centralized infrastructure or billing systems whatsoever, including peer-to-peer networks such as BitTorrent and Skype, and volunteer computing.

  • Characteristics

In general, cloud computing customers do not own the physical infrastructure; instead they avoid capital expenditure by renting usage from a third-party provider. They consume resources as a service and pay only for the resources they use. Many cloud-computing offerings employ the utility computing model, which is analogous to how traditional utility services (such as electricity) are consumed, whereas others bill on a subscription basis. Sharing "perishable and intangible" computing power among multiple tenants improves utilization rates, as servers are not left idle unnecessarily (which can reduce costs significantly while increasing the speed of application development). A side effect of this approach is that overall computer usage rises dramatically, because customers no longer have to engineer for peak load. In addition, "increased high-speed bandwidth" makes it possible to receive the same response times from centralized infrastructure at remote sites.
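The multi-tenant utilization argument above can be illustrated with simple arithmetic. The sketch below uses hypothetical tenant loads (the numbers and names are invented for illustration): because each tenant's demand peaks at a different time, the pooled peak is far below the sum of the individual peaks, so a shared provider needs far fewer servers than the tenants would buy separately.

```python
# Hypothetical sketch of the multi-tenancy utilization argument.
# Each tenant's demand peaks in a different time slot, so the pooled
# peak is much lower than the sum of individual peaks.

tenants = {
    "tenant_a": [2, 2, 8, 2],  # servers needed in four time slots
    "tenant_b": [1, 6, 1, 1],
    "tenant_c": [1, 1, 1, 7],
}

# Dedicated model: each tenant buys enough servers for its own peak.
dedicated_capacity = sum(max(load) for load in tenants.values())  # 8+6+7 = 21

# Shared model: the provider sizes for the combined peak across slots.
combined = [sum(loads) for loads in zip(*tenants.values())]  # [4, 9, 10, 10]
shared_capacity = max(combined)  # 10

print(dedicated_capacity, shared_capacity)  # 21 10
```

With these illustrative loads, pooling cuts the required capacity by more than half, which is the mechanism behind the cost and utilization claims above.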

  • Economics

Cloud computing users can avoid capital expenditure (CapEx) on hardware, software, and services because they pay a provider only for what they use. Consumption is usually billed on a utility basis (resources consumed, like electricity) or a subscription basis (time-based, like a newspaper), with little or no upfront cost. Other benefits of this time-sharing-style approach are low barriers to entry, shared infrastructure and costs, low management overhead, and immediate access to a broad range of applications. In general, users can terminate the contract at any time (thereby avoiding return-on-investment risk and uncertainty), and the services are often covered by service level agreements (SLAs) with financial penalties.
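The utility-versus-subscription trade-off described above can be sketched with simple arithmetic. The function names and prices below are hypothetical, chosen only to show that metered billing tends to favor bursty workloads while a flat rate favors steady ones:

```python
# Hypothetical billing sketch: metered (utility) vs. flat-rate
# (subscription) pricing. All prices are illustrative only, not any
# provider's actual rates.

def utility_cost(hours_used, rate_per_hour):
    """Pay only for resources consumed, like a metered electricity bill."""
    return hours_used * rate_per_hour

def subscription_cost(months, monthly_fee):
    """Pay a flat fee per period, like a newspaper subscription."""
    return months * monthly_fee

# Bursty workload: one server for only 200 hours in a month.
bursty_metered = utility_cost(hours_used=200, rate_per_hour=0.25)  # 50.0
flat_rate = subscription_cost(months=1, monthly_fee=60.0)          # 60.0

# Steady workload: one server running the whole month (~720 hours).
steady_metered = utility_cost(hours_used=720, rate_per_hour=0.25)  # 180.0

print(bursty_metered, flat_rate, steady_metered)  # 50.0 60.0 180.0
```

Under these invented rates, metered billing is cheaper for the bursty workload and more expensive for the steady one, which is why the choice between the two models depends on the usage pattern.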

According to Nicholas Carr, the strategic importance of information technology is diminishing as it becomes standardized and less expensive. He argues that the cloud computing paradigm shift is similar to the displacement of electricity generators by electricity grids early in the 20th century.

Although companies might be able to save on upfront capital expenditures, they might not save much and might actually pay more for operating expenses. In situations where the capital expense would be relatively small, or where the organization has more flexibility in their capital budget than their operating budget, the cloud model might not make great fiscal sense. Other factors impacting the scale of any potential cost savings include the efficiency of a company’s data center as compared to the cloud vendor’s, the company's existing operating costs, the level of adoption of cloud computing, and the type of functionality being hosted in the cloud.

  • Architecture

The majority of cloud computing infrastructure, as of 2009, consists of reliable services delivered through data centers and built on servers. Clouds often appear as single points of access for all consumers' computing needs. Commercial offerings are generally expected to meet quality of service (QoS) requirements of customers and typically offer SLAs.  Open standards are critical to the growth of cloud computing, and open source software has provided the foundation for many cloud computing implementations.

  • History

The Cloud is a term that borrows from telephony. Up to the 1990s, data circuits (including those that carried Internet traffic) were hard-wired between destinations. Then, long-haul telephone companies began offering Virtual Private Network (VPN) service for data communications. Telephone companies were able to offer VPN-based services with the same guaranteed bandwidth as fixed circuits at a lower cost because they could switch traffic to balance utilization as they saw fit, thus utilizing their overall network bandwidth more effectively. As a result of this arrangement, it was impossible to determine in advance precisely which paths the traffic would be routed over. The cloud symbol was used to denote that which was the responsibility of the provider, and cloud computing extends this to cover servers as well as the network infrastructure.

The underlying concept of cloud computing dates back to the early 1960s, when John McCarthy opined that "computation may someday be organized as a public utility"; indeed, it shares characteristics with the service bureaus of that era. In 1997, the first academic definition was provided by Ramnath K. Chellappa, who called it a computing paradigm in which the boundaries of computing are determined by economic rationale rather than technical limits. The term cloud had already come into commercial use in the early 1990s to refer to large Asynchronous Transfer Mode (ATM) networks.

Loudcloud, founded in 1999 by Marc Andreessen, was one of the first to attempt to commercialize cloud computing with an Infrastructure as a Service model. By the turn of the 21st century, the term "cloud computing" began to appear more widely, although most of the focus at that time was limited to SaaS, whose vendors were called "application service providers" (ASPs) in the terminology of the day.

In the early 2000s, Microsoft extended the concept of SaaS through the development of web services. IBM detailed these concepts in 2001 in the Autonomic Computing Manifesto, which described advanced automation techniques such as self-monitoring, self-healing, self-configuring, and self-optimizing in the management of complex IT systems with heterogeneous storage, servers, applications, networks, security mechanisms, and other system elements that can be virtualized across an enterprise.

Amazon played a key role in the development of cloud computing by modernizing its data centers after the dot-com bubble; like most computer networks, they were using as little as 10% of their capacity at any one time, simply to leave room for occasional spikes. Having found that the new cloud architecture yielded significant internal efficiency improvements, whereby small, fast-moving "two-pizza teams" could add new features faster and more easily, Amazon began providing access to its systems through Amazon Web Services on a utility computing basis in 2006. This account of the genesis of Amazon Web Services has been described as an extreme oversimplification by a technical contributor to the project.

In 2007, Google, IBM, and a number of universities embarked on a large-scale cloud computing research project. By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them", and observed that "organizations are switching from company-owned hardware and software assets to per-use service-based models", so that the "projected shift to cloud computing ... will result in dramatic growth in IT products in some areas and in significant reductions in other areas."

In July 2008, HP, Intel Corporation, and Yahoo! announced the creation of a global, multi-data-center, open source test bed called Open Cirrus [1], designed to encourage research into all aspects of cloud computing, service, and datacenter management. Open Cirrus partners include the NSF, the University of Illinois (UIUC), the Karlsruhe Institute of Technology, the Infocomm Development Authority (IDA) of Singapore, the Electronics and Telecommunications Research Institute (ETRI), the Malaysian Institute for Microelectronic Systems (MIMOS), and the Institute for System Programming at the Russian Academy of Sciences (ISPRAS).

  • Political issues

The Cloud spans many borders and "may be the ultimate form of globalization." As such, it becomes subject to complex geopolitical issues, and providers are pressed to satisfy myriad regulatory environments in order to deliver service to a global market. This dates back to the early days of the Internet, when libertarian thinkers felt that "cyberspace was a distinct place calling for laws and legal institutions of its own".

Despite efforts (such as US-EU Safe Harbor) to harmonize the legal environment, as of 2009, providers such as Amazon cater to major markets (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select "availability zones". Nonetheless, concerns persist about security and privacy at levels from the individual through the governmental (e.g., the USA PATRIOT Act, the use of national security letters, and the Electronic Communications Privacy Act's Stored Communications Act).


1. "Cloud Computing: Clash of the clouds". The Economist. 2009-10-15. http://www.economist.com/displaystory.cfm?story_id=14637206. Retrieved 2009-11-03.
