Internet Infrastructures and Technologies (continued)

Harnessing the latent power of distributed, interconnected computing systems and building the aggregated capacity of “virtual organizations” is the vision of the scientists and researchers behind so-called Grid technologies. It is also the vision of less community-minded individuals and organizations, who periodically look for ways to exploit the trust that underlies the willingness of individuals and organizations to link their computers to the Internet. For example, in 2001 the US Internet service provider Juno offered free Internet access to its four million subscribers. How many of them checked the small print that entitled Juno to free use of their unused processor power, which Juno proposed to rent out to biotech companies? (www.biotech.about.com/library/weekly/aa_juno.htm) And in 2002, peer-to-peer file swappers were offered a new program, Kazaa, free of charge. Its terms of use, which not all potential users read carefully, included a clause granting the right to use, without compensation, their unused processor power and storage space; as with Juno, the aggregated capacity was to be rented out. Many of us access the Internet from home and from work. In the latter case, by agreeing to such clauses we may be granting free use of all the processors and storage on our organization's network. Few network managers would be happy with this.

The Threat of Parasitic Computing. Even more controversially, ways now exist to gain access to your processor power without telling you. An example is known as parasitic computing, in which “servers unwittingly perform computation on behalf of a remote node. In this model, one machine forces target computers to solve a piece of a complex computational problem merely by engaging them in standard communication.” (Vincent Freeh, University of Notre Dame, www.nd.edu/~parasite)

This works by ingenious use of the standard set of protocols that ensures reliable communication on the Internet and in most private networks. Current implementations of parasitic computing are not efficient, so at present we need not worry. But this has the potential to transform the Internet. While Professor Freeh and his colleagues have developed ways to spot parasitic computing, he points out that “its existence raises important questions about the ownership of the resources connected to the Internet and challenges current computing paradigms.”
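In the demonstration reported by Freeh and his colleagues, candidate solutions to a satisfiability problem were encoded in TCP packets so that only a satisfying candidate produced a valid checksum; the target's routine checksum verification then performed the evaluation, and only valid packets drew a reply. The sketch below is a minimal, purely local simulation of that division of labour: it keeps the Internet checksum and the parasite/target roles but substitutes a toy puzzle (recovering a single 16-bit word) for the satisfiability encoding. The function and variable names are illustrative assumptions, not taken from the researchers' code, and no network traffic is involved.

# A minimal local simulation of the parasitic-computing idea (hypothetical
# names; a toy puzzle stands in for the published satisfiability encoding).

def ones_complement_sum(words):
    """16-bit ones'-complement sum, as used by the Internet checksum."""
    total = 0
    for w in words:
        total += w
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return total

def checksum(words):
    """Checksum field value: the complement of the sum of the other words."""
    return (~ones_complement_sum(words)) & 0xFFFF

# The "target" does nothing unusual: it verifies the checksum, as any TCP/IP
# stack would, and replies only to packets that verify.
def target_receives(packet_words):
    return ones_complement_sum(packet_words) == 0xFFFF

# The "parasite" wants to discover SECRET. It crafts one packet per candidate,
# built so that the packet verifies only if the candidate equals SECRET; the
# target's ordinary checksum check therefore evaluates each candidate for it.
SECRET = 0x2B67                        # the unknown the parasite is after
header = [0x4500, 0x0028, 0x1C46]      # arbitrary fixed "header" words
puzzle_checksum = checksum(header + [SECRET])   # defines the toy puzzle

for candidate in range(0x10000):
    packet = header + [candidate, puzzle_checksum]
    if target_receives(packet):        # a reply means this candidate is correct
        print(f"target replied: 0x{candidate:04X} solves the puzzle")
        break

In this toy form the puzzle is trivially solvable locally, which underlines the point about efficiency: one packet must be crafted and sent per candidate, and the parasite receives only a single bit of information (reply or silence) in return for each.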

It should be stated up front that while “the Grid” is a handy common term for the ultimate supercomputer spanning the entire globe, there are actually many lesser grids being developed, some of which are targeted at e-knowledge and e-learning. In a way, the same can be said of the Web.

TeraGrid is a cooperative effort “to build and deploy the world’s largest, fastest, most comprehensive, distributed infrastructure for open scientific research. When completed, the TeraGrid will include 13.6 teraflops of Linux Cluster computing power distributed at the four TeraGrid sites, facilities capable of managing and storing more than 450 terabytes of data, high-resolution visualization environments, and toolkits for grid computing. These components will be tightly integrated and connected through a network that will initially operate at 40 gigabits per second and later be upgraded to 50-80 gigabits/second—16 times faster than today’s fastest research network.”

www.teragrid.org

A number of important initiatives aimed at standardizing Grid efforts include the Globus Toolkit™, an open-source suite of standard protocols that serves as a reference implementation architecture (that is, best-practice guidelines) for a variety of e-science initiatives, and the Global Grid Forum.


“Close to a decade of focused R&D and experimentation has produced considerable consensus on the requirements and architecture of Grid technology.”

Ian Foster