A revolution in the sharing of knowledge…

Transforming e-Knowledge  
© SCUP 2003

Internet Infrastructures and Technologies (continued)


Chapter 4

Technologies, Standards, and Marketplaces for e-Knowledge



Working at his desk, he verbally instructs his intelligent agent, a creation of software, to gather all the latest information on the use of distributed, ambient meeting environments among professional societies. He specifically asks for information on meetings of scientific societies, focusing on the topics of the session, the number of participants in different settings, length and nature of the interactions, learning outcomes, and relation to ongoing communities of practice on these topics. The agent accesses the digital library to refresh its knowledge of the semantics relating to ambient meetings and professional societies, then explores the Semantic Web, searching “tags” for the concepts it needs. The agent culls through the thousands of potentially useful examples, based on the explicit instructions from Elliott and implicit instructions drawn from past experience and searches by him and by other members of his community of practice. The agent reports its findings to Elliott, arrayed in preferred formats that have evolved from past searches.”

Making the Semantic Web Possible. For the Semantic Web to become transparent and ubiquitous in our lives in relation to e-knowledge, several things will need to happen. The standards and protocols supporting such exchanges will need to develop and achieve widespread acceptance. Repositories and marketplaces abiding by these standards and protocols will need to make bodies of knowledge available for exchange, repurposing, metering, and updating. Because of its richly interconnected semantic structures (such as embedding expertise with information), the Semantic Web provides the means to manage the ever-growing glut of information. Moreover, this capacity to use semantic structure to deal with content, context, and narrative will elevate the amount of expertise and learning tradecraft that can be communicated in exchanges involving learning objects.
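The machine-readable "tags" and interconnected semantic structures described above rest on the triple data model (subject, predicate, object) that underlies RDF. The sketch below is a toy in-memory triple store, not a real library API; the class name and the predicates (`rdf:type`, `dc:subject`, `hostedBy`) are illustrative stand-ins showing how an agent might match concepts rather than keywords:

```python
# A minimal sketch of Semantic Web "tags" as (subject, predicate, object)
# triples. All names here are illustrative, not part of any real standard.

class TripleStore:
    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        # None acts as a wildcard, loosely mimicking a SPARQL-style match.
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

store = TripleStore()
store.add("meeting:42", "rdf:type", "AmbientMeeting")
store.add("meeting:42", "dc:subject", "distributed environments")
store.add("meeting:42", "hostedBy", "scientific-society:7")

# An agent can now find every resource tagged as an ambient meeting,
# regardless of the words used in the resource itself:
ambient = store.query(predicate="rdf:type", obj="AmbientMeeting")
```

The point of the sketch is that the query matches on the semantic tag, not on text, which is what lets an agent cull thousands of candidates by concept.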


But with this new infrastructure, new interfaces between users and the Web will need to become genuinely usable, enabling entire communities of users to work more efficiently. And finally, users will need to develop both the skills and the habits of mind that let them seek and process knowledge far more effectively and rapidly than today. They will need to hone new skills in processing information about the knowledge they are seeking and in experiencing the knowledge they have acquired. Their knowledge quests will likely involve far richer patterns of interactivity with other individuals and communities of practice, in acquiring knowledge, communicating insights, and refining those insights.

The Grid

For some time, the research community has been looking at ways to link computers together, regardless of the distance between them, to create the equivalent of a single more powerful computer. This has progressed to the point where huge levels of computing power can be made available. For example, the US-based National Science Foundation (NSF) is developing a supercomputing grid that is scheduled for completion in 2003. It will be capable of performing 11.6 trillion calculations per second, all with a guaranteed quality of service.

A related development is the availability of software that enables groups of personal computers to tackle tasks that used to be restricted to supercomputers. Each personal computer works on just a small part of the overall task. The overall effect is to build an aggregated capacity that is equivalent in terms of raw computing power to a single, much larger computer but without a guaranteed quality of service.
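The divide-and-aggregate pattern described above can be sketched in a few lines. In this toy version, threads on one machine stand in for the networked personal computers, and the function names are illustrative, not drawn from any grid-computing package:

```python
# Sketch of the grid pattern: split a large task into small chunks,
# hand each chunk to a separate worker, then aggregate the results.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # The small piece of work assigned to one "personal computer".
    return sum(x * x for x in chunk)

def grid_sum_of_squares(data, workers=4):
    # Carve the data into roughly equal chunks, one per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Aggregate the partial results into the final answer.
        return sum(pool.map(partial_sum, chunks))
```

As the text notes, the aggregate matches a single large machine in raw capacity but not in guaranteed quality of service: any worker may be slow or disappear, which real grid schedulers must tolerate.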


Tapping an Underutilized Resource. Although the bulk of that work continues to focus on challenging research problems in science and engineering, the approaches and software developed by those researchers are of increasing relevance to the rest of us. For example, the NSF and the European Commission are collaborating in studies of how a Learning Grid might be established for widespread use. This offers the possibility of providing teachers and students with access to advanced computer simulations, of the kind that historically required a supercomputer, at little or no cost to their institution. What makes this possible? The necessary high levels of raw computing power are available today but are not being used. They reside in over a billion personal computers in use globally. Over the course of a day, each of those computers is likely to spend long stretches switched on but doing no useful work (for example, while its user is away from the desk). At a global level, this represents a huge waste of resources, which can be overcome by linking computers via the Internet. As yet, few organizations have recognized or anticipated this in their administrative procedures. Through a lack of understanding of the possibilities, and a fear of what might result if they allow their computers to be linked in this way, many remain resistant to sharing their organization's “untapped” computing power.
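The idle-cycle harvesting just described can be sketched as a client that processes donated work units only when its host appears idle. The `is_idle` check below is a deliberately fake stand-in; real volunteer-computing clients consult OS-level signals such as user activity and CPU load:

```python
# Sketch of cycle scavenging: work units wait in a queue and are
# processed only during "idle" moments of the host machine.
import queue

def is_idle(tick):
    # Hypothetical stand-in: pretend the machine is idle on even ticks.
    return tick % 2 == 0

def scavenge(work_units, ticks):
    pending = queue.Queue()
    for unit in work_units:
        pending.put(unit)
    results = []
    for tick in range(ticks):
        # Donate a slice of computation only when nobody needs the machine.
        if is_idle(tick) and not pending.empty():
            n = pending.get()
            results.append(n * n)  # the donated computation
    return results

done = scavenge([1, 2, 3, 4], ticks=6)  # idle on ticks 0, 2, 4
```

Because the owner keeps priority, only some work units finish in a given window; the grid's scheduler simply hands the leftovers to other idle machines.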


