Information systems (IS) are quickly emerging as critical resources to be leveraged for organizational productivity in many business, social, and economic enterprises. The explosive growth in information technology (IT) can be broadly attributed to the emerging novel linkages of IS/IT with several base disciplines, extending the reach of IS/IT to application domains never previously considered.
In this article, we focus on certain important and promising IS/IT frontiers identified from the perspectives of academia, industry, and federal research funding agencies. Our objective is to focus the collective awareness of the IS community and related disciplines on some of these frontier developments, offer a vision of the road ahead, and point to challenges and opportunities [1].
We address the following emerging frontiers in IS/IT: the integrated circuits (IC) revolution, object technologies, knowledge and distributed intelligence, evolving multiagent intelligent systems, satellite networks and mobile computing, and the virtual corporation of the next millennium. These frontiers capture significant areas of research and development with the potential to fundamentally alter the field of IS/IT in the next millennium.
One of the frontier areas of IS/IT and its interfacing disciplines is fundamentally rooted in hardware: semiconductor technologies and systems. This is an area where the cross-disciplinary impact of information science and technology has been substantial. A comprehensive yet simple perspective on this impact is the feedback cycle shown in Figure 1. The proactive elements of this loop start with hardware, primarily the ICs. When this hardware is combined with software, the collective impact of the two elements is greater than that of the individual parts. This synergy fuels further growth in both elements through positive feedback. Consequently, the overall growth from the enhanced integration of the elements is exponential, leading to what we term the generalized Moore's law. Essentially, this generalized law states that performance parameters of the combined hardware and software elements grow exponentially over time. Figure 2 illustrates the growth rates of nine critical performance measures [2].
The number of transistors integrated per chip has grown exponentially over time. Similarly, the computing power of the microprocessor, hard disk capacities, and PC file sizes exhibit exponential growth. All these phenomena illustrate the generalized Moore's law. Semiconductor fabrication costs have also grown exponentially. Finally, the growth in telecommunications traffic is much faster than exponential, a very significant difference.
The costs of storage on hard disk drives and CD-ROMs are also decreasing exponentially, and the software content of high-end consumer products exhibits roughly exponential growth. Taken together, these trends imply that in the last 25 years technology has advanced by about five or six orders of magnitude. As a corollary, it is reasonable to expect that human endeavors have grown correspondingly in their needs for computing, storage, and speed of information processing. While the growth here is significant, it is not exponential as in the case of the hardware technologies. This suggests the hypothesis that society will be so overwhelmed by such technological developments that the technical fabric will eventually collapse for lack of parallel growth in the human enterprise. Yet neither has the technology crashed nor has the human enterprise collapsed. We offer the following explanation of this paradox: the usefulness of a technology can be described as a logarithmic function of the growth in the technology. A simple example illustrates the point. A telephone directory on a PC today covers the phone numbers most relevant to its user, and its size probably ranges from a few kilobytes to a few megabytes at most. It likely contains 95% to 99% of the numbers the user is ever likely to call; bringing this to 100% would require listing every individual on earth, demanding capacity and memory many orders of magnitude beyond what we have today. Hence, even though the capacity of technology has risen by many orders of magnitude, the perceived usefulness grows only slightly, because usefulness compresses capacity logarithmically. The exponential growth of technology and the logarithmic improvement in usefulness together lead to a roughly linear progression of usefulness over time.
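To make this argument concrete, the following minimal Python sketch pits exponentially growing capacity against a logarithmic utility of that capacity. The doubling period and the log-utility model are our own illustrative assumptions, not figures taken from the article or Figure 2.

```python
import math

# Illustrative sketch of the "inverse Moore's law" argument: raw capacity grows
# exponentially, perceived usefulness grows roughly as the logarithm of capacity,
# so usefulness over time is approximately linear. The doubling period and the
# log-utility model below are assumptions for illustration only.

DOUBLING_PERIOD_YEARS = 1.5   # assumed Moore's-law-style doubling time
BASE_CAPACITY = 1.0           # arbitrary units at year 0

def capacity(years: float) -> float:
    """Exponential growth of a technology parameter (generalized Moore's law)."""
    return BASE_CAPACITY * 2 ** (years / DOUBLING_PERIOD_YEARS)

def usefulness(cap: float) -> float:
    """Assumed logarithmic utility of raw capacity (inverse Moore's law)."""
    return math.log2(cap + 1)

for year in range(0, 26, 5):
    cap = capacity(year)
    print(f"year {year:2d}: capacity x{cap:>12,.0f}  usefulness {usefulness(cap):5.1f}")

# Over 25 years, capacity rises by roughly five orders of magnitude, while the
# (assumed) logarithmic usefulness rises by a constant amount per year.
```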
In summary, from the point of view of the semiconductor industry, the limitations on technology adoption are not really in the technology factors at all. These limitations arise from an inverse Moore's law, which predicts logarithmic growth in the usefulness of technology. The major limiting factors are economics, the availability of skilled people to use and support these technologies, the complexity of validating the technologies, and the management of that complexity, to name a few. However, innovative ways to circumvent these limitations are emerging as the frontiers of IS/IT applications.
Object technology has been widely acclaimed as a revolution in business computing that will resolve many of the problems inherent in information systems development and management [5]. This is consistent with the Object Management Group's (OMG) vision of the future: a common standard for objects enabling the development of purchasable, sharable, and reusable information assets existing in a worldwide network of interoperable interorganizational information systems.
Object orientation is a different philosophy for (1) software and data organization, (2) systems analysis and design, (3) information resource management, and (4) information sharing. The object metaphor is powerful in all four of these areas. First, the metaphor has significant potential to revolutionize the way we think about developing, managing, organizing, and sharing software resources: encapsulation, code reuse, sharable frameworks, and the use of patterns to increase development productivity all have both realized and potential benefits. Second, the metaphor works only if analysis and design techniques can in fact identify an appropriate set of objects, classes, and capabilities that can be shared across multiple business applications. The development of object frameworks and design templates [3] begins to address this issue, but significant work remains before such techniques can be widely applied. Third, to enable such sharing, objects must be managed within the context of the organization. Object management must combine data administration and software development functions; these have traditionally been separated organizationally, and their combination may create significant management challenges. Finally, as a philosophy for information sharing, the paradigm of distributed objects implies that an object can exchange messages with any other object, independent of any locational or technical considerations. Distributed objects, together with Web technologies, yield radically new approaches to remote connectivity and distributed resource management in emerging IS/IT systems. While these benefits have been realized only in part, they represent the intrinsic potential of object technology.
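As a minimal illustration of the first and last of these points, the following Python sketch shows encapsulation (state reachable only through messages) and a toy proxy that forwards messages so the caller need not know where the object resides. The class and method names are hypothetical; this is not an OMG or CORBA interface.

```python
# A minimal sketch (hypothetical names, not an actual OMG/CORBA API) of two
# ideas above: encapsulation of state behind messages, and a proxy that lets a
# caller send those messages without knowing where the object lives.

class Account:
    """Encapsulated business object: state is private, behavior is the interface."""
    def __init__(self, balance: float = 0.0):
        self._balance = balance          # hidden state, reachable only via messages

    def deposit(self, amount: float) -> float:
        self._balance += amount
        return self._balance

class RemoteProxy:
    """Stands in for an object at another location; forwards messages by name."""
    def __init__(self, target):
        self._target = target            # in a real system: a network reference

    def send(self, message: str, *args):
        # Location transparency: the caller names only the message, not the host.
        return getattr(self._target, message)(*args)

# Caller code is identical whether the account is local or behind a proxy.
local = Account()
remote = RemoteProxy(Account())
print(local.deposit(100.0), remote.send("deposit", 100.0))
```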
Each aspect of the object metaphor has significant technological and organizational barriers to overcome before its potential benefits can become reality. Technical barriers include issues such as object identity, persistence, location, security, and the standardization of object capabilities, class names, messages, and protocols. Organizational barriers include object ownership and management; training; policies for reuse; object acquisition, development, and sharing; cost-benefit analyses; and migration from existing technologies. Consequently, object technology has not yet matured into the effective paradigm it was envisioned to be.
Six fundamental developments are needed in object orientation. First, organizations will need to cooperate globally and develop a common business vocabulary in order to achieve plug-in compatibility, messaging, and distributed, interoperable information systems built from objects. Second, objects must be managed. Simply using an object-oriented programming language or database management system without effective object management will reproduce the problems that exist in current IS organizations: redundancies in object implementations, object capabilities, and data, leading to development and maintenance nightmares. Third, object management must integrate data administration and application development; new IS organizational structures must be developed that include roles such as object librarians, reuse managers, and object managers. Fourth, object repositories need to be standardized into application development environments, replacing the plethora of CASE tools and representations that currently exist. Fifth, standardization of security, naming, messaging, object identities, and locations across global information networks is essential. Finally, capital investment in object technology is absolutely critical to realizing its full potential.
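A rough sketch of the object-management idea, under our own assumptions about what a repository might record, is shown below: a shared registry that enforces unique global names and lets developers check whether an object offering a given capability already exists before building a new one.

```python
# Toy object repository: our own illustration of object management, not a
# standard repository interface. It enforces unique global names and supports
# a simple reuse check by capability.

class ObjectRepository:
    def __init__(self):
        self._objects = {}               # global name -> (object, capabilities)

    def register(self, name, obj, capabilities):
        if name in self._objects:
            raise ValueError(f"duplicate object name: {name}")
        self._objects[name] = (obj, frozenset(capabilities))

    def find_by_capability(self, capability):
        """Reuse check: which registered objects already offer this capability?"""
        return [name for name, (_, caps) in self._objects.items() if capability in caps]

repo = ObjectRepository()
repo.register("finance/Account", object(), {"deposit", "withdraw"})
print(repo.find_by_capability("deposit"))   # -> ['finance/Account']
```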
The knowledge networks, as envisioned by the National Science Foundation (NSF), represent the next generation of networks of telecommunication systems, intelligent databases and collaborative technologies, and high-performance computing platforms. The NSF vision in this direction yields a convergence of research, development, and practice [4]. This vision and its implications include:
NSF has carried out considerable groundwork in these areas. Some of the major initiatives are the gigabit testbeds program, NSFnet, the high-performance connections program, the cross-agency initiative on digital library research, networking infrastructure for education, and the learning and intelligent systems initiative. As a result, several research threads have been identified and promoted. Some of the major emerging research threads on knowledge networks include:
Knowledge networks are vehicles of change. The increased speed and bandwidth of data communication and the masses of information transacted yield tremendous economies of scale. Integration of heterogeneous data and the expansion of media/knowledge domains and applications are leading to significant economies of scope. Combined with a new generation of users, the evolving information infrastructure will yield economies of change, moving toward more productive environments. However, several critical barriers must be overcome before this vision becomes a reality. Some of the major barriers are:
These barriers, however formidable, should be converted into significant research opportunities. First, the processes and dynamics of distributed intelligence are a critical research frontier. This area includes computational aspects such as knowledge aggregation and process coordination; cognitive aspects such as collective learning and teamwork; and the dynamic processes of adaptation and evolution of knowledge networks. Second, dealing with heterogeneity and interoperability in knowledge networks is an important problem; computational foundations, active distributed sensing, and computational/organizational infrastructures are the major research issues here. Third, the computational infrastructure itself leads to several specific issues with bounded scopes, such as secure network architectures, large-scale remote data acquisition, and distributed knowledge. Finally, the development of prototype knowledge networks, empirical assessments, the development of knowledge dissemination processes for sustainable use, and the social integration of knowledge networks with a wide range of user communities are major research opportunities.
Autonomous intelligent agents and multiagent technologies are the cornerstones of an emerging era in computing and communication. Intelligent multiagent systems for a variety of traditionally human endeavors, such as planning, negotiation, decision making, and cognitive association, are virtually redefining the ways in which communication and computing infrastructures are used worldwide. Numerous research issues arise in this area, spanning integrative developments that cut across disciplines such as cognitive science, computer science, telecommunications engineering, and numerous end-user domains. In particular, major developments include frameworks for distributed intelligence, representation and reasoning, models of perception, learning, and adaptation, and architectural designs for multiagent systems supporting a wide range of human enterprise. In the following discussion, we address some of the important emerging research frontiers that hold considerable theoretical attraction as well as practical significance.
Basic research on representational and modeling issues, such as formalisms, reasoning and negotiating models, human-agent interaction, and agent learning and adaptivity, is laying the foundations for higher-level constructs and theories such as cooperation, coordination, and conflict detection and resolution. Some of the functionalities embedded in these emergent technologies include the computational behavior of negotiating market agents, swarm intelligence leading to cooperative learning and communication, and gaming strategies. Potential areas where intelligent agents will play constructive and pivotal roles include hybrid human-computer systems such as computer-supported cooperative work, business applications, fielded intelligent systems performing information gathering and analysis, and computer gaming in virtual environments such as simulator testbeds in training systems. Furthermore, major research initiatives, both recent and ongoing, have given rise to several frameworks, languages, and software systems for intelligent agent development. While these technologies are in various stages of development, some of the state-of-the-art frameworks are:
Despite the rapid advances in this area, there are also several formidable research and development challenges; we outline some of them here.
The first is the ability to augment human intelligence with extended agent support. This task is compounded by the complexity of modeling human behavior and human decision making. Research in these areas should address multiagent architectures and perceptual modalities and systems from a learning and adaptation perspective. The second challenge pertains to reducing or managing the complexity of intelligent systems, including the difficulty of modeling and managing real-time uncertainties during agent operation. The third is the development of efficient human-machine interfaces so that human augmentation is well integrated; here, issues of sensing and control, specification verification, robustness of hybrid systems, and agent/human interaction become critically important. To cope with these issues, a wide variety of tools for computation, augmentation, and analysis must be developed. Together, these challenges provide development opportunities in modeling cognitive systems, high-speed computation, interagent connectivity, standardization of agent technologies, and multiagent architectures for a wide range of autonomous applications.
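As one concrete, highly simplified illustration of the market-style negotiation mentioned earlier, the following Python sketch runs a single contract-net-style round in which a manager announces a task and contractor agents bid their estimated costs. It is our own toy example, not one of the frameworks or initiatives discussed in the article.

```python
import random

# Toy contract-net round: a manager announces a task, contractor agents bid an
# estimated cost, and the task is awarded to the lowest bidder. The agent
# names, skill parameters, and noise model are illustrative assumptions.

class ContractorAgent:
    def __init__(self, name: str, skill: float):
        self.name = name
        self.skill = skill                       # lower skill -> higher cost

    def bid(self, task_size: float) -> float:
        """Estimate a cost; a little noise models local uncertainty."""
        return task_size / self.skill * random.uniform(0.9, 1.1)

def negotiate(task_size: float, contractors: list) -> tuple:
    """Manager role: collect bids and award the task to the cheapest bidder."""
    bids = {agent.name: agent.bid(task_size) for agent in contractors}
    winner = min(bids, key=bids.get)
    return winner, bids[winner]

agents = [ContractorAgent("A", 1.0), ContractorAgent("B", 1.5), ContractorAgent("C", 0.8)]
winner, cost = negotiate(task_size=10.0, contractors=agents)
print(f"task awarded to {winner} at estimated cost {cost:.2f}")
```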
Peter Drucker observed that "Today wealth is no longer defined as land, money and possessions. It is knowledge and knowing how to use it." Knowledge is fundamentally based upon information and the ability to move it from one place to another. Among the prime movers of information in the telecommunication era are satellite networks. In particular, emerging satellite technologies are enabling people to compute and communicate from mobile platforms, and the new paradigm of nomadic computing is likely to be the next significant milestone in the ongoing IT revolution. We envisage that satellite- and cellular-enabled mobile computing will become common modes of computing and communication in the next ten years, from both individual and organizational points of view. The growth of these two technologies in terms of users and traffic has been phenomenal. For example, the cellular industry is experiencing explosive growth, with over 40 million subscribers in North America alone by 1997, and global mobile data usage has been estimated to reach 5.2 billion subscribers by 2000. Several technologies geared in this direction are at various stages of development and, together, are likely to influence the IT world in the early part of this century. In this section, we focus on satellite technologies, and on Motorola's Celestri system in particular.
The escalating need for wideband communications has led to increasing demand for a global information infrastructure serving major corporations, small businesses, individual residences, and governments. While fiber optics promises to be effective for point-to-point applications, estimates suggest that 25 years and $1 trillion would be needed to connect the world with fiber. The Celestri fixed-service satellite system is intended to provide wideband global point-to-point and multipoint service, becoming operational in 2002. It will use both geosynchronous earth orbit (GEO) and low earth orbit (LEO) satellites to provide the lowest-cost optimized wideband service for direct-to-home, small business, and corporate terminals. The suitability of GEO or LEO satellites for wideband communication depends largely on the application. For example, delays over GEO links are too long for interactive uses such as voice telephony or video conferencing, and the greater distance to GEO satellites also requires more power and increased antenna complexity; users desiring small terminals for interactive applications may therefore be better served by LEO communications (a rough delay comparison follows this paragraph). Broadcast, on the other hand, is usually best delivered from GEO satellites because their positions with respect to the earth are fixed, so handoff and tracking are not needed. Addressing these needs for both GEO and LEO satellites, the Celestri system comprises three parts: Celestri Multimedia LEO, Celestri GEO, and Celestri Trunking LEO. Together, the GEOs and LEOs point to a growing area of research and development activity in the next decade. More specifically, the research frontiers in mobile computing include:
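To make the latency argument above concrete, here is a rough back-of-the-envelope comparison of propagation delays. The GEO altitude is the standard 35,786 km; the LEO altitude of roughly 1,400 km is an assumed Celestri-class value used only for illustration.

```python
SPEED_OF_LIGHT_KM_S = 299_792

# Rough back-of-the-envelope latency comparison behind the GEO-vs-LEO argument.
# The LEO altitude is an assumption for illustration; a straight overhead path
# and pure propagation delay (no processing or queuing) are also assumed.

def one_way_delay_ms(altitude_km: float) -> float:
    """Ground -> satellite -> ground propagation time for one hop."""
    return 2 * altitude_km / SPEED_OF_LIGHT_KM_S * 1000

for name, altitude in [("GEO", 35_786), ("LEO", 1_400)]:
    hop = one_way_delay_ms(altitude)
    print(f"{name}: ~{hop:.0f} ms per hop, ~{2 * hop:.0f} ms round trip")

# GEO: roughly 240 ms per hop (about 480 ms round trip), which is noticeable in
# telephony and video conferencing; LEO: under ~10 ms per hop, comfortable for
# interactive use at the cost of handoff and tracking.
```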
While the preceding frontiers address specific facets of the rapidly advancing IS/IT field, a relatively silent revolution has been developing in the industrial sector: the virtual corporation. A classic example of such a virtual enterprise is Verifone, which has remarkably combined the available tools of information technology with an aggressive strategy to realize a virtual organization of extraordinary success. The following analysis centers on Verifone's experiments with virtual business operations and integrates our previous discussions of the emerging frontiers within the broad perspective of the virtual corporations of today and tomorrow.
Verifone is a worldwide transaction automation company founded in the early 1980s. When Verifone started in 1981 with a negligible investment, it faced competition from industrial giants such as IBM, AT&T, GTE, and Matsushita. As a first step, Verifone decided that it had to do something very different in order to survive, and it turned to information technology to attain a vital competitive edge. Its competitors were all centrally located and served their customers centrally: when powerhouses like Matsushita or IBM came up with a solution to a customer's needs, they had to go back to their headquarters for all aspects of product development and delivery, causing significant delays in responding to the customer. Realizing this, Verifone decided to organize into small yet complete functional groups, locate those groups as geographically close to the main customers as possible, and then use information technology to tie the company together.
The decision to decentralize the corporation is based on the philosophy that technologies should not just be technologies, but a glue that brings groups together, assembles virtual teams around the world, allows them to solve particular problems for customers, and then disbands them. In this process, the technologies should be supported with people, and together they should find ways to keep remote employees motivated, counter the fear of "not being in the action zone," and provide an integration of Technologies (T), Organizational structure (O), Mindset (M), and Environmental focus (E). Verifone captures this philosophy in the formula Success = T+O+M+E. The TOME formula has been expanded into a seven-point companywide philosophy, summarized here:
The philosophy embedded in OBM is that successful corporate leadership means being business leaders first and technologists next. The real role of leadership is not only to use technology to link people in the physical sense but, more importantly, to help build the company and keep it together. This is particularly true for Verifone, because its globally dispersed, independently operating employees are much less cohesively tied to the corporate entity than those of most other companies. Hence, technology must not only link them but also bring them together culturally. This implies: pushing computing power to the periphery (everyone is online and empowered); leveraging professionals with technology (no secretaries); running the entire system with electronic file cabinets (no paper memos); making any document available anywhere, anytime (any employee can work anywhere, anytime); tracking all employees electronically (a 24 hours/day integrated global corporate community); and integrating data, voice, graphics, audio, and video communication via appropriate technologies (the lowest-cost, highest-value options).
We have focused on five important frontier areas leading to the concept of virtual organizations. The research and innovations that emerge in these and other frontiers will ultimately enable Marshall McLuhan's vision of the global village. While several issues arise in these contexts, we conclude by focusing on one key question: How can industry and academia work together to advance the emerging vistas of IS/IT? We address this question from two perspectives: that of the governmental funding agencies and that of the industrial R&D sector.
Funding agencies' perspective. This is a perpetual question continually encountered at the National Science Foundation and the Army Research Labs. Clearly, there are R&D initiatives that are valuable to industry but that industry cannot sponsor for lack of resources. Research initiatives involving substantial teamwork, cross-functional effort, and long-term involvement are examples of such ventures. These initiatives are ideally suited to the academic arena and are best handled at universities. A classic example is the five-year upper atmosphere research collaboratory project at the University of Michigan, which brings together a large group of atmospheric scientists, computer scientists, and behavioral scientists to study space and weather, leading to an integration of highly heterogeneous scientific contributions. Such multidisciplinary ventures leading to state-of-the-art developments are clearly possible in academia and are logically better suited to such organizations, with the support of the funding agencies. Obviously, there are significant synergies from industry/academia collaborations, as well as a symbiotic coexistence that can be derived from independent and parallel ventures in the two communities. The role of the government and other funding agencies is to ensure that the developments on the two fronts are balanced as well as mutually supportive.
Industrial R&D sector's perspective. The main advantage industry has over academia is that it encounters real day-to-day problems and must design practical, cost-effective solutions. Academia, by comparison, is largely dependent on funding agencies and industry to bring its innovative ideas to fruition, in spite of its tremendous intellectual capital. While resource limitations are equally present in academia and industry, the driving force of real needs enables industries to find the resources required to solve their problems. Since such a driving force is largely absent in academia, a major part of academic research tends to have a long-term orientation, while its industrial counterpart must deal with near- and medium-term problems. Consequently, academic ventures tend to be abstract and theoretical, and industry may not find sufficient motivation to pursue them given the distance from full, practical realization. The perceived gap between developments in the academic and industrial worlds is largely due to this difference in focus and approach. Nevertheless, partnerships between academia and industry are flourishing in certain fields, especially in semiconductor research, where universities and other academic institutions play a remarkably complementary role in the leading developments. Although companies in this area are as pressed about their balance sheets and next-quarter earnings as those in other industries, they also need to assemble diverse talents to solve their short- and medium-term problems. Given industry's financial resources and academia's intellectual resources, an ideal complementary system of industry/academia consortiums has evolved in this field, producing some of the leading-edge technological developments. In particular, university labs, government labs, and industrial testbeds can collaborate and play key research roles. Greater interaction and collaboration among these various players is very much needed; failing this, the gap between the directions of these communities is only likely to widen.
1. Chandra, J., Gasser, L., March, S., Mukherjee, S., Pape, W., Ramesh, R., Rao, H.R., and Waddoups, R. Information Systems Frontiers: Emerging Vistas, Parts I and II. Working Papers No. 950 and 951, School of Management, State University of New York at Buffalo, 1998.
2. Claasen, T. Information Systems Frontiers: Emerging Vistas. Philips Laboratories Technical Report. Philips Semiconductors, Eindhoven, The Netherlands, 1997.
3. Gamma, E., Helm, R., Johnson, R., and Vlissides, J. Design Patterns: Elements of Reusable Object-Oriented Software. Addison Wesley, Reading, Mass., 1995.
4. Gasser, L. Knowledge networks. Panel presentation on Information Systems Frontiers, International Conference on Information Systems (ICIS) (Atlanta, GA, Dec. 14-17, 1997).
5. Guttman, M. and Matthews, J. The Object Technology Revolution. Wiley, NY, 1995.
This article is based on a panel discussion conducted at the 18th International Conference on Information Systems (ICIS), held in Atlanta, GA in late 1997; the Robert F. Berner Management Science and Systems Excellence Fund, SUNY at Buffalo, supported organizing this panel.