We like to think we have been surfing a tsunami of computing innovation over the past 70 years: mainframe computers, microprocessors, personal computers, the Internet, the World Wide Web, search, cloud computing, social media, smartphones, tablets, big data, and the like. The list goes on and on, and the future for continuing innovation is quite bright, according to the conventional wisdom.
Recently, however, several people have been questioning this techno-optimism. In a commencement address at Bard College at Simon's Rock, U.S. Federal Reserve chair Ben Bernanke compared life today to his life as a young boy in 1963, and to his grandparents' lives in 1913. He argued that the period from 1913 to 1963 saw a dramatic improvement in the quality of daily life, driven by the automobile, electrification, sanitation, air travel, and mass communication. In contrast, life today does not seem that different from life in 1963, other than the fact that we talk to each other less and communicate more via email, text, and social postings.
In fact, the techno-pessimists argue that the economic malaise we seem unable to pull ourselves out of (the sluggish economic growth in the U.S., the rolling debt crisis in Europe, and the slowdown of the BRICS) is not just a result of the financial crisis but also an indication of an innovation deficit. Tyler Cowen has written about the "great stagnation," arguing we have reached a historical technological plateau and that the factors that drove economic growth since the start of the Industrial Revolution are mostly spent. Robert Gordon contrasted the 2.33% annual productivity growth during 1891-1972 with the 1.55% growth rate during 1972-2012. Garry Kasparov argued that most of the science underlying modern computing was already settled in the 1970s.
The techno-optimists dismiss this pessimism. Andrew McAfee argued that new technologies take decades to achieve deep impact. The technologies of the Second Industrial Revolution (1875-1900) took almost a century to fully spread through the economies of the developed world, and have yet to become ubiquitous in the developing world. In fact, many predict we are on the cusp of the "Third Industrial Revolution." The Economist published a special report last year that described how digitization of manufacturing will transform the way goods are made and change the job market in a profound way.
So which way is it? Have we reached a plateau of innovation, dooming us to several decades of sluggish growth, or are we on the cusp of a new industrial revolution, with the promise of dramatic changes, analogous to those that took place in the first half of the 20th century?
From my perch as the editor-in-chief of Communications of the ACM, I find it practically impossible to be a pessimist (which is my natural inclination). The stream of research and news articles we publish each month continues to be fresh and exciting. No stagnation here!
Last year ACM celebrated Alan Turing's centenary by assembling a historic gathering of almost all of the living ACM A.M. Turing Award Laureates for a two-day event in San Francisco. More than 1,000 participants attended the meeting, and the buzz was incredible. Participants I talked to told me it was one of the most moving scientific meetings they had ever attended. We were celebrating not only Turing's centenary, but also 75 years of computing technology that has changed the world, as well as the people who pioneered that technology. Like McAfee, I find it difficult to imagine this technology not continuing to broaden and deepen its impact on our lives.
Earlier this year the McKinsey Global Institute issued a report, "Disruptive Technologies: Advances that will transform life, business, and the global economy," which asserts that many emerging technologies "truly do have the potential to disrupt the status quo, alter the way people live and work, and rearrange value pools." By 2025, the report predicts, the automation of knowledge work could have an economic impact of $5 trillion to $7 trillion, and the automation of driving could prevent 1.5 million driver-caused deaths from car accidents. The 12 potentially economically disruptive technologies listed in the report are: mobile Internet; knowledge-work automation; the Internet of Things; cloud technology; advanced robotics; autonomous and near-autonomous vehicles; next-generation genomics; energy storage; 3D printing; advanced materials; advanced oil and gas exploration and recovery; and renewable energy.
"Predictions are difficult," goes the saying, "especially about the future." It will be about 25 years before we know who is right, the techno-pessimists or the techno-optimists. For now, however, count me an optimist, for a change!
Moshe Y. Vardi, EDITOR-IN-CHIEF
I wonder how much innovation, and subsequent profit from innovation, has been hampered or restricted by the rent-seeking actions of existing entities that use government to squash competition and the advancement of technology. This is not to say such suppression did not occur in the past, but it certainly seems we face more forces acting against change now than ever before.
I'd say the barrier to entry in these fields is much higher today. The five areas mentioned are all areas in which relative newcomers could make significant contributions a century ago, but they're also areas I would be loath to try innovating in today. There's just way too much competition, regulation, and so forth.
The other big difference is that it was not obvious to the layperson 100 years ago that these inventions would change life in the way they did. Seeing an airplane in 1913, you would not think of a transcontinental trip on a Boeing 777. Seeing an analog computer in 1913, you would not imagine an iPad.
Surely, the great inventions of 2013 are in fields that look like tiny niches today (where experience is not required, as nobody has any) and which won't be truly appreciated for 50 or 100 more years.
The following letter was published in the Letters to the Editor of the December 2013 CACM (http://cacm.acm.org/magazines/2013/12/169937).
-- CACM Administrator
Moshe Y. Vardi's editorial "Has the Innovation Cup Run Dry?" (Sept. 2013) offered two divergent conceptions of innovation: In one, drawn from a 2013 Bard College commencement address by outgoing U.S. Federal Reserve chair Ben Bernanke (http://www.federalreserve.gov/newsevents/speech/bernanke20130518a.htm), innovations (1913-1963) were described as having produced "...dramatic improvement in the quality of daily life..." In the other, a report from the McKinsey Global Institute (http://www.mckinsey.com/insights/business_technology/disruptive_technologies) predicted (for 2013-2025) the emergence of "...technologies [having] the potential to disrupt the status quo, alter the way people live and work, and rearrange value pools."
A good way to view the divergence is to focus on what is changing and in what direction. In the first conception, what changes is the quality of daily life, and the direction is dramatic improvement. In the second, what changes is how people live and work, and the direction of that change (disrupt, alter, and rearrange) is far from unambiguously positive and is likely to be viewed negatively by many.
I was thus prompted to explore ideas introduced by the influential Austrian-American economist Joseph Schumpeter (1883-1950), who used the term "creative destruction" in describing his theory of innovation-driven economic growth. In it, innovation operates within the system of production to cause old inventories, ideas, technologies, skills, and equipment to become obsolete. Replacement benefits consumers who experience improvement in daily life through increased capability, variety, and affordability of products and services. Part of the cost of this improvement is disruption in the world of work, as experienced by workers, managers, business owners, and investors, some benefiting and some forced to endure loss and painful adaptation. Schumpeter's theory thereby contributes to resolving the divergence of conceptions by showing how the effects of innovation once fell unevenly on separate aspects of life outside work and within work.
Concerning those whose work contributes to innovation through technology, this analysis highlights challenges and their related questions, including:
Lost distinction. Why and how did the world of technology lose interest in the important distinction between effect on work and effect on daily life?;
Pride in disruption. Why and how did the world of technology begin to feel pride in disruption as such, usually expressed without comment, along with or in place of pride in improvement of daily life?;
Degraded communication. Have the loss of distinction and pride in disruption contributed to degraded communication between the world of technology and the world at large?;
Diminished effectiveness. Have degraded understanding and communication subtracted from the ability of the world of technology to satisfy human needs in the world at large?; and
Friction. Have degraded understanding and communication contributed to friction between the world of technology and the world at large?
Although I have not conducted an extensive review, I can say the world of technology gives these challenges and questions insufficient attention.
Robert E. Levine
Sierra Vista, AZ