
Communications of the ACM

Viewpoint

Could Artificial Intelligence Create an Unemployment Crisis?



There is an often-told story about the libertarian economist Milton Friedman. While visiting a large-scale public works project in a developing Asian nation, Friedman asked a government official why he did not see much heavy earth-moving equipment in use; instead, there were large numbers of workers with shovels. The official explained that the project was intended as a jobs program. Friedman replied with his famous and caustic question: "So why not give the workers spoons instead of shovels?"

That story is a pretty good indication of the almost reflexive derision that is likely to arise in response to any serious speculation about the possibility that advancing technology could destroy jobs and cause long-term structural unemployment. Nonetheless, I think there are good reasons to be concerned that advances in artificial intelligence and robotics are rapidly pushing us toward an inflection point where the historical correlation between technological progress and broad-based prosperity is likely to break down—unless our economic system is adapted to the new reality.

Why should the implications of today's accelerating information technology be different from the innovations of the past? I believe the answer lies in the nature of the transition that will be required for the majority of the workforce to adapt and remain relevant.

Most of the work required by the economy is—on some level—fundamentally routine in nature. By this, I do not mean the work is rote repetition, but rather that it can be broken down into a series of discrete tasks that are relatively predictable and tend to get repeated over some time frame. The percentage of people who are paid primarily to engage in truly creative or non-routine occupations is fairly small. This has always been the case, and the routine nature of most jobs has historically been a good match with the capabilities of the average worker.

Technology has, of course, often disrupted and even destroyed whole industries and employment sectors. In the U.S., the mechanization of agriculture vaporized millions of jobs and led workers to eventually move from farms to factories. Later, manufacturing automation and globalization caused the transition to a service economy. Workers repeatedly adapted by acquiring new skills and migrating to jobs in new industries—but these changes have not altered the fact that most jobs continue to be essentially routine.

In the past, disruptive innovations have tended to be relatively specialized and to have an impact on a sector-by-sector basis. Workers have responded by moving from routine jobs in one area to routine jobs in another. Today's information technology, in contrast, has far more broad-based implications: it is transforming and disrupting every sector of the economy. For the first time in history, computers and machines are increasingly taking on intellectual tasks that were once the exclusive province of the human brain. Information technology will continue to accelerate, and it is certain to be tightly integrated into any new industries that arise in the future.

The impact of information technology on the job market, and in particular on more routine jobs, has been well documented.2,3 Economist David Autor of MIT, in particular, has done extensive analysis showing that the job market in the U.S. has become polarized.1 A substantial fraction of moderate wage, routine jobs in areas like manufacturing and white-collar clerical occupations have been eliminated by technology, leaving the remaining employment opportunities clustered at the top (high-wage/high-education jobs) and at the bottom (low-wage jobs requiring little education).

While economists have noted the correlation between whether or not a job is routine and its susceptibility to automation, I do not think they have yet fully acknowledged the future impact that accelerating progress is likely to have. Our definition of what constitutes a "routine" job is by no means static. At one time, the jobs at risk from automation were largely confined to the assembly line. The triumph of IBM's Watson computer on the television game show "Jeopardy!" is a good illustration of how fast the frontier is moving. I suspect very few people would characterize playing "Jeopardy!" at a championship level as routine or repetitive work, and yet a machine was able to prevail.

Machine learning, one of the primary techniques used in the development of IBM's Watson, is in essence a way to use statistical analysis of historical data to transform seemingly non-routine tasks into routine operations that can be computerized. As progress continues, it seems certain that more and more jobs and tasks will move from the "non-routine" column to the "routine" column, and as a result, an ever-increasing share of work will become susceptible to automation.
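
As a concrete sketch of this idea, consider the short Python program below, which uses the scikit-learn library. It fits a statistical model to a handful of invented historical records of customer support tickets that a human once routed case by case, and then applies the learned model to new tickets as a routine, repeatable operation. The data, labels, and task are purely illustrative.

    # A minimal, illustrative sketch: statistical learning over historical
    # data turns a seemingly judgment-based task (routing support tickets)
    # into a routine, automatable operation. All data here is invented.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Hypothetical historical records: past tickets and the department
    # a human worker routed each one to.
    past_tickets = [
        "my invoice shows a duplicate charge",
        "I was billed twice this month",
        "refund has not appeared on my statement",
        "the application crashes when I open a file",
        "error message appears after the latest update",
        "cannot log in since the new release",
    ]
    departments = ["billing", "billing", "billing",
                   "technical", "technical", "technical"]

    # Fit a simple statistical model to the historical data.
    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(past_tickets)
    model = LogisticRegression()
    model.fit(X, departments)

    # New, unseen cases are now handled as a routine operation.
    new_tickets = ["I was charged twice for one order",
                   "the program fails to start after updating"]
    print(model.predict(vectorizer.transform(new_tickets)))
    # Expected output with this toy data: ['billing' 'technical']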


This goes to the heart of why the historical record may not be predictive with regard to technological unemployment. In order to remain essential to the production process, workers will have to make a historically unprecedented transition. Rather than simply acquiring new skills and moving to another routine job, workers will instead have to migrate to an occupation that is genuinely non-routine and therefore protected from automation—and they may have to do this rapidly and repeatedly in order to remain ahead of the advancing frontier.

There are good reasons to be pessimistic about the ability of most of our workforce to accomplish this. If we assume, as seems reasonable, a normal distribution of capability among workers, then 50% of the workforce is by definition average or below average. For many of these people, a transition to creative/non-routine occupations may be especially challenging, even if we assume that an adequate number of such jobs will be available.

Both the high and low ends of our polarized job market are likely to come under attack as technology advances. Higher-wage white-collar jobs will be increasingly susceptible to software automation and machine learning. One of the biggest drivers of progress in this area is likely to be the "big data" phenomenon and the accompanying emphasis on algorithmic techniques that can leverage the enormous quantities of data being collected.

Much of the initial focus has been on how big data can be used to give organizations a competitive advantage in terms of marketing and customer relationships. However, corporations are certainly also collecting huge amounts of internal information about the work being done by employees and about their interactions with customers—potentially creating a rich dataset that future machine learning algorithms might churn through.

The impact is already being felt in a number of professions. Lawyers and paralegals have been displaced by e-discovery software that can rapidly determine which electronic documents are relevant to court cases. More routine forms of journalism—such as basic sports and business writing—have been successfully automated. Entry-level positions are especially vulnerable, and this may have something to do with the fact that wages for new college graduates have actually been declining over the past decade, while up to 50% of new graduates are forced to take jobs that do not require a college degree.5
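
The automated sports and business stories mentioned above are typically generated by mapping structured data onto natural-language templates. The toy Python sketch below, with an invented box score and invented phrasing, shows the basic mechanism; commercial systems are far more sophisticated, but the principle is the same.

    # A toy illustration of template-driven sports writing: structured
    # game data in, a short recap out. The game data and phrasing are
    # invented; real systems use far richer data and language resources.
    game = {
        "winner": "Springfield Hawks", "loser": "Shelbyville Comets",
        "winner_score": 4, "loser_score": 2,
        "star_player": "J. Alvarez", "star_stat": "two home runs",
    }

    def recap(g):
        # Vary the verb based on the margin of victory.
        margin = g["winner_score"] - g["loser_score"]
        verb = "edged" if margin == 1 else "defeated"
        return (f"The {g['winner']} {verb} the {g['loser']} "
                f"{g['winner_score']}-{g['loser_score']} on Sunday. "
                f"{g['star_player']} led the way with {g['star_stat']}.")

    print(recap(game))
    # The Springfield Hawks defeated the Shelbyville Comets 4-2 on
    # Sunday. J. Alvarez led the way with two home runs.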

The polarized nature of the job market means workers who fail to find and retain one of the high-end jobs face a long fall. The lower-end jobs are heavily weighted toward hourly service positions with minimal wages and few benefits. These often part-time jobs in areas like retail, fast food, and full-service restaurants have traditionally offered a kind of income safety net for workers with few other options.

Yet there are good reasons to expect that even these lower-range jobs may soon come under significant pressure from technology. For example, it is easy to envision increased automation taking hold in the fast food and beverage industry. From a technical standpoint, fast food is not really a service industry at all: it is, rather, a form of just-in-time manufacturing, and there is no good reason to believe it will be forever exempt from the advances that are transforming other manufacturing sectors.

Retail jobs are also likely to be impacted. Self-service checkout lanes are becoming increasingly prevalent and popular. Mobile applications offer in-store access to product information and customer service. Wal-Mart is currently testing a service that allows customers to scan barcodes and then pay for their purchases with their mobile phones—completely avoiding lines and cashiers.

Brick-and-mortar retailers will also continue to be disrupted by online competitors like Amazon, especially as Internet retailers offer faster delivery options and as customers increasingly use mobile technology to look for lower prices online. In theory, this should not destroy jobs but simply transition them from traditional retail settings to warehouses and distribution centers. However, once jobs move to a warehouse environment, they seem likely to be more susceptible to automation. Amazon's purchase of Kiva Systems—a company that focuses on warehouse robotics—is probably indicative of the trend in this area.

Many low-wage jobs have been protected from automation primarily because human beings are extremely good at tasks requiring mobility, dexterity, and hand-eye coordination, but these advantages are certain to diminish over time. Robots are rapidly advancing while becoming less expensive, safer, and more flexible, and it is reasonable to expect they will have a potentially dramatic impact on low-wage service sector employment at some point in the not too distant future.

It is important to realize technology does not have to cause immediate job destruction in order to create significant future unemployment. The U.S. economy needs to generate in excess of 100,000 new jobs per month just to keep up with population growth. As a result, anything that significantly slows the rate of ongoing job creation could have a significant impact over the long term. Because workers are also consumers, entrenched technological unemployment would be very likely to depress consumer spending and confidence—thereby spawning a wave of secondary job losses that would affect even occupations not directly susceptible to automation.4
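
A back-of-the-envelope calculation makes the point concrete. In the Python sketch below, the 100,000-jobs-per-month threshold is taken from the paragraph above, while the slowed rate of job creation is a purely hypothetical figure chosen for illustration.

    # Back-of-the-envelope: even with no net job destruction, job
    # creation that falls short of population growth leaves a gap.
    # The threshold is from the text; the creation rate is hypothetical.
    NEEDED_PER_MONTH = 100_000   # jobs needed monthly to absorb population growth
    created_per_month = 60_000   # hypothetical slowed rate of job creation

    shortfall = 0
    for month in range(1, 121):  # ten years
        shortfall += NEEDED_PER_MONTH - created_per_month
        if month % 24 == 0:
            print(f"After {month // 12} years: {shortfall:,} missing jobs")
    # Under these assumptions, a modest monthly shortfall compounds into
    # millions of workers for whom no job was ever created.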

I suspect the impact of accelerating technology on the job market may ultimately represent a dramatic and vastly under-acknowledged challenge for both our economy and society. Many extremely difficult issues would arise, including finding ways for people to occupy their time and remain productive in a world where work was becoming less available and less essential. The biggest immediate challenge, however, would be one of income distribution: how will people without jobs and incomes support themselves, and how will they be able to participate in the market and help drive the broad-based consumer demand that is vital to sustained economic prosperity and innovation?

Finally, it is worth noting that everything I have suggested here might be thought of as the "weak case" for technological disruption of the job market. I have presumed only that narrow, specialized forms of machine intelligence will increasingly eliminate more routine jobs. None of these technologies would be generally intelligent or could pass a Turing test. Yet the more speculative possibility of strong AI cannot be completely discounted. If, someday, machines can match or even exceed the ability of a human being to think and to conceive new ideas—while at the same time enjoying all the advantages of a computer in areas like computational speed and data access—then it becomes somewhat difficult to imagine just what jobs might be left for even the most capable human workers.


References

1. Autor, D.H., Katz, L.F., and Kearney, M.S. The polarization of the U.S. labor market. American Economic Review 96, 2 (May 2006), 189–194.

2. Autor, D.H., Levy, F., and Murnane, R.J. The skill content of recent technological change: An empirical investigation. Quarterly Journal of Economics 118, 4 (Nov. 2003), 1279–1333.

3. Brynjolfsson, E. and McAfee, A. Race Against the Machine: How the Digital Revolution is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy. Digital Frontier Press, 2011.

4. Ford, M. The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future. Acculant Publishing, 2009.

5. Hagerty, J.R. Young adults see their pay decline. The Wall Street Journal (Mar. 9, 2012); http://online.wsj.com/article/SB10001424052970204276304577265510046126438.html.


Author

Martin Ford ([email protected]) is a software developer, entrepreneur, and author of the book The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, which focuses on the economic impact of artificial intelligence and robotics. He has a blog at http://econfuture.wordpress.com.


Copyright held by author.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2013 ACM, Inc.



Comments


CACM Administrator

The following letter was published in the Letters to the Editor (pp. 8-9) in the October 2013 CACM (http://cacm.acm.org/magazines/2013/10/168182).
--CACM Administrator

In his Viewpoint "Could Artificial Intelligence Create an Unemployment Crisis?" (July 2013), Martin Ford repeatedly assumed "Information technology will continue to accelerate..." But nothing accelerates indefinitely, and many technologies, including transportation and space flight, have not seen accelerated progress in decades.1 It is quite possible that progress in information technology will likewise reach a plateau of incremental improvement. Ford's apocalyptic vision may come about but should not be based on assumptions for which there is no evidence.

Moti Ben-Ari
Rehovot, Israel

---------------------------------------------

AUTHOR'S RESPONSE

As Ben-Ari says, acceleration does eventually slow, and it is reasonable to assume IT will likewise experience such deceleration. However, that is not likely to occur soon. Even if advances in hardware (per Moore's Law) were to plateau, progress could continue to accelerate along other fronts (such as software performance, parallel computing, and new architectural breakthroughs). And even if the doubling period for IT performance were to lengthen significantly, it would still imply rapid progress in light of the performance levels already achieved.

Martin Ford
Sunnyvale, CA


