
Communications of the ACM

Viewpoint

Does the Internet Make Us Stupid?


Illustration credit: Andrij Borys Associates / Shutterstock

According to Farhad Manjoo in Slate magazine, almost nobody finishes reading articles online.5 Jakob Nielsen has shown that superficial reading carries over from screens to printed material.a Bauerlein,1 Brabazon,2 Carr,4 and others have argued convincingly that the Internet and other information and communication technologies (ICTs) are changing our reading habits toward "skimming" rather than careful reading. Putting this together, I am concerned you will not read this Viewpoint carefully to the end. Hence, I had better present my conclusions right away: ICTs are indeed reducing many of our cherished cognitive faculties, much as our physical fitness has been reduced by all kinds of machinery for physical work and locomotion. However, in my opinion, this is not too bad, as long as our reduced faculties are more than compensated for by appropriate technology, and provided we make sure of two things: that we are not completely lost in case of large-scale breakdowns of technology, and that the use of ICTs does not endanger our creativity. Both provisos are starting to receive attention: the first will hopefully be solved by introducing systems with sufficient redundancy; the second attracts varying opinions: some, like Carr, see mainly dangers; others, like Thompson,7 see our future in a growing man-machine symbiosis. My own opinion is that creativity is not endangered if the new technologies are used with some caution.

If you stop reading here, you have read the important part of the message. If you continue reading, I hope I can drive home the message with emotional emphasis.

Over the last six years or so, numerous papers and books have claimed the Internet and related technologies are reducing our cognitive abilities. Here are some better-known examples and quotes.

Brabazon writes: "Looking at schools and universities, it is difficult to pinpoint when education, teaching and learning started to haemorrhage purpose, aspiration and function. As the Internet offers a glut of information, bored surfers fill their cursors and minds with irrelevancies, losing the capacity to sift, discard and judge."2 Brabazon is particularly worried by evidence she collected herself that reading with understanding and creative writing are markedly reduced in students who use ICTs intensively, and that concentrated thinking and attention spans appear much diminished. This is echoed by many later publications and books, including the ones discussed here.

The title of Bauerlein's book The Dumbest Generation1 and its subtitle How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (Or, Don't Trust Anyone Under 30) clearly indicate he is thinking along the same lines. Bauerlein was probably also the first to diagnose that ICTs are widening the generation gap, since young people, in their effort to be "in," learn increasingly from peers rather than from adults.

Carr states very strongly that new technologies, including the Net, fragment content and disrupt our concentration.4 He emphasizes one important aspect that has been recognized by neuroscience and brain science for some time: use of new media like the Net creates new habits and changes the brain; its plasticity can work against us by reinforcing certain behavior. Carr gives many examples of how the way people read and write has already been changed by a Net that "... contributes to the ecosystem of interruption technologies," even causing newspapers to change their style by turning to shorter stories and the like. He quotes the neuroscientist Michael Merzenich, professor emeritus at the University of California, San Francisco, as saying he was profoundly worried about the cognitive consequences of the constant distractions and interruptions with which the Net bombards us. The long-term effect on the quality of our intellectual lives could be "deadly."


Creativity is not endangered if the new technologies are used with some caution.


Weber's book The Google-Copy-Paste Syndrome8 emphasizes another aspect, with statements such as: "The real danger is not that plagiarism is used fraudulently for personal gain, but that the copy/paste mentality destroys thinking" (translated from German by the author).

The title of Spitzer's book Digital Dementia: How We Drive Ourselves and Our Children Mad6 (translated from German by the author) says it all, even if you do not read to the point where he claims: "Computers for teaching subjects like history are as important as bicycles for teaching how to swim" (translated from German by the author).


Why So Negative?

The previously mentioned authors share a number of concerns: We are so inundated by information that our attention span has become very short. We are heading (my terminology) toward a global attention deficit syndrome: we cannot be inactive anymore, yet we are losing the power to concentrate. If we are not drowning in email, text messages, or tweets telling us to look at some site or YouTube clip, browsing our friends' updates on a social network, listening to mp3 music, zapping through 100 TV shows, answering the phone, or calling someone, we somehow must find another way to occupy ourselves. Many people waste so much time keeping their "friends" in the network happy that they have little time left for productive work. They have externalized much of their knowledge into the cloud and their smartphones. They do not need to remember many facts, threatening the functioning of their memory. Students do not write essays these days; rather, using Google search, copy, and paste, they glue pieces of information together, hardly understanding what they are producing. They are no longer able to read complicated texts, spoiled by the bite-sized pieces of information contained in tweets or text messages, often written in a new shorthand: "sms r good 4u." A blind trust that e-learning can replace good teachers often leads to less-educated children; apps are substituting for thinking, and cognitive capacity is shrinking.

These are some of the concerns often mentioned. If they were mere speculations, we could brush them off. However, almost all of them are based on solid quantitative research and experiments. Hence, we must take them seriously. The technologies involved do indeed make us, or upcoming generations, more stupid, measured purely by the cognitive strength of our brains.


An Unpleasant Surprise

All of us working in computer science have realized that ICT gives us excellent access to information, research papers, and communication with colleagues, ensuring we do not end up in cul-de-sacs and are exposed to new ideas rapidly. ICT is above all a powerful and positive force in many areas, such as medicine, transportation, and production, to name just a few. It has also made many mundane tasks easier, such as booking a hotel, an event, or a trip. But it has also produced serious problems: of privacy, of indirect control over us by others, of increased violence through violent games (as Bushman states convincingly3). And it is possibly creating new kinds of warfare, and yes, in a few respects it has made us lazier.

Or maybe it has just relieved us of some tasks to create room for new challenges? Some superficial arguments concerning cognitive tasks might be: Why should I do complicated calculations when my smartphone has a built-in calculator? Why worry about counting change if I pay by credit card anyway? Why bother about spelling mistakes when my spell-checker makes fewer mistakes than my teachers did? Why remember phone numbers when my smartphone has speech activation? I do not worry if I cannot find my phone at home: I call it from my wife's phone, and its ringing makes sure I can locate it. Too bad my shoes do not ring yet, but soon NFC devices will help me find them, or anything else of interest to me, for that matter. Handwriting? What the heck! I dictate most things these days, or else use a keyboard. With the language app on my phone, I can converse on a simple level in any language of the world. I forgot where I found the app, but I am sure you will be able to locate it. Even before I used the English-Japanese language app, it was easy for me to order in a Japanese restaurant: they all have plastic replicas of the meals in the window, so I just took a picture with my digital camera and showed it to the waiter. I find it convenient when hiking that I no longer have to memorize details of a map. What does it matter that my sense of orientation may have deteriorated? My smartphone can find any place in any city or on any hike with a few taps.

In addition, there are all the benefits from using ICT in important applications like medicine, transportation, production, and so forth, as previously described.

And now I am supposed to believe all those great achievements come at the price of increasing stupidity! Does this mean we technologists will soon have to ask ourselves the same question physicists had to ask themselves in connection with nuclear weapons: Do we contribute positively to mankind or do we threaten it, because we are reducing the capacity of humans for deep logical thinking?


Do We Need to Worry?

One can argue there is no need to worry if some of our cognitive faculties are reduced by technology, as long as the loss is more than compensated for by technology and as long as we can assure two crucial points: independent and creative thinking must not be threatened, and we must still be able to function in a reasonable "basic mode" if technologies fail. The difficulty is that we do not know at this point how much knowledge we can "outsource" into the Net and computers without reducing creativity. Also, our infrastructure is inadequately prepared for a massive breakdown of the Net or the electric grid. The first issue requires serious research in the neurosciences; the second, research and attention by engineers in a number of disciplines to provide enough redundancy. Unfortunately, providing redundancy will have its cost, and hence will meet resistance.


The difficulty is that we do not know at this point how much knowledge we can "outsource" into the Net and computers without reducing creativity.


We must stop looking at humans as naturally, biologically grown beings. Rather, we must understand ourselves as organic beings in symbiosis with technology. I myself am a good example. I am middle-ear deaf; that is, without very special hearing aids I would not hear a thing. I wear eyeglasses, or I would see everything blurred. My pacemaker keeps my heart beating properly. And the metal plate in my replacement hip is perfect; well, when I go through airport security, I sometimes have to show a medical statement about that piece of metal. If you were to take away all this technology, I would be physically impaired at best, but probably dead. As it is, I can hike, scuba dive, go to concerts, do research, and even make it into Communications once in 20 years.

In other words, we should not judge persons now and in the future without the technological tools they are using, whether those tools are built in (pacemaker) or external (hearing aid, smartphone, reading software for visually challenged persons, tablet PCs, Google Glass). We have long accepted this for physical capabilities: my grandfather was strong: he could carry 50kg for 20 kilometres in four hours! Well, I can do better: I can carry 250kg for 200 kilometres in two hours with my car. If I were to encounter an adversary, I would still prefer it to be an unarmed bodybuilder rather than someone skinny with a machine gun. What is happening now is that technology is also starting to replace some cognitive functions, reducing our very own capabilities, like our memory or sense of orientation. This raises an important concern: Does our increasing dependency on technology sabotage our ability to think for ourselves? Surely, looking up some facts is not the same as coherent logical thinking. Is the latter in danger? The answer is frustrating: none of us knows. As an optimist, I hope we can make good use of the possibilities offered to us by easy access to high-quality research, arguments, and discussion with colleagues, without losing our power of thinking.

Despite my positive attitude, I am aware of the two important aspects hinted at before. The first is that if we rely on technologies, we should also ensure we have a backup when those technologies fail. I do not think this issue has been taken seriously enough in the past: we should be more careful to have solutions if, for example, electricity, transportation systems, water supply, or other services fail for an extended period over a large area. With ICT's enormous influence on our lives and cognitive capabilities, this issue is becoming more pressing.


We must understand ourselves as organic beings in symbiosis with technology.


Secondly, repeated actions change our brain and hence how we think. As such, this is nothing new: physical work also changes how we act, since our muscles get stronger. Yet the danger that creativity is threatened because we "outsource" too much into the Net or delegate it to computer algorithms is real: we must not empty our brains, or it seems likely we will lose the capability to think clearly and bring important facts together; no links in the Net can do this for us. Knowing how to navigate the Net does not make up for synapses generated in our brain. We will be using algorithms that do some jobs for us (calculating, translating speech, finding a route, and so on). Yet it is clear that some basic information and the facility for serious logical thinking must not disappear. Whether logical thinking is best trained by learning mathematics, or by some other means like learning to play chess or bridge (which might be more fun), still has to be determined. Finding good answers requires more neuroscience and brain research: What capabilities do we need in our brains to remain creative? What do we lose, and what do we gain, if instead of retaining all the details of one book on a particular topic, we retain a few details of many books with different views on that topic? At this point in time nobody seems to have valid answers; hence this is an important and crucial research topic, though more for neuroscience than for computer science. When these important questions have been answered, we will know whether we need a personal trainer for our minds, as we need one for our bodies!


Conclusion

There is no doubt ICT has had a positive influence on many aspects of our lives. Looking specifically at thinking, the discovery of new results, and doing research, it is clearly positive that we can easily access new research results and communicate worldwide as a basis for our own imaginative thinking, possibly aided by the ability to deal with foreign languages or to gain better insights through new kinds of visualization. However, the Net (and other ICTs) will also "make us more stupid" as far as some of our cognitive capabilities are concerned. We should accept this if two important points are not forgotten: we must not become completely dependent on technology, and we must retain the capability for logical thinking and creativity. Only then can we judge the balance of what we gain against what we lose.

Until we understand the full impact of ICT on our brains and our thinking, caution is essential. Thus, a major challenge for further research and the study of behavioral patterns is to find out what we can outsource and what we had better retain in our own brains.


References

1. Bauerlein, M. The Dumbest Generation. Tarcher, 2008.

2. Brabazon, T. The University of Google. Ashgate, 2007.

3. Bushman, B.J. The Effects of Violent Video Games. Do They Affect Our Behavior? (2012); http://ithp.org/articles/violentvideogames.html.

4. Carr, N. The Shallows: What the Internet Is Doing to Our Brains. W.W. Norton, 2010.

5. Manjoo, F. You Won't Finish This Article. Slate Magazine (2013); http://slate.com/articles/technology/technology/2013/06/how_people_read_online_why_you_won_t_finish_this_article.single.html.

6. Spitzer, M. Digitale Demenz, Wie wir uns und unsere Kinder um den Verstand bringen. Droemer Verlag [in German], 2012.

7. Thompson, C. Smarter Than You Think: How Technology Is Changing Our Minds for the Better. Collins, 2013.

8. Weber, S. Das Google-Copy-Paste-Syndrom. dpunkt Verlag [in German], 2008.


Author

Hermann Maurer ([email protected]) is emeritus professor of Computer Science at Graz University of Technology, Austria, and a member of the Board of Academia Europaea.


Footnotes

a. See http://www.nngroup.com/articles/how-users-read-on-the-web/.

The author thanks Keith Andrews and the anonymous reviewers for their valuable input.


Copyright held by author.



Comments


Alex Potanin

The 2012 study on violent video games causing violence linked in the article starts with a rebuttal from 2014 showing that they actually don't. Better to check the links first before citing them, I guess - or could it be the inability to concentrate and doing things in a rush showing through? :-)


Atanas Radenski

An interesting and thought-provoking article, thank you. Perhaps, one can agree that "creativity is not endangered if the new technologies are used with some caution". The problem is (and I can be wrong, of course) that net technologies are driven to a large degree by market forces. Therefore, anything that impedes profits, such as 'applying some caution in the use of new technologies' can be difficult to put in practice.


Vincent Van Den Berghe

(please note: in the text below I'm using the fact that Mr. Maurer mentions having a hearing aid and a pacemaker to make a point, not to discriminate or disparage him in any way. Bear with me)

Yes indeed, technology has a positive influence on many aspects of our life. I disagree that it's OK if it makes us dumber, as long as we don't overly rely on it, and we retain our capacity for logical thinking and creativity.
This seems to be based on the premise that the ICTs we use are really giving us a choice about how we use them. This may not be true.

Here is something you don't know about hearing aids: they are really very smart connected Android devices.

First, a modern hearing aid will automatically filter offensive sounds. Expletives, curses, and other offensive stuff are carefully edited out of the sound stream. Illegal sounds (like spoken instructions on how to bypass its filtering system) are suppressed. What gets filtered out is determined by the manufacturer and its business associates, but you don't mind, since you don't know what's missing. You trust that their definition of offensive matches yours, so after all, what could possibly be the added value of something you don't want to hear anyway?

Second, based on your musical preferences and buying behavior deduced from your Google account (which you needed to provide when you bought the device), certain sounds will be selectively amplified and moved to the foreground at the expense of other sounds. You will hear better what you wanted to hear in the first place, which will increase your happiness. To increase it even more, certain sounds that are not really your cup of tea (but are promoted through advertising deals with the hearing aid manufacturer) will be embellished so that they resemble more what you want to hear. When the ad campaign ends, you'll probably ask yourself why you bought so much stuff you don't care about now. It sounded so good at the time!

It's possible to upgrade the hearing aid software for better sound processing. Hearing characteristics change with age, and recent software will compensate for that. So you will want to upgrade at some point.
The last upgrade (Android 7.0 Spector edition) may record and upload samples of what you hear, needs a connection to your pacemaker (if available), or will use the hearing aid's microphone to monitor your heartbeat. The correlation between what you hear and how fast your heart beats is automatically collected, and this information may be used to deduce which sounds turn you on or off. There may be a couple of interesting health-related applications, but it's essentially deducing a part of your emotional behavior through correlation of physical parameters, which is interesting in its own right. Imagine the potential for security-related scenarios (gunfire sound + elevated heartbeat = trouble), and dynamic ad displays that use one of Pharrell Williams' happy sounds to cheer you up when you're in the bathroom on Monday morning.

Of course, technology respects your choices. Before you upgrade, you will be clearly notified what physical parameters will be monitored. You can choose to accept it, or not. If you don't, you will be stuck with your old Android 6.0 "Wall of Sound" edition. But it's nice to know you have a choice. The newer hearing aids will come with the Spector edition preinstalled, so that's one less thing to worry about.

Everything I wrote about hearing aids is a lie. No, they are not Android devices, and they don't do all these things. The sounds that are amplified by a hearing device are real sounds. It allows people who use them to perceive existing reality better. Any manipulation of that reality would be unacceptable, right?

And yet:
- When I search for something, the results provided by Google will be in the order determined by them, based on my past behavior and the behavior of people like me. Certain results that are viewed as offensive or illegal are filtered out. And I don't know what's missing.
- When I read a book on Amazon, my behavior is tracked: which books I read, how long I spent reading them, which sections I like the most, which passages I highlight. This is used to recommend to me what books I may also find interesting, and to recommend to the publishers which books they need to write. Books that become controversial as a result of complaints are automatically removed from my reading device. Fortunately, I can freely choose to read another book, which hopefully is less controversial.

Distorted results, forced upgrade, tracked behavior, walled gardens, false choices. It seems that everyone is wearing Android hearing aids or using similar devices that shape our worldview in accordance with characteristics determined by someone else! With the exception of some crackpots, nobody minds.

What does that have to do with creativity? Everything! Creativity is about reflecting on the world and finding solutions that make it better. This requires a worldview that challenges us, confronts us with things that drag us out of our comfort zone, and forces us to face the difficult problems. However, our worldview is increasingly perceived through technology that filters and distorts the view at an unprecedented level nowadays. Whole generations use this technology without giving it a second thought. Yes, you can be creative even though the ICTs present a filtered worldview. Yes, you can be creative in a walled garden.
But if what you can do is limited more by the walls of the garden than by your imagination, people should start asking questions and caring about the answers. Sadly, that doesn't happen.

What we need isn't more technology or caution. What we need is more control over the ICTs in our life. It is, after all, our life.
If we can't have more control (and I'm afraid it's too late to reclaim it), we need to upgrade ourselves to become mindful users of technology, instead of the mindless consumers we are now. If we outsource the more mundane tasks to ICTs, we'd better make sure that we can rely on their behavior being determined by us, not by someone else. The balance of forces should strive toward an equilibrium. "More is better" is not conducive to achieving such balance.

I'd rather try to be a music composer who's half deaf than use a hearing aid as described above. But that's just me.

