

Information Theory After Shannon


Figure: Wojciech Szpankowski, project leader for the Science of Information Center at Purdue University, is revisiting Claude Shannon's information theory with a team of some 40 researchers. Credit: Vincent P. Walter

In a sense, Claude Shannon invented the Internet. An electrical engineer at Bell Labs, Shannon developed information theory in "A Mathematical Theory of Communication," a landmark paper published in 1948. By quantifying the limits to which data could be compressed, stored, and transmitted, he paved the way for the high-speed communications, file compression, and data transfer that underlie the Internet, the CD, the DVD, and all that they entail. Now scientists at Purdue University, with a $25 million, five-year grant from the U.S. National Science Foundation, have created the Science of Information Center with the goal of moving beyond Shannon. They aim to develop principles that encompass such concepts as structure, time, space, and semantics. These principles might help design better mobile networks, lead to new insights in biology and neuroscience, drive research in quantum computing, and even aid our understanding of social networks and economic behavior.

"Whether we will build a new theory or not remains to be seen," says Wojciech Szpankowski, professor of computer science at Purdue University and leader of the project, which includes some 40 researchers at nine universities. "It's definitely time, after 60 years, to revisit information theory. It's basically communication and storage today, but we need to go beyond that."

In Shannon's theory, information, which consists of bits, is that which reduces a recipient's statistical uncertainty about what a source transmitted over a communications channel. The theory lets engineers define the capacity of a communications channel and state the limits to which data can be compressed, both losslessly and with acceptable loss. Shannon theory ignores the meaning of a message, focusing only on whether the 1s and 0s of binary code are being transmitted accurately. It doesn't care about the physical nature of the channel; information theory is the same for a telegraph wire or a fiber-optic cable. And it assumes delay can be arbitrarily long, so a receiver has all the time it needs to receive and recognize a signal.
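
To make Shannon's measure concrete (an illustration added here, not drawn from the article), the entropy of a source is the average number of bits of uncertainty each output symbol removes; a minimal Python sketch:

    import math

    def entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability symbols.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally unpredictable: 1 bit per toss.
    print(entropy([0.5, 0.5]))   # 1.0
    # A heavily biased coin is largely predictable, so each toss carries far less information.
    print(entropy([0.9, 0.1]))   # about 0.47

In Shannon's framework, these figures are also the average number of bits per symbol below which the source's output cannot be losslessly compressed.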


Szpankowski argues that information goes beyond those constraints. Information can also be defined as that which increases understanding, something that can be measured by whether it helps a recipient accomplish a goal. At that point, semantic, temporal, and spatial factors come into play. If a person waiting for a train receives a message at 2 P.M. saying the train leaves at 1 P.M., the message contains essentially no information. In mobile networks, the value of information can change over the time it takes to transmit it because the person receiving it has moved; instructions to make a left turn are pointless, for example, if the driver has already passed the intersection. And there's no good way to measure how information evolves on the Web. "We cannot even understand how much information is transmitted on the Internet because we don't understand the temporal aspect of information," Szpankowski says.

Sergio Verdu, an electrical engineer at Princeton University and a co-principal investigator, is trying to find the fundamental limits of compression and data transmission. The math is easy if engineers focus on signal-to-noise ratio, but signal-to-noise ratio turns out not to be the best measure of fidelity for audio and video. The human brain is very tolerant of visual noise in general, but far less tolerant of particular types of noise; sharp edges, for instance, matter much more than color in identifying an image. It's not just a question of how many bits have to be received to deliver a comprehensible message, but which bits. "In lossy compression, the bottleneck has been to come up with measures of distortion that are relevant in the real world," says Verdu.
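
A small sketch of the problem Verdu describes (hypothetical numbers, with plain mean squared error standing in for a signal-to-noise criterion): two corrupted versions of a one-dimensional "image" can score about the same even though one preserves the sharp edge and the other destroys it.

    import numpy as np

    np.random.seed(0)
    signal = np.concatenate([np.zeros(50), np.ones(50)])   # a single sharp edge

    # Version A: mild noise everywhere, edge intact.
    noisy = signal + np.random.normal(0, 0.1, signal.size)

    # Version B: no noise, but the edge is smeared across ten samples.
    blurred = signal.copy()
    blurred[45:55] = np.linspace(0, 1, 10)

    def mse(a, b):
        return float(np.mean((a - b) ** 2))

    print(mse(signal, noisy))     # on the order of 0.01
    print(mse(signal, blurred))   # similar in size, yet the edge is gone

Measured this way the two corruptions look comparable, which is exactly why researchers want distortion measures that track what the human visual system actually cares about.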

It's also difficult to find the limits of compression for structural information, says Szpankowski. Proteins behave differently and transmit different information depending on their shape, and scientists would like to build computer models that would help them better understand how structure contributes to cell development or the rise of diseases. But the math becomes complex, because structural elements such as edges and vertices can add several dimensions to an equation. So far, Szpankowski says, no theory exists to provide a metric for how much information is embedded in structure. There's not even a good way to quantify complexity; we can say a car is more complex than a bicycle, but how much more?
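
One way to see where the counting gets hard (a rough back-of-the-envelope sketch, not the center's formulation): describing a network by listing its possible edges over-counts, because relabeling the vertices yields the same underlying structure. Since almost all large graphs have no nontrivial symmetries, discounting the n! labelings gives an asymptotically accurate estimate of the bits carried by the shape alone.

    import math

    def labeled_bits(n):
        # Bits to record the presence or absence of each possible edge on n labeled vertices.
        return n * (n - 1) // 2

    def structural_bits(n):
        # Estimate for the unlabeled structure: discount the n! ways of naming the vertices.
        # Asymptotically tight, because almost all large graphs have no nontrivial symmetries.
        return labeled_bits(n) - math.log2(math.factorial(n))

    for n in (10, 100, 1000):
        print(n, labeled_bits(n), round(structural_bits(n)))

Even this simple discount leaves open the questions Szpankowski raises: how to compress structures whose shape, not just whose edge list, carries the information, and how to do so when the structure changes over time.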

Shannon theory works well for point-to-point transmission or in systems with several transmitters and one receiver. But once there are two transmitters and two receivers, the problem of cross-talk arises, where a receiver can pick up a signal from the wrong transmitter. With the growth in mesh networks and mobile communications, and even high-frequency transmissions turning old copper wires into miniature antennas, the point-to-point solution isn't adequate. "We still don't know what are the ultimate, fundamental limits to how much information we can send robustly in the presence of noise," Verdu says.
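
For the single point-to-point link, by contrast, the limit is known exactly. A small sketch (using the standard formula for a real-valued additive white Gaussian noise channel, an assumption of this illustration rather than anything from the article) computes it; no comparably complete formula exists for networks of mutually interfering transmitters and receivers.

    import math

    def awgn_capacity(snr):
        # Shannon capacity of a real-valued AWGN channel, in bits per channel use:
        # C = 0.5 * log2(1 + SNR). No code can do better, and good codes come close.
        return 0.5 * math.log2(1 + snr)

    for snr_db in (0, 10, 20, 30):
        snr = 10 ** (snr_db / 10)
        print(f"{snr_db:2d} dB -> {awgn_capacity(snr):.2f} bits per channel use")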


Quantum Information

A growing challenge for information theory is the field of quantum information and quantum computing. Classical information has always been understood as a collection of bits, says Madhu Sudan, principal researcher at Microsoft Research New England (now on leave from Massachusetts Institute of Technology) and a co-principal investigator. Quantum information, by contrast, is a continuum; a qubit, the unit of quantum computing, can exist in a superposition of 0 and 1, with infinitely many possible states in between. In such a circumstance, Sudan asks, "What does it even mean to be information? What does it mean for you to send me information?"

Many differences exist between classical and quantum information, Sudan says, notably that classical information can be copied and quantum information, by definition, cannot. Sudan is not merely interested in discovering fundamental principles; he wants to figure out which ones have practical importance. "The next level is going to focus on 'How do I understand this information? Where did it come from? And how can I manipulate it in ways that are convenient?' " he says.
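
A minimal numerical sketch of what that continuum means (an illustration under textbook assumptions, not the researchers' code): a qubit is described by two complex amplitudes, measurement turns it into a single classical bit with probabilities set by those amplitudes, and no procedure can copy the unknown amplitudes themselves.

    import numpy as np

    # A qubit state is a unit vector of two complex amplitudes (alpha, beta);
    # measuring it yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
    theta = np.pi / 3                      # one point on a continuum of valid states
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

    p0, p1 = np.abs(state) ** 2
    print(p0, p1)                          # approximately 0.75 and 0.25

    # Each simulated measurement collapses the continuum to one classical bit;
    # the amplitudes themselves are never observed directly and cannot be cloned.
    print(np.random.choice([0, 1], size=10, p=[p0, p1]))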

Researchers are hoping to apply information theory to fields beyond communications and computing. Christopher Sims, an economist at Princeton, applies Shannon's insights about channel capacity to how consumers respond to economic information. In theory, if the central bank alters interest rates or the money supply, prices should change quickly, but in reality they don't. Sims says that's because people have limited physical capacity to process all the data, so they usually don't act on the information until it crosses some threshold, such as appearing on the Yahoo! homepage. They're striking a balance between the cost of processing information and the reduction in uncertainty—the payoff in tracking small interest rate fluctuations isn't worth the effort it takes to react to them. Sims has dubbed this strategy "rational inattention."
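
A toy version of that trade-off (made-up numbers purely for illustration; this is not Sims' model): reacting to a piece of news carries a fixed processing cost, so it only pays off once the expected loss from ignoring the news crosses a threshold.

    PROCESSING_COST = 5.0      # hypothetical effort to track and act on a piece of news

    def loss_if_ignored(rate_change_bp):
        # Hypothetical loss from not adjusting, growing with the size of the change (in basis points).
        return 0.2 * rate_change_bp

    for change in (5, 10, 25, 50, 100):
        loss = loss_if_ignored(change)
        decision = "react" if loss > PROCESSING_COST else "ignore"
        print(f"{change:3d} bp move: loss {loss:5.1f} vs cost {PROCESSING_COST} -> {decision}")

Small moves get rationally ignored; only changes large enough to outweigh the cost of paying attention provoke a response.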


"We cannot even understand how much information is transmitted on the Internet," says Wojciech Szpankowski, "because we don't understand the temporal aspect of information."


Sims is hoping to combine information theory with control theory, which looks at the behavior of dynamic systems, and come up with new economic insights. Perhaps, he says, some of those insights will feed back into engineering. "There's a long way to go here, and we've only just begun to get information theorists interested in what we're doing here," Sims says. "We're still using information theory in a relatively primitive way."

"I'm hoping in the first five years we'll make some advances," Szpankowski says. "We will at least formulate the right questions."

Further Reading

Goldreich, O., Juba, B., and Sudan, M.
A theory of goal-oriented communication, Electronic Colloquium on Computational Complexity TR09-075, Sept. 17, 2009.

Konorski, J. and Szpankowski, W.
What is information? Zeszyty Politechniki Gdanskiej 5, 2007.

Shannon, C.E.
A mathematical theory of communication, The Bell System Technical Journal 27, July and October, 1948.

Sims, C.A.
Rational inattention: a research agenda. Deutsche Bundesbank Spring Conference, Berlin, Germany, May 27, 2005.

Verdu, S.
Fifty years of Shannon theory, IEEE Transactions on Information Theory 44, 6, October 1998.


Author

Neil Savage is a science and technology writer based in Lowell, MA.


Footnotes

DOI: http://doi.acm.org/10.1145/1897816.1897822




©2011 ACM  0001-0782/11/0200  $10.00


