
Theoretical Computer Scientist Receives Waterman Award



The U.S. National Science Foundation named Princeton University computer science professor Mark Braverman to receive the 2019 Alan T. Waterman Award for his research on developing algorithms and designs that can withstand the effects of ubiquitous noise.

Credit: Princeton University

The naming of Mark Braverman to receive the Alan T. Waterman Award for 2019 paid tribute not only to Braverman's stellar research, but also to his relatively small field of theoretical computer science.

The award is presented annually by the U.S. National Science Foundation to one or two researchers under age 40 (or not more than 10 years past their Ph.D.) and carries a grant of $1 million over five years. Because it is given across all areas of science and engineering, competition for the award is enormous.  Sharing the 2019 award with Braverman is Jennifer A. Dionne, an associate professor of materials science and engineering at Stanford University.

Braverman, a professor of computer science at Princeton University, is the third Waterman awardee in the field of theoretical computer science in the past decade (the others were Scott Aaronson in 2012 and Subhash Khot in 2010).  "This says something about its impact and importance," observed Avi Wigderson, professor of mathematics at the Institute for Advanced Study, who was awarded the Donald E. Knuth Prize for outstanding contributions to the foundations of computer science earlier this year. "The influx of talent to our field is just phenomenal.  [Braverman] is definitely of singular impact."

Wigderson remembers Braverman as a student whose maturity as a researcher came exceptionally early. Today, Wigderson says, "He is a young leader who is very broad and very deep and very strong technically." Of Braverman's strengths as a researcher, Wigderson said, "His breadth is one. His high-level understanding of the field and its growth is another. His ability to ask questions and to suggest models is extremely powerful."

The son of a mathematician and a theoretical physicist, Braverman was born in Perm, Russia, in 1984. When he was eight years old, his family settled in Israel. After he finished his undergraduate degree at the Technion–Israel Institute of Technology at age 17, his family moved to Canada. He attended the University of Toronto, where he received his Ph.D. in 2008 under the direction of complexity theory legend Stephen Cook.

Complexity theory has focused mainly on discrete models of computation based on 0s and 1s.  As powerful as these models are, they do not adequately capture how computers are actually used in many areas, such as numerical analysis, which employs real-number algorithms to simulate scientific phenomena. A 1989 paper by Lenore Blum, Mike Shub, and Stephen Smale, which proposed what is now known as the BSS machine, led to a resurgence of interest in models for computing over the real numbers and the associated complexity theories.

In his thesis, Braverman greatly expanded our understanding of a different model, the bit-model for computing over the reals.  The key idea of the bit-model can be roughly seen in the example of the Euclidean plane: a two-dimensional set is bit-computable if a program can draw it on a computer screen, zooming in to any part of the screen to draw the set with arbitrary precision.
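
The flavor of the definition can be captured in a short sketch. The following Python fragment is purely illustrative (the function names and the choice of a unit disk are ours, not from Braverman's thesis): it renders a set at any zoom level and precision, which is roughly what bit-computability demands a program be able to do.

```python
# A minimal sketch of the bit-model idea, using a set whose membership is
# easy to decide: the closed unit disk. A set is (roughly) bit-computable
# if a program, given any pixel center and a precision 2**-n, can decide
# how to color the pixel, erring only within 2**-n of the boundary.

def color_pixel(x: float, y: float, n: int) -> bool:
    """Decide whether the pixel centered at (x, y) should be drawn
    when rendering the closed unit disk at precision 2**-n."""
    eps = 2.0 ** -n
    return x * x + y * y <= (1.0 + eps) ** 2  # slack allowed near the boundary

def render(cx: float, cy: float, zoom: float, n: int, size: int = 40) -> None:
    """Draw a size x size window centered at (cx, cy); zooming in simply
    rescales coordinates, and the same routine works at any magnification."""
    half = 2.0 / zoom
    for row in range(size):
        y = cy + half - 2 * half * row / (size - 1)
        line = ""
        for col in range(size):
            x = cx - half + 2 * half * col / (size - 1)
            line += "#" if color_pixel(x, y, n) else "."
        print(line)

render(0.0, 0.0, zoom=1.0, n=10)   # the whole disk
render(1.0, 0.0, zoom=8.0, n=20)   # zoom in on the boundary, higher precision
```

For the disk, membership is easy to decide at any precision; the substance of Braverman's work concerns sets for which it is not.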

Braverman used the bit-model to explore the computational complexity of fractal objects known as Julia sets, which he called the "lab mice" of two-dimensional dynamical systems. "People were writing programs to produce those sets," Braverman said.  "Some of [the programs] were provably correct, and some of them were just heuristics." He proved that for Julia sets associated with certain parameters, producing accurate pictures of the sets is computationally impossible: the sets are noncomputable.
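
The heuristic programs Braverman refers to are typically escape-time renderers. Here is a minimal sketch, assuming the standard quadratic family f_c(z) = z² + c (the parameter below is our choice, not one from his papers):

```python
# A hedged sketch of the standard escape-time heuristic for drawing the
# Julia set of f_c(z) = z**2 + c. This is the kind of program Braverman
# contrasts with provably correct ones: for most parameters c it produces
# a good picture, but nothing here certifies the output's accuracy.

def escapes(z: complex, c: complex, max_iter: int = 200) -> bool:
    """Heuristically test whether z escapes to infinity under z -> z**2 + c.
    If |z| ever exceeds 2 it provably escapes; if we run out of iterations
    we guess that it does not -- the uncertified part of the heuristic."""
    for _ in range(max_iter):
        if abs(z) > 2.0:
            return True
        z = z * z + c
    return False

def draw_julia(c: complex, size: int = 60) -> None:
    for row in range(size):
        y = 1.5 - 3.0 * row / (size - 1)
        print("".join(
            "." if escapes(complex(-1.5 + 3.0 * col / (size - 1), y), c) else "#"
            for col in range(size)
        ))

draw_julia(complex(-0.4, 0.6))  # a parameter where the heuristic works well
```

The iteration cap is the uncertified step: raising it sharpens the picture for typical parameters, but for the rare parameters Braverman identified, no cap, and indeed no algorithm at all, yields a correct picture.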

He also showed those parameters are rare, so people working in dynamical systems are unlikely ever to encounter them. "That's something that I have been thinking about ever since: what things are generically hard," Braverman said. "This exists, of course, in discrete computation also. We have SAT solvers that mostly work for a huge range of applications," even though SAT is NP-complete.

Braverman spent two postdoctoral years at Microsoft Research New England, where he worked with the healthcare group on algorithms for matching medical residents to U.S. hospitals, as well as on using machine learning tools to study factors leading to patient rehospitalization. Through that work, "I realized that a lot of the questions are more economic and game-theoretic than computational," he said.  He then became interested in the burgeoning area of algorithmic economics, particularly mechanism design.

Sometimes called "reverse game theory," mechanism design studies ways to structure incentives so self-interested parties are driven to certain behaviors. For example, in a completely unregulated health insurance market, insurers have an incentive to recruit only healthy customers. One way to redirect that incentive is to design "a payment formula that would not encourage a drop in certain kinds of patients," Braverman said. He and his collaborators have designed such a formula.
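
A toy simulation can make the incentive shift concrete. The sketch below is not Braverman's formula; it is a minimal illustration, assuming a synthetic population with a single observable risk feature, of how paying insurers a patient's predicted cost, rather than a flat amount, removes the incentive to avoid sick patients.

```python
# A toy illustration (not Braverman's actual formula) of the risk-adjustment
# idea: pay insurers a patient's *predicted* cost from observable features,
# so that enrolling sicker patients is no longer a money-loser on average.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic population: one observable risk feature per patient, with true
# annual cost rising in that feature plus noise.
risk = rng.uniform(0.0, 1.0, size=1000)
cost = 1000.0 + 9000.0 * risk + rng.normal(0.0, 500.0, size=1000)

# Fit a linear payment formula: payment(risk) = a + b * risk.
X = np.column_stack([np.ones_like(risk), risk])
coef, *_ = np.linalg.lstsq(X, cost, rcond=None)
payment = X @ coef

flat = cost.mean()                      # a flat per-patient payment
sick = risk > 0.8
for label, pay in [("flat payment ", np.full_like(cost, flat)),
                   ("risk-adjusted", payment)]:
    print(label,
          "profit on healthy: %7.0f" % (pay[~sick] - cost[~sick]).mean(),
          "profit on sick: %7.0f" % (pay[sick] - cost[sick]).mean())
```

Under the flat payment, the insurer profits on healthy patients and loses heavily on sick ones, so dropping sick patients pays; under the fitted formula, expected profit is roughly zero for both groups.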

While their work has the potential to improve the distribution of healthcare in the U.S., for now it remains at the conceptual level.  "It would be a full-time job for multiple people to actually get it to a point where it could be used," Braverman said.  "Obviously there are bigger policy issues that we can't touch with essentially a prediction problem," he said, "but those concepts will be useful somewhere; if not in healthcare, then in some other area."

While he was at Microsoft Research, Braverman also became interested in another problem far from his thesis work, known as the Linial-Nisan conjecture, which arose in the area of pseudorandomness and had stumped researchers since it was first proposed in 1990.  Roughly speaking, it asks whether a certain very natural distribution would appear to be random to a class of computations.  No one had any idea how to resolve the conjecture, so Braverman's solution "came out of the blue," Wigderson said.  It was doubly unexpected because the ingenious solution, requiring great technical prowess, was so far from the areas in which Braverman had made a name for himself.
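
In standard terminology (a paraphrase, not wording from this article): a distribution on n-bit strings is k-wise independent if every k of its coordinates are distributed exactly like uniform random bits, and the computations in question are AC0 circuits, those of constant depth and polynomial size. The conjecture Braverman proved can then be stated roughly as follows.

```latex
% Linial-Nisan conjecture (1990), proved by Braverman: polylogarithmic
% independence fools constant-depth (AC^0) circuits. Roughly: for every
% AC^0 circuit C of size m and depth d, and every k-wise independent
% distribution \mu with k polylogarithmic in m/\varepsilon (the exact
% exponent depending on the depth d),
\[
\Bigl|\, \mathbb{E}_{x \sim \mu}\bigl[C(x)\bigr]
       - \mathbb{E}_{x \sim U_n}\bigl[C(x)\bigr] \,\Bigr| \;\le\; \varepsilon .
\]
```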

Braverman is a problem-solver comfortable with attacking highly abstract questions, as well as real-world applications. He also is a theory builder who has been at the forefront of recent developments in information complexity, an area that aims to bring ideas from Claude Shannon's information theory into the realm of computational complexity.

Information theory was launched in Shannon's seminal paper of 1948. That same year, John von Neumann mused about the theory of computation, noting that its roots in logic mean that "[i]t will have to be, from the mathematical point of view, combinatorial rather than analytical." This is a great limitation because mathematical analysis "is the technically most successful and best-elaborated part of mathematics," von Neumann said. Indeed, one of the great strengths of Shannon's theory is that it models a communication channel as a continuous resource, thereby bringing in the power of mathematical analysis.

The hope is that ideas from information theory can help make computational complexity less combinatorial and more analytical. Information complexity has existed as a research area for decades, but "somehow it wasn't as consistently fashionable within the theory of computation as one would expect," Braverman said.  "I have been trying to promote its use and find new uses for it." 

Wigderson portrays Braverman as a leader who "more or less single-handedly" revitalized this area of research.

Braverman's 2013 paper with Boaz Barak, Xi Chen, and Anup Rao gave impetus to information complexity by laying out the main definitions and problems and providing substantial initial results. Two years later, Braverman and his student Jon Schneider proved information complexity is computable.
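
The central quantity in that line of work can be stated compactly. The following is the standard (internal) definition as it is usually given, paraphrased rather than quoted from the paper:

```latex
% For a two-party protocol \pi computing f(x, y), with inputs (X, Y)
% drawn from a distribution \mu and transcript \Pi, the internal
% information cost charges each party for what the transcript reveals
% about the other party's input:
\[
\mathrm{IC}_\mu(\pi) \;=\; I(\Pi ; X \mid Y) \;+\; I(\Pi ; Y \mid X).
\]
% The information complexity of f is the infimum over all protocols
% that compute f correctly:
\[
\mathrm{IC}_\mu(f) \;=\; \inf_{\pi\ \text{computing}\ f} \mathrm{IC}_\mu(\pi).
\]
```

Because mutual information is a continuous quantity, the infimum need not be attained by any single protocol, which is why it was not obvious that information complexity could be computed at all.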

Expanding the use of information theory and information complexity to more models of computation is a promising direction for future research, Braverman said. "Of course, many times in the past we have been stuck," he said. "It doesn't appear that we are closer to P versus NP than we were 50 years ago.  But in many intermediate models of computation, we have made a lot of progress."

Braverman said he hopes to be able to nail down more specifically what makes those results so difficult, "because the stumbling block is also an explanation."

Allyn Jackson is a journalist specializing in science and mathematics topics, who is based in Munich, Germany.


 
