Modern computing systems owe a sizable debt to mathematician Alan Turing, whose breakthrough work set the direction of computing by proving that no general method can determine, for every mathematical statement, whether that statement is provable.
Turing devised a logical formalism describing how a human computer, trained to follow a complex series of mathematical operations, would actually carry them out. The result is known today as a Turing machine. Turing showed that such a machine could solve any computing problem that can be described as a series of mathematical steps, and that one Turing machine could simulate another.
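To make the formalism concrete, here is a minimal sketch of a Turing machine in Python; it is not from the article, and the machine, its state names, and its rule table are invented for illustration. This toy machine flips every bit of a binary input and halts at the first blank tape cell.

```python
# A Turing machine is a fixed, finite rule table driving a head over an
# unbounded tape. This illustrative machine inverts a binary string.

from collections import defaultdict

# Transition table: (state, read symbol) -> (symbol to write, head move, next state)
RULES = {
    ("scan", "0"): ("1", +1, "scan"),  # flip 0 -> 1, move right
    ("scan", "1"): ("0", +1, "scan"),  # flip 1 -> 0, move right
    ("scan", " "): (" ", 0, "halt"),   # blank marks end of input: stop
}

def run(tape_input):
    # Sparse tape, blank (" ") everywhere the input does not reach.
    tape = defaultdict(lambda: " ", enumerate(tape_input))
    state, head = "scan", 0
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in range(len(tape_input)))

print(run("10110"))  # -> "01001"
```

The point of the sketch is the shape of the machine: a small fixed rule table plus a tape suffices, and a machine whose rule table reads and interprets another machine's rule table from the tape is Turing's universal machine.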
Turing's research showed early electronic computer designers that calculating machines did not require a huge inventory of elaborate instructions or operations, only a few always-available registers and a memory store capable of holding both data and code. These insights supplied the mathematical foundation for today's digital systems.
Turing's work also yields insight into the cybersecurity challenge: his proof that universal Turing machines can simulate one another is what makes it possible for an attacker to hijack a target computer and make it execute a program of the attacker's choosing.
From Technology Review