
Communications of the ACM

ACM TechNews

Turing's Enduring Importance


Alan Turing

Did Alan Turing's 1936 paper 'On Computable Numbers' influence the early history of computer building?

Modern computing systems owe a sizable debt to mathematician Alan Turing, whose breakthrough 1936 work set the direction the future of computing would take by proving that no general procedure can decide, for every mathematical statement, whether that statement is provable.

Turing devised a logical formalism describing how a human computer, trained to follow a series of mathematical operations, would actually execute them. The result is known today as a Turing machine. Turing showed that such a machine could solve any computing problem that can be described as a series of mechanical steps, and that a single universal Turing machine could simulate any other Turing machine.
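The formalism above, a finite control stepping a read/write head along a tape, is simple enough to sketch in a few lines. The following is an illustrative simulator, not Turing's own notation; all names and the example transition table are invented for demonstration.

```python
def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
    """Run a Turing machine until it enters the 'halt' state.

    transitions maps (state, read_symbol) -> (new_state, write_symbol, move),
    where move is -1 (left), +1 (right), or 0 (stay).
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    # Read the tape back left to right, dropping surrounding blanks.
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: flip every bit of the input, then halt on the first blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine(flip, "1011"))  # -> 0100
```

A universal machine is the same idea one level up: its tape holds an encoding of another machine's transition table, which it interprets step by step.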

Turing's research showed early electronic computer designers that a calculating machine does not require a huge inventory of elaborate instructions or operations: a few always-available operations and a memory store capable of holding both data and code suffice. These insights supplied the mathematical foundation for today's stored-program digital systems.

Turing's work also bears on cybersecurity: his proof that a universal Turing machine can simulate any other is precisely what makes it possible for an attacker to hijack a target computer and make it execute a program of the attacker's choosing.

From Technology Review 

Abstracts Copyright © 2012 Information Inc., Bethesda, Maryland, USA


 
