
ACM TechNews

The 8080 Chip at 40: What's Next For the Mighty Microprocessor?


A closeup look at Intel's 8080 chip.

Credit: Intel

The Intel 8080 microprocessor, introduced in 1974, gave rise to the personal computer industry, and the descendants of that groundbreaking chip promise to lead the way to another 40 years of computer technology evolution.

"The last four decades were about creating the technical environment, while the next four will be about merging the human and the digital domains, merging the decision-making of the human being with the number-crunching of a machine," predicts industry analyst Rob Enderle.

Such merging is expected to involve people learning to control machines through direct brain interaction, says Lee Felsenstein, who helped design early portable computers. He believes learning a computer/brain interface will be an interactive process that begins in middle school with toy-like systems. "A synthesis of people and machines will come out of it, and the results will not be governed by the machines nor by the designers of the machines," Felsenstein says.

Retired Intel chip designer Stan Mazor predicts, "When computers can see, we will have a large leap forward in compelling computer applications. Although typical multiprocessors working on a single task saturate at around 16 [central processing units (CPUs)], if a task can be partitioned, then we might see 100,000 CPUs on a chip."
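
Mazor's saturation figure matches the behavior Amdahl's law describes: once any part of a task must run serially, adding processors yields diminishing returns. The Python sketch below is a hypothetical illustration of that effect, not code or data from the article; the serial fraction of 0.06 is an assumed value chosen only so the speedup curve flattens near the 16-CPU mark Mazor cites.

def amdahl_speedup(n_cpus, serial_fraction):
    """Upper bound on speedup with n_cpus when serial_fraction of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cpus)

s = 0.06  # assumed serial fraction (hypothetical, not from the article)
for n in (1, 2, 4, 8, 16, 64, 1024, 100_000):
    print(f"{n:>7} CPUs -> {amdahl_speedup(n, s):6.2f}x speedup (ceiling {1.0 / s:.1f}x)")

# Past roughly 16 CPUs the speedup barely moves toward its 1/s ceiling.
# A fully partitionable task (serial_fraction near 0) scales almost linearly,
# which is the regime a 100,000-CPU chip would need in order to be useful.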

From Computerworld

 

Abstracts Copyright © 2015 Information Inc., Bethesda, Maryland, USA


 
