Microprocessor

Digital · Computation · 1971

TL;DR

Complete central processing unit integrated onto a single semiconductor chip, enabling personal computers and embedded computing everywhere.

The central processing unit—the heart of any computer—had always been built from multiple chips or boards. The idea of putting an entire CPU on a single integrated circuit seemed theoretically possible as transistor densities increased, but the engineering challenge was immense. Who would need such a thing, and who would build it?

The catalyst came from an unexpected direction: calculators. Busicom, a Japanese calculator company, contracted Intel in 1969 to design a set of chips for a new calculator line. Ted Hoff, assigned to the project, proposed a radical simplification: instead of twelve custom chips, why not create a general-purpose processor that could be programmed for different functions? Federico Faggin led the detailed design, and the Intel 4004 shipped in November 1971—2,300 transistors implementing a complete 4-bit CPU.
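Hoff's insight, that software running on one general-purpose chip can replace many custom logic chips, can be sketched with a toy accumulator machine. The four-instruction set below is hypothetical and far simpler than the 4004's actual instruction set; it only illustrates the principle that the same hardware performs different functions depending on the program it runs.

```python
# A toy 4-bit accumulator machine. The instruction set here is
# hypothetical, not the real Intel 4004 ISA.

def run(program, memory):
    """Execute a list of (opcode, operand) pairs on 4-bit values."""
    acc, pc = 0, 0                      # accumulator and program counter
    while pc < len(program):
        op, arg = program[pc]
        pc += 1
        if op == "LDI":                 # load immediate into accumulator
            acc = arg & 0xF
        elif op == "ADD":               # add a memory cell, wrap at 4 bits
            acc = (acc + memory[arg]) & 0xF
        elif op == "STO":               # store accumulator to memory
            memory[arg] = acc
        elif op == "JNZ":               # jump if accumulator is nonzero
            if acc != 0:
                pc = arg
    return memory

# The same "chip" runs different programs; here, summing two cells.
mem = [3, 5, 0, 0]
run([("LDI", 0), ("ADD", 0), ("ADD", 1), ("STO", 2)], mem)
print(mem[2])  # 3 + 5 = 8
```

Swapping in a different program, with no hardware change, turns the same machine into a different "calculator," which is exactly the simplification Hoff proposed to Busicom.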

The adjacent possible required several streams to converge. Integrated circuit technology had advanced to the point where thousands of transistors could be fabricated on a single chip. MOS (metal-oxide-semiconductor) processing provided the density needed. The calculator market created customer demand. Perhaps most crucially, the conceptual leap from custom logic to general-purpose programmable processor required the right insight at the right moment.

Convergent development was evident. Texas Instruments developed the TMS 1000 around the same time, though it did not reach market until 1974. Gary Boone at TI had independently conceived the single-chip microprocessor. The technology was, in a sense, inevitable—multiple teams were converging on the same solution.

Intel quickly followed the 4004 with the 8-bit 8008 (1972) and the influential 8080 (1974), which became the basis for early personal computers. The 8086 (1978) launched the x86 architecture that still dominates desktop computing decades later. Each generation roughly doubled transistor count according to Moore's Law—Gordon Moore's observation that became a self-fulfilling prophecy.
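The doubling described above can be made concrete with a back-of-envelope projection from the 4004's 2,300 transistors, assuming an idealized doubling every two years; real products scatter around this smooth curve rather than sitting on it.

```python
# Back-of-envelope Moore's Law projection: transistor count doubling
# roughly every two years, starting from the 4004's 2,300 transistors
# in 1971. An idealized curve; actual chips deviate from it.

def projected_transistors(year, base=2300, base_year=1971, doubling_period=2):
    return base * 2 ** ((year - base_year) / doubling_period)

for year in (1971, 1978, 1989, 2000):
    print(year, f"{projected_transistors(year):,.0f}")
```

For 1978 the curve gives roughly 26,000 transistors, in the neighborhood of the 8086's actual count of about 29,000.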

Geographic concentration was remarkable. Intel operated in Santa Clara. Fairchild Semiconductor, Intel's ancestor, was nearby. Texas Instruments in Dallas provided competition. The 'Fairchildren'—companies founded by Fairchild alumni—clustered in Silicon Valley, creating an ecosystem of chip design and fabrication expertise found nowhere else.

The cascade effects were epochal. Microprocessors made personal computers possible. They enabled embedded computing in everything from cars to appliances. They drove the digital revolution. The smartphone in your pocket contains processors orders of magnitude more powerful than the room-filling mainframes of the 1960s—all because someone realized you could put a computer on a chip.

What Had To Exist First

Required Knowledge

  • CPU architecture design
  • Integrated circuit layout
  • Logic synthesis and optimization
  • Instruction set design
  • Silicon fabrication process

Enabling Materials

  • MOS transistor fabrication
  • Photolithography for circuit patterning
  • Silicon wafer processing
  • Package and interconnect technology
  • Mask design tools

Independent Emergence

Evidence of inevitability—this invention emerged independently in multiple locations:

Japan 1971

Masatoshi Shima of Busicom collaborated directly on the 4004's logic design; in Dallas, Texas Instruments' Gary Boone developed the TMS 1000 in parallel
