Chapter 2 - Background
2.25 The Microprocessor – 1971
On November 15, 1971, Intel Corporation, a three-year-old start-up, announced the world’s first “micro-programmable computer on a chip” – the 4004 microprocessor. The claim that it would usher in “a new era of integrated electronics” read as advertising hyperbole; it would prove a masterful understatement. No one had any idea of the revolutionary potential of the microprocessor. But how could one? Something as small as the first three letters of the word ENIAC, yet equal in computational power to that thirty-ton first computer. Intel management even debated whether to introduce the 4004 at all, uncertain of the market (in a year in which the installed base of computers totaled 88,000). Only eight years later, in 1979, 75 million microprocessors would be sold227 – 329,000 of them as microcomputers228 – four times the number of minicomputers (81,300) and forty-five times the number of mainframe computers (7,300).229 In time, neither IBM nor DEC could withstand the technological discontinuity the microprocessor both caused and represented.
The early microprocessors were not really computers on a chip, but only central processing units, or CPUs – the CPU being where programming instructions and data are brought together to be executed. For these microprocessors to qualify as a computer, additional chips were needed. In the case of the 4004, three other chips were required, two of them memory chips – one for data and one for instructions. With these companion chips the 4004 became the MCS-4 microcomputer system, also introduced on November 15. Microprocessors found a home either as microcomputer systems or as embedded intelligence in OEM products – just like minicomputers.
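To make the division of labor concrete, the sketch below simulates a toy accumulator machine in Python. The instruction names, memory layout, and three-instruction program are invented for illustration – this is not the 4004’s actual instruction set – but it shows why a CPU chip alone is not a computer: the processor only fetches from an instruction store and operates on a separate data store, the roles the MCS-4’s companion chips filled.

```python
# A minimal, hypothetical sketch (not the 4004's actual instruction set) of why a
# CPU alone is not a computer: instructions and data must come from separate
# storage, which the MCS-4 supplied as companion chips.

INSTRUCTION_ROM = [            # program storage (the instruction chip's role)
    ("LOAD", 0),               # accumulator <- data_ram[0]
    ("ADD", 1),                # accumulator <- accumulator + data_ram[1]
    ("STORE", 2),              # data_ram[2] <- accumulator
    ("HALT", None),
]
DATA_RAM = [3, 4, 0]           # working storage (the data chip's role)

def run(rom, ram):
    """The CPU proper: fetch an instruction, then execute it against data."""
    accumulator, pc = 0, 0
    while True:
        op, addr = rom[pc]     # fetch from instruction memory
        pc += 1
        if op == "LOAD":
            accumulator = ram[addr]
        elif op == "ADD":
            accumulator += ram[addr]
        elif op == "STORE":
            ram[addr] = accumulator
        elif op == "HALT":
            return ram

print(run(INSTRUCTION_ROM, DATA_RAM))   # -> [3, 4, 7]
```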
Microprocessors started a new technology trajectory – the 4004 representing product instantiation. Before microprocessors could exist, advances in a number of enabling technologies were needed, including semiconductor manufacturing, software, and computer design. The 4004 packed 2,250 circuit components onto one chip, a density that required Large Scale Integration (LSI), just emerging in the early 1970s. LSI made possible chip densities of up to 10,000 components, whereas the previous generation – Medium Scale Integration – enabled only 1,000 components per chip. Software advances were needed to implement logic in software rather than in hardware, i.e. transistors. And advances in computer design were needed to understand how to reduce a computer’s architecture to the essential – a path of development encompassing all computer design since the ENIAC.
The innovation of the microprocessor was motivated by an explosion in the number of integrated circuit chips, much as the integrated circuit had been motivated by the impossibility of wiring thousands of transistors together. Beginning in roughly 1965, creating a computer meant selecting an architecture and then implementing that architecture in logical units built from integrated circuits. Since there was an unlimited number of possible computer architectures and logic units, the number of different integrated circuits exploded. But as the number of unique integrated circuits grew, the volume of any given one went down, which drove manufacturing economics not towards high-volume, low-cost chips but to the opposite – expensive chips.230 And as in the early days of integrated circuits, growth in buying demand required low costs, made possible by high volumes, which were driven by large buying demand – another catch-22.
Different solutions were proposed by almost every manufacturer. The most promising combined standardization of integrated circuit chips with sophisticated design software. Not surprisingly, firms manufacturing integrated circuits favored this solution. But this approach did not promise simpler designs, and fewer chips, unless everyone adopted the same design – an unlikely outcome in a competitive world. Calculators were exactly such a world. Integrated circuits revolutionized calculators from electro-mechanical monsters barely able to be carried around into battery-powered hand-held units. In 1968, any calculator company wanting to remain competitive had to use integrated circuits.
It is worth pausing for a moment. 1968 will emerge as a very important year. It marks, for example, the beginning of the larger history being told of computer communications. One of the events making 1968 special was the founding of Intel Corporation. The founders of Intel saw an opportunity to innovate semiconductor memories, which they did. Intel then innovated the microprocessor. Radically innovating two technologies makes Intel very special – few firms radically innovate more than once. It also just happens that these two innovations, along with software, are essential to information technology products. A continuing subject in future chapters will be the competitive dynamics of the semiconductor market-structure and their influence on the emerging dynamics of computer communications.
In 1968, Robert N. Noyce and Gordon E. Moore resigned from Fairchild Semiconductor, a company they had helped found, to start Intel Corporation. They left Fairchild, then a $150 million company,231 because they felt bogged down in the bureaucracy of a large company – Noyce was managing 15,000 employees232 – and because they wanted to participate more directly in innovating semiconductor technologies. They targeted semiconductor memories, believing both that buying demand existed for semiconductor memory in place of core memory and that memory design would be the first to benefit from the coming semiconductor process advances of LSI. In short, they saw an economic opportunity. They then easily raised $2.5 million and were on their way. When vision is combined with the experience and the drive to make it happen, the will to act, the entrepreneur is born – or, as in this case, born again.
Two years later, in 1970, Intel announced a one-thousand-bit (1K) random access memory (RAM) chip named the 1103 – the 1103 1K RAM. Although not the first 1K RAM introduced, the 1103 would soon dominate. Its success was due to a number of factors, including the competence and creativity of Intel’s personnel, the quality of the design, second sourcing, and manufacturing prowess.233
In the summer of 1969, Busicom, a now defunct Japanese calculator company, needed a source of integrated circuits with which to build calculators before their existing electro-mechanical models became obsolete. (To the calculator market-structure, the integrated circuit represented a technological discontinuity just as it did to computers.) Busicom contacted Intel. They wanted twelve different chips designed and manufactured. Assigned to lead the project was Marcian E. “Ted” Hoff, Jr., a recent graduate of Stanford and Intel’s twelfth employee. He assessed the design as too complicated, requiring chips of 3000-5000 transistors, and beyond Intel’s manufacturing capabilities.234 There the story might have ended were it not for the persistence and creativity of Hoff.
As Hoff struggled with the design constraints of an architecture simple enough to be manufactured, yet complex enough to meet Busicom’s requirements, he reflected on the architecture of the DEC PDP-8 minicomputer he was using. The PDP-8 had a very limited set of instructions, but because it had lots of memory, complicated control and logic operations were possible. If he could reproduce that design – a few instructions with sufficient memory – maybe the result would both satisfy Busicom and be general enough to meet other design objectives. To Hoff, the fact that Busicom had exclusive rights to the design had no effect on his search for an optimum solution.
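Hoff’s insight – that a small instruction set plus ample memory can take on work otherwise done in dedicated hardware – can be illustrated with a short sketch. The Python routine below is not PDP-8 or 4004 code; it is a hypothetical example of how a “complicated” operation such as multiplication reduces to a loop over the primitive operations (shift, add, test) that even a minimal processor provides.

```python
# Illustrative only: a complex operation built entirely from the kind of
# primitive steps a minimal instruction set offers - shift, add, and test -
# trading dedicated multiplier hardware for a short software routine.

def multiply(a, b):
    """Shift-and-add multiplication using only primitive operations."""
    product = 0
    while b != 0:
        if b & 1:                   # test the low bit of the multiplier
            product = product + a   # add the multiplicand
        a = a << 1                  # shift the multiplicand left
        b = b >> 1                  # shift the multiplier right
    return product

assert multiply(27, 14) == 27 * 14  # 378
```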
Noyce and Moore, fully knowledgeable of the catch-22 confronting integrated circuitry, liked Hoff’s idea and authorized him to proceed. At the same time, manufacturing within Intel perfected a new process to create reliable chips of 2000 transistors.235 As Hoff pressed forward, knowing he had to limit the design to roughly two thousand transistors per chip, the Busicom design team continued down the original path. In the late fall, the two designs were presented to Busicom executives who selected Hoff’s design.
Then came the challenge of reducing the architectural design to silicon – working chips. Little progress was made until Federico Faggin came aboard from Fairchild in the spring of 1970. In less than nine months, samples of the four chips of the MCS-4 were working. But competitive conditions in calculators had driven end-user prices to unexpectedly low levels, and Busicom had no choice but to renegotiate its contract with Intel in order to buy the chips at significantly lower prices. In renegotiating the contract, Intel regained the rights to sell the chips to others for non-calculator applications.236
Having the rights did not mean they would be exercised – especially when many members of management and the Board of Directors were uncertain whether enough buying demand existed to warrant the investment of organizational and financial resources to launch and support the product. Ed Gelbach, who had recently joined Intel from Texas Instruments as senior vice president responsible for marketing, argued that new applications were the issue – not the two thousand units calculated by assuming a ten percent share of the minicomputer market. His arguments, and those of Arthur Rock, the Chairman of the Board, proved crucial. Intel introduced the first microprocessor and microcomputer system on November 15, 1971. By February 1972, an encouraging $85,000 of MCS-4 chip sets had been sold.237
During this same period, Intel also had an 8-bit microprocessor under development. In late 1969, Computer Terminals Corporation (CTC), later renamed Datapoint Corporation, approached Intel to design and manufacture an LSI chip for a new intelligent CRT terminal.238 Intel proposed a one-chip design. CTC also solicited a proposal from TI. In March 1971, TI demonstrated its chip to CTC. Neither the Intel nor the TI chip was used; CTC felt pressed to bring its terminal to market before production quantities of either chip would be ready. (TI was awarded a patent in 1978 for its design – proving the essential circuitry of a microcomputer could fit on one chip.239) Intel continued its development program and in April 1972 introduced the first 8-bit microprocessor – the 8008.
The challenge confronting Intel now switched from development to supporting companies and engineers intrigued with using microprocessors but having no experience or training in their use. Beginning in 1972, Intel launched a steady stream of software and hardware products to aid those developing applications for its microprocessors. Even that was not enough. In 1973, to both sell and support the idea of substituting microprocessors for hardware logic, Intel began a promotional campaign directed at explaining the advantages of microprocessors and inviting engineers to attend seminars and customer training programs. (Engineering programs at leading universities began to offer courses on microprocessors and microcomputers during this period.)
Intel may have led the way, but by July 1974 it was far from alone. A total of nineteen microprocessors were available or announced.240 Early entrants following Intel were Rockwell, Fairchild, National Semiconductor, Signetics, Toshiba, AMI, Teledyne Systems, and, of course, TI. By mid-1975, forty microprocessors existed, and by 1976 the number had grown to fifty-four. The microprocessor trajectory had clearly entered the competitive stage. As Noyce and Hoff wrote of these years in 1981: “It was clearly a time of wide-ranging experimentation in architectures, processes, and packaging.”241 These years, 1973-74, also saw manufacturers first unable to keep up with demand, and then having to adjust to a mini-recession in 1974-1975. Nevertheless, the acceptance of microprocessors continued to grow. DEC introduced a series of microprocessor modules using Intel 8008s in March 1974. Revenues from microprocessors totaled $37.7 million in 1974, a year in which the installed base of microprocessors exceeded that of both minicomputers and mainframe computers.242 Projections of revenues were $50 million in 1975, $150 million in 1976, and $450 million by 1980.243
In 1974, Intel announced the first second-generation 8-bit microprocessor – the 8080. Development of the 8080 was not intentional, or at least it did not start out that way. Rather, it began as an effort to ascertain whether recently advanced fabrication technologies used in memory production would improve the performance of the 8008.244 But it quickly became obvious that doing so required significant layout changes to the 8008. Aware that the chip had serious design flaws, Intel then decided to re-design it with the objective of improving its performance by a factor of ten.245 Even before 8080 fabrication began, Intel management knew they had a winner, and for the first time pre-announced a product. Its immediate acceptance proved the demand for a high-performance 8-bit microprocessor was larger than anyone had expected. Two chips threatened the market leadership of the 8080 – the Motorola 6800 and the Zilog Z-80.
The Motorola 6800, introduced in mid-1974, proved important for at least two reasons. First, it began a line of chip development that would become Intel’s most serious competition, although no one knew it at the time. (This fact notwithstanding, the Motorola chip family will receive only small mention in this work, for it found its welcome home not in the office market but in the technical-scientific market, and had little direct impact on the course of computer communications.) Second, the 6800 began the practice of introducing microprocessor development tools and supporting chips simultaneously with the processor, not over time as had been the case.246 This change significantly raised the costs of market entry and the peak investment required for product development.
Even though the Zilog Z-80 was not introduced until 1976, it nearly changed the history of microprocessors. The story begins in 1972, when Intel hired Ralph Ungermann – the same Ralph Ungermann who would later co-found Ungermann-Bass. Intel needed someone to plot its entry into the market for voice communications chips, and Ungermann, who had worked for both Collins Radio and Western Digital,247 wanted a job, almost any job, as long as it was with Intel – he even agreed to compensation less than he had been making. Once aboard, Ungermann soon convinced management to give him responsibility for microprocessor development systems – a role in which he worked closely with Faggin. In 1974, needing money, Ungermann exercised some of his stock options before leaving for a July 4th vacation, only to find on his return that Intel’s stock price had dropped significantly below the price he had paid to exercise the options. With the interest expense on the loan incurred to pay for the stock now higher than his salary, Ungermann had no choice but to leave Intel in search of a better-paying job.
Ungermann’s first impulse was to join the logic simulation software company he had helped start some years earlier – turning it into a success would certainly help his finances. But within days Faggin contacted Ungermann to tell him he had also left Intel. Would he be interested in doing something together? Almost before they could get together, they received a call from Exxon Enterprises, a new division within Exxon, the largest corporation in the world. Having learned of their departure from Intel in the weekly trade magazine Electronic News, Exxon wanted them to start a semiconductor components company to support its intended office systems businesses. Ungermann remembers: “Exxon approached us and said: ‘Don’t you want to be in the semiconductor business?’ We said: ‘No, we didn’t want to,’ and Exxon fundamentally talked us into starting Zilog.”248 Exxon would own half the company and the two founders one-fourth each.
Zilog’s initial strategy was to innovate an 8-bit microprocessor and some of the high-value peripheral chips needed to support it – especially communications chips. Faggin took responsibility for the microprocessor and Ungermann the rest. Masatoshi Shima, the lead designer of the 8080 and a contributor to the 8008 design, contacted Faggin and soon joined them. The choice of microprocessor architecture couldn’t have been more obvious: do a chip compatible with the 8080 – only better. Designing the chip and perfecting a third-party fabrication process took eighteen months. In 1976, Zilog announced the Z-80, an 8-bit microprocessor. The sales pitch: replace your 8080 with a higher-performance chip without having to redo one line of software code. It proved compelling.
But by 1976, the next generation of microprocessors, those with 16-bit architectures, was already in the market. National Semiconductor was first, announcing its PACE chip in 1974. By 1976, TI (9900) and General Instrument (CP1600) had also entered the market.249 Intel, which began developing a 16-bit design in 1975, introduced its chip, the 8086, in late 1978. The 8086 exceeded the performance of the 8080 in every respect – for instance, it could address one megabyte of memory rather than 64K bytes – yet with 29,000 transistors it required a chip die only 27 percent larger than the 8080.250
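The jump from 64K bytes to one megabyte deserves a line of arithmetic. The source does not spell out the mechanism, but the 8086 reached a 20-bit address space from 16-bit registers through segmented addressing: a physical address is a 16-bit segment multiplied by sixteen plus a 16-bit offset. The sketch below (with invented register values) works the numbers.

```python
# Segmented addressing on the 8086: two 16-bit values combine into a
# 20-bit physical address, giving 2**20 = 1,048,576 bytes (1 MB).
# The specific segment/offset values below are invented for illustration.

def physical_address(segment: int, offset: int) -> int:
    """Return segment * 16 + offset, the 8086's physical-address calculation."""
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    return (segment << 4) + offset

print(hex(physical_address(0xF000, 0xFFFF)))  # 0xFFFFF - the last byte of 1 MB
print(2 ** 16)                                # 65,536 bytes: all an 8080 could reach
```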
Meanwhile, the Z-80 began to outsell the 8080. Zilog seemed to have the momentum, and it would come down to who had the best 16-bit chip – Intel or Zilog. But before Zilog could even introduce its product, internal organizational conflicts erupted. Ungermann and Faggin were at odds; it seemed one of them had to go. In December 1978, the Zilog Board of Directors decided in favor of Faggin, with the proviso that a new president be hired. On January 1, 1979, Manny Fernandez, then head of the Discrete Products Group at Fairchild, became President of Zilog. Soon thereafter, Zilog announced its 16-bit microprocessor – the Z-8000. Only the Z-8000 was not compatible with the Z-80, and although it was in many ways very competitive with the 8086, if customers had to re-write their software to upgrade to a 16-bit processor, their choice was simple – Intel over Zilog. Ungermann summarizes succinctly: “It killed the company. That was the fatal mistake – making the Z-8000 incompatible with the Z-80.”251 No one was more surprised than the top management of Zilog, who thought they had approved a Z-8000 design that was compatible with the Z-80. (Technically, Zilog did not die; in fact, years later it would become a publicly traded company. What the Z-8000 did kill was any chance of overtaking Intel in microprocessors – an outcome that would have been improbable anyway.)
In achieving a dominant position with the 8086, Intel had truly institutionalized its microprocessor competence – it was an organizational competence, not just that of select individuals. Intel had lost the two individuals who had led the design of its last two microprocessors – the 8008 and the 8080 – and even though those individuals collaborated to innovate the next-generation competitive chip, Intel still produced the better chip. How organizations transition from being a collection of individuals to being more than the sum of their parts will be a recurring subject of interest in this work.
Intel had won the battle of 16-bit designs. In 1979, it introduced the 8088, a processor with the 16-bit internal architecture of the 8086 but an 8-bit external bus – allowing it to use lower-cost 8-bit support hardware while approaching the performance of a 16-bit processor. In a decision to be made the next year by IBM, the 8088 would become the dominant design, and with it, so would the microprocessors of Intel. But before IBM announced the IBM Personal Computer, or PC, there remained one more vision of computing to be created: personal distributed computing. It is a story that began before microprocessors were ever conceived.
- [227]:
Robert N. Noyce and Marcian E. Hoff, Jr., “A History of Microprocessor Development at Intel.” IEEE Micro, Feb 1981, p. 8
- [228]:
A microprocessor sold as a computer, not embedded in some other product, is a microcomputer.
- [229]:
CBEMA, Industry Marketing Statistics
- [230]:
Robert N. Noyce and Marcian E. Hoff, Jr., “A History of Microprocessor Development at Intel.” IEEE Micro, Feb 1981, p. 8
- [231]:
Gene Bylinsky, “Here comes the second computer revolution,” Fortune, Nov 1975, p. 143
- [232]:
Gene Bylinsky, “Here comes the second computer revolution,” Fortune, Nov 1975, p. 144
- [233]:
Intel then announced a 4K RAM in July 1972. Only this time, Texas Instruments, a leading integrated circuit company and much larger than Intel, announced a better 4K RAM the following month.
- [234]:
Robert N. Noyce and Marcian E. Hoff, Jr., “A History of Microprocessor Development at Intel.” IEEE Micro, Feb 1981, p. 9
- [235]:
Robert N. Noyce and Marcian E. Hoff, Jr., “A History of Microprocessor Development at Intel.” IEEE Micro, Feb 1981, p. 9
- [236]:
Robert N. Noyce and Marcian E. Hoff, Jr., “A History of Microprocessor Development at Intel.” IEEE Micro, Feb 1981, p. 13
- [237]:
Ibid.
- [238]:
Stephen P. Morse, Bruce W. Ravenel, Stanley Mazor and William B. Pohlman, “Intel Microprocessors – 8008 to 8086,” Computer Magazine, Oct 1980, p. 43
- [239]:
Robert N. Noyce and Marcian E. Hoff, Jr., “A History of Microprocessor Development at Intel.” IEEE Micro, Feb 1981, p. 16
- [240]:
Robert N. Noyce and Marcian E. Hoff, Jr., “A History of Microprocessor Development at Intel.” IEEE Micro, Feb 1981, p. 14
- [241]:
Ibid., p. 15
- [242]:
Ibid., p. 17
- [243]:
Gene Bylinsky, “Here comes the second computer revolution,” Fortune, Nov 1975, p. 138
- [244]:
From PMOS to NMOS.
- [245]:
Stephen P. Morse, Bruce W. Ravenel, Stanley Mazor and William B. Pohlman, “Intel Microprocessors – 8008 to 8086,” Computer Magazine, Oct 1980, p. 47
- [246]:
Robert N. Noyce and Marcian E. Hoff, Jr., “A History of Microprocessor Development at Intel.” IEEE Micro, Feb 1981, p. 15
- [247]:
At WD he worked on the UART chip designed by Gordon Bell at DEC.
- [248]:
From interview with author.
- [249]:
Robert N. Noyce and Marcian E. Hoff, Jr., “A History of Microprocessor Development at Intel.” IEEE Micro, Feb 1981, p. 18
- [250]:
Ibid.
- [251]:
Excerpted from an interview with the author.