Day Eight: Legacy

In 1969 the terms “integrated circuit” or “computer chip” would have had no significance to most people. The Apollo program would change that…with lasting consequences for the world.

On July 24, 1969 — 50 years ago — Neil Armstrong, Buzz Aldrin, and Michael Collins re-entered the Earth’s atmosphere at a fiery 25,000 mph. The command module’s three chutes successfully deployed (there were only three people qualified to pack them, and NASA would not allow them to be in the same car together), and Apollo 11 floated down to the Pacific Ocean to mark the end of its history-making journey.

“Moon-Doggle”

On the Sunday before the launch of Apollo 17, Amitai Etzioni, the Columbia University sociologist who had written the 1964 book “The Moon-Doggle,” penned a New York Times essay that opened with, “The most hopeful epitaph for Project Apollo might be: This was the last gasp of a technologically addicted, public-relations minded society, the last escapade engineered by an industrial-military coalition seeking conquests in outer space, while avoiding swelling needs of Earth.”

Technology Becomes a “Thing”

Etzioni’s assessment was harsh, but it wasn’t unique. Even before the success of Apollo 8, there were criticisms of the U.S. space program built on a similar theme: it was expensive, it was undertaken solely for U.S. prestige and military dominance, and the money could be better spent on a variety of domestic and foreign aid programs. Many others, however, championed America’s space program beyond its natural supporters within the NASA community. Jacob Bronowski, a British scientist and mathematician, came to a very different conclusion from Etzioni about the significance of the Apollo program: “I am not at all impressed with people who tell me it is useless. It is only useless if we do not know how to use the experience.”

One phrase in Etzioni’s criticism — “technologically addicted” — actually argues for Bronowski’s assessment. The fact that the word “technology” could even stand on its own was a direct result of NASA and the Apollo program, which ushered in the Digital Age. By the end of the Apollo program, “technology” and its power to reshape our culture had become a ubiquitous part of our lives.

In 1965, Time magazine devoted a cover to its story, “The Computer in Society.” The feature put the number of computers in America at the time of publication at 22,500 (an average of 450 per state), of which 1,767 belonged to the federal government. To put that figure in perspective, during Christmas 2017, Apple sold more than 35,800 iPhones an hour. Think about it: it took Apple 38 minutes to sell as many handheld computers as the U.S. had in total 52 years earlier. You can thank (or fault) the Apollo program for that.
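As a sanity check, the comparison above follows directly from the quoted figures (22,500 computers in 1965; roughly 35,800 iPhones sold per hour in 2017):

```python
# Figures quoted above.
us_computers_1965 = 22_500
iphones_per_hour_2017 = 35_800

# Average computers per state in 1965 (50 states).
per_state = us_computers_1965 / 50  # 450.0

# Minutes of 2017 Apple sales needed to match the entire 1965 installed base.
minutes_to_match = us_computers_1965 / (iphones_per_hour_2017 / 60)

print(per_state)                # 450.0
print(round(minutes_to_match))  # 38
```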

The Advent of the IC

The success of the Apollo program — and our ability to put a man on the Moon and bring him safely back before the end of the 1960s — depended on computer chips. NASA needed integrated circuits to create the one-cubic-foot Apollo Guidance Computer (AGC), Apollo 11’s “fourth crew member.” The challenge facing NASA, and MIT in particular, in “integrating” the integrated circuit boiled down to cost and reliability.

MIT paid $1,000 each for the first 64 chips it purchased from Texas Instruments for testing purposes. By November 1962, MIT was paying less than $100 each, and by the middle of 1963 the cost was $15 — a price reduction of 98.5% in three years. By 1965, the per-chip price would fall roughly another 50% to $7.28 — a 99.3% price reduction over a five-year period. In 1969, the year of the Apollo 11 Moon landing, semiconductor chips not only dropped another 78% in price (to $1.58 each), but were many times more powerful and orders of magnitude more reliable. The American semiconductor industry was off and running.
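The percentages above follow from simple arithmetic on the quoted per-chip prices:

```python
# Per-chip prices quoted above (USD).
prices = {"initial": 1000.00, "1963": 15.00, "1965": 7.28, "1969": 1.58}

def pct_drop(old, new):
    """Percentage decrease from old to new."""
    return (old - new) / old * 100

print(round(pct_drop(prices["initial"], prices["1963"]), 1))  # 98.5
print(round(pct_drop(prices["initial"], prices["1965"]), 1))  # 99.3
print(round(pct_drop(prices["1965"], prices["1969"]), 1))     # 78.3
```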

It took NASA to jump-start the chip industry. At the time MIT purchased its chips from Texas Instruments, there was serious debate among potential military and civilian buyers as to whether integrated circuits could be made reliable enough to gain wide acceptance. As a consequence, no major commitments had yet been made by customers in the private sector. In fact, in 1962 the federal government was responsible for 100% of the integrated circuits purchased in the entire world. As NASA’s orders drove production volume, commercial customers got in line, and just three years later the U.S. government accounted for only 72% of sales, even as total IC production volume went up by a factor of 20. In short, the computer chips that went to the Moon created the market for the computer chips that did everything else — which gave America a serious leg up in the electronics industry.
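A rough implication of those two numbers — government share falling from 100% to 72% while total volume grew 20-fold — is worth spelling out (this normalization is illustrative, not from the original sources):

```python
# Normalize 1962 world IC production to 1 unit.
total_1962 = 1.0
total_1965 = 20 * total_1962  # volume grew 20-fold in three years

gov_1965 = 0.72 * total_1965         # government's 72% share
commercial_1965 = 0.28 * total_1965  # everyone else

# Implied: the commercial market alone grew from essentially zero to
# several times the size of the ENTIRE 1962 world market.
print(round(commercial_1965, 1))  # 5.6
```

In other words, even as the government's *share* shrank, its absolute purchases grew more than 14-fold, and a commercial market 5.6 times the whole 1962 market appeared alongside it.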

Moore’s Law and Its Consequences

In 1965, the vast majority of Americans could not have imagined what the implications of the integrated circuit would be. One person who did was Gordon Moore, then head of R&D at Fairchild Semiconductor and later a co-founder of Intel. That year, Moore published a paper that would prove prescient, and that at the time offered the best explanation for what was happening in the chip industry. The gist of his paper would become known as Moore’s Law: the number of components on an integrated circuit would double roughly every year for at least the next decade, a pace Moore later revised to every two years. More important, the ubiquity of integrated circuits would increase as cost was driven down by economies of scale and improved reliability. Moore’s seminal paper stated, “The future of integrated electronics is the future of electronics itself…Integrated circuits will lead to such wonders as home computers — or at least terminals connected to a central computer — automatic control for automobiles, and personal portable communications equipment.”
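Moore’s Law is just compound doubling, which is why its consequences are so dramatic. A minimal sketch (the starting count and doubling period here are illustrative parameters, not figures from Moore’s paper):

```python
def components(start, years, doubling_period=2):
    """Component count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# Doubling every two years for one decade is a 32-fold increase...
print(components(1, 10))  # 32.0

# ...and sustained for two decades, a 1024-fold increase.
print(components(1, 20))  # 1024.0
```

The exponent, not the base, does the work: each additional decade multiplies the count by another factor of 32.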

NASA’s needs — the needs of the Apollo program — drove the American semiconductor industry to create the perfect chip, on which our modern digital economy is based. American semiconductor companies were the first to achieve that level of virtuosity, and the transformation they ushered in made it possible for them to dominate their industry for 50 years. In the process, the upstream companies they relied upon (materials and equipment manufacturers) and the downstream industries that benefited from their achievements made it more than likely that at this very moment you have integrated circuits on your person…or at least within easy reach. You can thank NASA for that.