The indispensable and effective role of public policy in building the digital age
Market fundamentalism begrudgingly grants a role for government spending in “basic research” but insists that the free market will carry scientific breakthroughs to scale and commercial success. That is not how innovation works, nor how it has worked over the past 100 years in the United States. For sophisticated, capital-intensive production processes, breakthroughs achieved primarily through publicly funded research rely on public coordination for commercial development, public funding for early-stage capital, and public contracts to stimulate demand.
Silicon Valley provides the quintessential case in point. Indeed, the region is called “Silicon Valley” only because the U.S. Air Force preferred silicon-based wafers in the early integrated circuits for which it was the primary customer. The online services and “apps” that pass for innovation in the Valley today are the happy byproduct of intensive investment in a far more technologically complex ecosystem, developed over 50 years through government funding and decision-making, not venture capital and “permissionless innovation.”
As American policymakers have withdrawn investment from everything beyond basic research, American innovation has stalled while technological leadership has migrated to other nations that learned the lesson America forgot. Every key underlying technology in the iPhone was developed with support from the American government, but the expertise, production, and continued innovation are all based overseas.
The federal government “picked winners” for most major technologies of the digital age. In directing government and military projects, public officials decided key aspects of new technologies—from the materials for chips to the protocols for networked computers.
At every stage of technological development, public investment and guidance play an indispensable role. The federal government is the only organization positioned to fund and coordinate academic and private-sector researchers and to facilitate the initial application and commercialization of their discoveries. Such work requires coordination that free-market price signals do not provide.
- Government must identify promising technological platforms to prioritize for development. Breakthrough technologies tend to be developed in response to the demands of ambitious public projects or pressing military applications, not consumer preferences.
- Government must fund and coordinate the development of new technologies. Beyond merely funding R&D, government officials often mobilize the nation’s researchers and facilitate research networks that enable collaboration and improved efficiency.
- Government must create demand for new technologies and support scaling. Through contracting and procurement, the federal government acts as the “collaborative first customer” for new technologies with commercial potential but no immediate consumer market or scalable production.
By contrast, American reliance in recent decades on “permissionless innovation” in the “self-regulating market” has failed. The “design here, make there” business model adopted by America’s leading computer companies followed the lead of American policymakers, who continued to fund technological breakthroughs but did not support the creation of domestic supply chains.
At the end of the Second World War, California’s Santa Clara County was known for its orchards, not its start-ups. The region that would become synonymous with American innovation had only a nascent technology sector, centered around Stanford University and Professor Fred Terman, who spearheaded a commercially oriented program in radio electronics. [1: Michael Lind, Land of Promise: An Economic History of the United States (2012), 416.] The venture capital industry did not exist, and the region’s few entrepreneurs were former academic researchers like David Packard and William Hewlett, both students of Terman. [2: Stuart W. Leslie, “How the West Was Won: The Military and the Making of Silicon Valley,” Technological Competitiveness: Contemporary and Historical Perspectives on Electrical, Electronics, and Computer Industries (1993), 76.]
The lessons of the war and the demands of new great-power competition initiated a transformation. In a renowned report to President Truman, Vannevar Bush, head of the wartime Office of Scientific Research and Development, outlined an ambitious national science agenda focused on funding and coordinating the development of computer technologies. [3: Lind, Land of Promise, 397.] By 1960, the Valley had become the “microwave capital” of the world, centered around the aerospace industry and its development of satellites for communications and reconnaissance. Lockheed was the region’s biggest employer—and would be through the end of the Cold War. [4: Leslie, “How the West Was Won,” 84.] Even in 1992, Santa Clara County received more defense contract funding per capita than anywhere else in the country. [5: Robert D. Atkinson, “How the Government Built Silicon Valley,” The Globalist (March 15, 2012).]
The federal government pushed forward the technological frontier of computing. Public funding sponsored roughly 25% of Bell Laboratories’ transistor research and more than half of R&D at IBM through the 1950s; by the late 1950s, three-quarters of the nation’s computers were committed to public purposes. [6: National Academy of Sciences, Funding a Revolution: Government Support for Computing Research (1999), 86–7.] Following the USSR’s Sputnik launch, President Eisenhower created the Advanced Research Projects Agency (ARPA) in 1958. [7: Defense Advanced Research Projects Agency, “The Sputnik Surprise” (accessed November 17, 2022).] The agency, which would become the engine of Space Race and Cold War innovation, was designed to spur “blue-sky thinking” beyond the traditional federal R&D and military procurement system. Rather than build its own laboratories, the agency adopted a decentralized structure that gave its officers tremendous autonomy to coordinate, advise, and mobilize research efforts across firms, universities, and government labs. [8: Fred Block, “Innovation and the Invisible Hand of Government,” State of Innovation: The U.S. Government’s Role in Technology Development (2011), 8–9.] In the early 1960s, ARPA alone funded 70% of all computer research in the U.S. [9: Nigel Cameron, “The government agency that made Silicon Valley,” UnHerd (June 18, 2018).]
The federal government also used its procurement policies to act as a “collaborative first customer” in new fields where no commercial market existed and the capital requirements for scaling up manufacturing processes were prohibitive. [10: William H. Janeway, “The Forgotten Origins of Silicon Valley,” Project Syndicate (April 15, 2022).] The major customers for transistors and, later, integrated circuits were NASA, the Pentagon, and defense contractors. [11: Phil Goldstein, “How the Government Helped Spur the Microchip Industry,” FedTech Magazine (September 11, 2018).] The particular demands of military and defense applications shaped the trajectory of technological development and key features of early computing technologies. [12: Leslie, “How the West Was Won.”]
As the nascent venture capital industry began to emerge, it was augmented by government funding that matched private investments. Beginning in 1958, the Small Business Administration operated a venture capital fund-matching program, investing two dollars for every venture capital dollar. [13: Philip McCallum, “The Small Business Investment Act of 1958—Its First Year of Operation,” Virginia Law Review (1959).] Venture capital would remain an important—albeit small—industry boosted by federal subsidies until the late 1970s, when regulatory reforms opened up a massive pool of institutional capital for venture capitalists. [14: Robyn Klingler-Vidra, “Building the Venture Capital State,” American Affairs (2018).] By this point, the government had already funded the R&D for core technology platforms, sponsored advanced manufacturing, and facilitated the commercialization that made tech start-ups attractive to private capital. [15: Janeway, “Forgotten Origins.”]
The government’s role as a purchaser, builder, and funder of technology created Silicon Valley. Santa Clara County earned the name Silicon Valley, not Germanium Valley, because of the Air Force’s specific preference for new silicon transistors, rather than standard germanium ones. [16: Christophe Lécuyer, Making Silicon Valley: Innovation and the Growth of High Tech, 1930–1970 (2005).]
As the “collaborative first customer,” the federal government developed the American semiconductor industry.
The semiconductor innovation and production ecosystem, including the transistor, integrated circuit, dynamic random-access memory (DRAM) chip, and microprocessor, was first developed and built in the United States. [17: Willy Shih & Gary Pisano, Producing Prosperity: Why America Needs a Manufacturing Renaissance (2012), 80–1.] As new chips emerged from the lab, the federal government created demand for their production before a consumer market existed. Fairchild Semiconductor’s first contract was with defense contractor IBM, for high-voltage silicon transistors used in the B-70 bomber’s on-board computers. [18: Computer History Museum, “1958: Silicon Mesa Transistors Enter Commercial Production” (accessed November 17, 2022).] Later, two procurement decisions by NASA and the Air Force for missile and space-guidance projects pushed chips into large-scale production. [19: Anna Slomovic, “Anteing Up: The Government’s Role in the Microelectronics Industry,” The RAND Corporation (1988).] NASA alone constituted 60% of the integrated circuit market in the 1960s. [20: Goldstein, “Microchip Industry.”] In the 1970s, as fabrication became a bottleneck for chip development, ARPA created a university-affiliated lab that would fabricate any microchip with a superior design. [21: Mariana Mazzucato, The Entrepreneurial State: Debunking Public vs. Private Sector Myths (2015), 84.] ARPA likewise identified the need for new integrated circuit designs and launched its Very Large Scale Integrated (VLSI) Circuits program to coordinate research. [22: NAS, Funding a Revolution, 116–7.]
The federal government funded the core technologies of the personal computer.
The vision of the personal computer, including monitors, keyboards, and “electronic pointer controllers called ‘mice,’” was first described by government officials at ARPA in 1968. [23: G.R. Fong, “ARPA Does Windows: The Defense Underpinning of the PC Revolution,” Business and Politics (2001).] The agency would go on to fund research at Stanford University and MIT that developed the first mouse-and-windows graphical user interface and single-user computer workstations. [24: NAS, Funding a Revolution, 109.]
The federal government laid the technological foundations of the modern Internet.
In 1967, ARPA proposed a packet-switching network for its university-based laboratories to improve coordination and communications among researchers. [25: Lawrence Roberts, “The Evolution of Packet Switching,” Proceedings of the IEEE (1978).] The computer network known as ARPANET functioned as a proto-Internet. [26: Janet Abbate, Inventing the Internet (1999).] A vehicle for experimentation rather than a fully developed service, the network produced the first communication protocols and spawned network applications like file transfer and e-mail. [27: NAS, Funding a Revolution, 173–4.] ARPANET both pioneered and publicized such technological breakthroughs, enabling later developments by NSFNET and the government-backed Project MAC at MIT, which developed time-sharing networks and the first Internet protocols. [28: Ibid., 104.] The TCP/IP protocols, on which the Internet continues to run today, were also developed by DARPA scientists. [29: Abbate, Inventing the Internet.]
Without a focus on domestic production, federal policy enabled offshoring of critical technologies.
The breakthrough LCD technology underlying flat-panel displays was first developed by military-funded researchers at Westinghouse in the 1970s and later advanced through ARPA-backed R&D in the 1980s. [30: Mazzucato, Entrepreneurial State, 113.] But without adequate government support, major American computer companies had no interest in building a domestic supplier base for flat-panel displays, preferring cheap imports from East Asia, where governments eagerly and aggressively supported the industry’s development. [31: Richard Florida & David Browdy, “The Invention That Got Away,” Technology Review (1991).] To this day, all of the world’s flat-panel display factories are located in Korea, Taiwan, Japan, and China; none are in the U.S. [32: Shih & Pisano, Producing Prosperity, 12.]