Engineers at American innovation hubs such as Bell Labs, IBM, Texas Instruments, and Fairchild Semiconductor led the development of the transistor and the integrated circuit in the middle of the last century. Robert Noyce, credited with first deploying such a circuit on a silicon chip, went on to co-found Intel with Gordon Moore, whose famous “Moore’s Law” predicted the doubling of transistors on a chip every two years. The United States dominated the market for semiconductors, particularly at the high end, with Intel controlling 90 percent of the market for PC microprocessors in 1997. “Intel sets release dates for new chips,” wrote Time magazine that year, “dictating the pace of the computer industry with the confident aplomb of fashion designers raising or lowering hemlines.”
While semiconductors have become ever more important to modern civilization, providing the brains for not only computers but also phones, cars, and missiles, Intel no longer sets the standard. That distinction belongs to the Taiwan Semiconductor Manufacturing Company (TSMC), which manufactures chips at the cutting-edge five-nanometer (5nm) process node. Intel remains mired two generations behind, at 10nm, and won’t achieve 7nm until 2023 at the earliest — by which time TSMC will likely be at 3nm. Globally, the U.S. share of semiconductor manufacturing has fallen by two-thirds since 1990, to 13 percent, and is still falling.
The decline of America’s semiconductor industry is in many ways the story of the nation’s major economic missteps in recent decades, as economists misunderstood the dynamics of globalization, policy-makers ignored the importance of technological leadership, and business leaders chose shareholder payouts over investment. Michael Boskin, chairman of George H. W. Bush’s Council of Economic Advisers, famously quipped, “Potato chips, computer chips, what’s the difference?” Quite a lot, it turns out.