The Washington Monthly - To Fix the Supply Chain Mess, Take on Wall Street

 


Research associate Garphil Julien articulates how Wall Street financiers bent on short-term profits are largely responsible for America’s shortage of semiconductors and other key materials.

Last February, President Joe Biden issued an executive order commanding agencies throughout the government to report on ways to fix America’s supply chain mess. He ordered the secretary of the Department of Health and Human Services to tell him what to do about the country’s near-total dependence on China for the key ingredients needed to produce vital pharmaceuticals. He tasked the secretaries of the Energy and Defense Departments with finding ways to reduce our growing dependence on foreign corporations for the materials needed to make everything from electric vehicle batteries to computer-guided munitions.

And he ordered the secretary of the Department of Commerce to come up with solutions to what is perhaps the most urgent supply chain bottleneck of them all: the acute shortage of semiconductors that is driving up prices and limiting the availability of consumer goods ranging from cars to TVs, laptops, phones, and even household appliances like washing machines and toasters.

When the reports came back 100 days later, the agencies listed a variety of factors at work, but all agreed on one root cause. As the White House politely summarized it, the big problem was “misaligned incentives and short-termism in private markets.” Noting that America’s major corporations had spent the past decade distributing nearly all of their net income to shareholders in the form of stock buybacks and dividends, the White House concluded that “a focus on maximizing short-term capital returns had led to the private sector’s underinvestment in long-term resilience.” In other words, the ultimate explanation is Wall Street greed.

Yet strangely, when it came to proposing solutions, the White House had nothing to say about reining in the power of financiers over American business. Instead, the administration called for more government spending on science and technology, plus a wide range of new direct and indirect corporate subsidies for computer chip makers. Since then, with White House encouragement, a bipartisan coalition has passed a bill in the Senate, the U.S. Innovation and Competition Act, that takes the same approach. It offers more than $50 billion in subsidies to domestic semiconductor manufacturers, for example, but includes no measures to ensure that the companies don’t continue to offshore production or use the funds to increase CEO pay or buy back their own stock. On November 15, Senate Majority Leader Chuck Schumer announced plans to push the bill through the House by attaching it to a must-pass defense policy bill.

There’s nothing wrong per se with government using corporate subsidies to achieve public purposes. But this legislation doesn’t address the core problem the administration itself identified. Over the past 40 years, financial deregulation, lax antitrust enforcement, and poorly conceived trade policies have shifted power away from people who know how to invent, manufacture, and deliver products and toward people who know how to make money through financial manipulation. Until we take on that problem, our country’s ability to ensure uninterrupted access to the microprocessors and other vital components and raw materials on which our security and prosperity depend will only grow more vulnerable to disruption and even collapse.

To see how this dynamic works, consider the fate of a company that not so long ago was a symbol of America’s absolute technological dominance of the digital age, from that age’s inception in the early 1950s well into the first decade of the 21st century. The decline of Intel nicely illustrates what happens when a tech corporation pays more attention to raising its stock price than to coming up with better products.

Intel pioneered many of the first chip technologies. They included the 8088 microprocessor, released in 1979, which long served as the foundation of personal computers, and the first commercially available dynamic random access memory (DRAM) chip, released in 1970, which became the basis of main memory in computers.

The personal computer revolution of the 1980s and ’90s created a boom in demand for Intel’s products, and for years the corporation prospered. But as time went by, Intel increasingly focused its resources on protecting its monopoly profits in the PC microprocessor market and on buying back its own stock to boost the price.

To protect its monopoly, Intel used illegal tactics such as loss leading and subsidizing the advertising costs of PC makers that used Intel chips. Through such maneuvers, Intel so weakened its one remaining U.S. rival, Advanced Micro Devices, that AMD was forced to spin off its manufacturing facilities into the semiconductor firm GlobalFoundries, which was controlled by a state-owned investment firm in the United Arab Emirates. With AMD knocked down, Intel could deliver even more of the short-term profits Wall Street demanded.

Continue reading on The Washington Monthly.