If world leaders learn only one lesson from the war in Ukraine, it should be that the ability to rapidly innovate—to invent, adopt, and effectively integrate new technologies—can have profound implications for combat outcomes. Outgunned and outnumbered, Ukrainians took a page from the U.S. playbook and turned to technology to gain an advantage over the adversary. They deftly adopted Starlink satellite communications, turned commercial drones into flying bombs, and quickly embraced unfamiliar Western weapons to keep Russia from wiping them off the map.
Throughout the history of war, a decisive factor in conventional conflict has always been the human ability to innovate—to invent new technology and make effective use of it. For the past 70 years, nuclear weapons have stood alone in their unique ability to independently change the course of history. Now, artificial intelligence (AI), quantum computing, and other related technologies increasingly make it possible for machines to innovate much faster and more efficiently than humans ever could. These technologies have the potential to act as a central force in international politics; winners of the tech race will shape the international order, while losers will sit on the sidelines, unable to ensure their survival, let alone their prosperity.
Both the United States and China are racing to develop AI and other emerging technologies to gain a competitive edge in an ongoing series of global contests over power, security, wealth, influence, and status. Some of these technologies could take on nuclear weapons–like qualities in their ability to independently shape a state's economic, political, and military future; most will develop more slowly or have less singular, but nonetheless important, impacts on the security of U.S. allies and partners. A primary responsibility of the U.S. government—and specifically, the U.S. Department of Defense (DoD)—is to identify and develop the technology that is most likely to advance U.S. interests vis-à-vis China and ensure the United States stays ahead in these key areas. To do this, the Pentagon can draw some important lessons by turning back to America's last epic technological competition with a peer adversary: the U.S.-Soviet race to develop nuclear weapons during the Cold War.
The Shape of Things to Come
Some reject framing the U.S.-China technology contest as a new tech Cold War. Of course, it's true the world is not neatly divided into blocs, and that nation-states no longer hold a monopoly on highly lethal weapons. A variety of state and nonstate actors now have access to low-cost, lethal, and commercially available technology, such as drones, that they can use to wage war. But a Cold War framing offers some useful and tangible ways to advance U.S. understanding of the nation's position relative to China and what needs to be done to maintain a U.S. advantage. Viewed through this lens, it's apparent that Washington and Beijing have already adopted different approaches to achieve technological dominance that reflect fundamentally different views of the world and offer them different advantages.
Although the current U.S. administration is looking for areas of cooperation with Beijing, the two countries are on divergent paths. Washington's approach is to limit the flow of technology to China; reshore and "friendshore" some technology supply chains, notably semiconductors; and invest in U.S. tech innovation at home. Meanwhile, Beijing is charting its own independent course to displace the United States as a world leader, in part by reducing its reliance on U.S. technology and spreading its technology-driven authoritarianism across the globe. The Chinese Communist Party is tightening links between industry and the state to sharply focus and control its tech innovation in areas such as semiconductors. It is leveraging its technology to expand the surveillance state into Africa, Europe, and the United States under the guise of economic development. And it has engaged in a comprehensive campaign to steal U.S. technology secrets. The macro effects of this competition are starting to reveal themselves: U.S.-China trade is at a 20-year low, and economic coercion between Washington and Beijing has become common practice.
In this race for a technological edge, the United States enjoys certain foundational advantages over China. The West's vibrant open markets and liberal democracies offer a time-tested, empirically proven means to supercharge innovation and productivity. U.S. allies and partners want U.S. technologies because they understand this advantage and respect it. The Western way of innovation promises to be especially conducive to technologies that require decentralization, experimentation, and iteration, such as AI. The United States may lag behind China in AI publications and patents, but U.S. companies have developed the most advanced and widely used large language models, such as those behind ChatGPT and Claude.
Of course, China, with its centralized governance structure and command economy, could still be the dark horse in this technology contest. In 2019, Beijing famously "5-G'd" Washington when state-backed Huawei and ZTE undercut Western competitors to dominate allied communications networks and standard-setting bodies. China released an AI strategy four years before the U.S. National Security Commission on Artificial Intelligence released its final report. In 2021, China gave the United States another Sputnik-like moment when it tested a new orbital bombardment system, capable of evading U.S. missile defenses to deliver a surprise nuclear strike on the U.S. homeland. As China's economy slows and its population ages, it might be more likely to exploit its newfound technological advantages in destabilizing ways before its window of opportunity closes.
Picking Tech Winners
In this competition for global influence, it's critical for the United States to stay ahead of China on technologies that truly have the power to shape the international order. This will require a systematic, ongoing approach to identifying and procuring those technological capabilities that have the best chance of countering China's aggression and effectively defending the West, given the resource constraints that confront the U.S. military today. Here, too, the Pentagon's approach to analysis during the Cold War offers a useful template.
The DoD is a central player in shaping the trajectory of the U.S.-China technology competition because it has a budget upward of $800 billion and a track record of identifying and producing technologies that can contribute to U.S. war-winning potential and change the way Americans live, such as stealth, precision-guided weapons, satellites, even the internet. These choices were driven by more than 45 years of analysis that began in 1945 and culminated with the 1991 Gulf War, which vindicated the U.S. military's post–World War II strategy predicated on technological superiority.
Cold War analysis of the U.S.-Soviet military balance was ultimately a success because it was conducted as a campaign. The U.S. government showed a remarkable capacity to take in new information, revise hypotheses, and gradually focus on distinct operational problems over a protracted time period. Ultimately, this effort led to development of the U.S. precision strike regime, a network of command and control, satellites, guided weapons, and stealth that was employed to rapidly decimate a large and capable Soviet-style Iraqi military in 1991. But the most important lesson for today's tech Cold War is not the ultimate success of the analysis, but the painful, messy path it took to get there.
Early Cold War assessments were often plainly wrong, either because they lacked sufficient empirical data on Soviet forces, employed the wrong metrics, or became co-opted by political agendas or paranoia. The U.S. public had to live through the hysteria caused by the "bomber gap" and the "missile gap," perpetuated by pessimistic, inaccurate "bean counts" of U.S. versus Soviet forces, before improved strategic reconnaissance capabilities and more-sophisticated analysis could put those fears to rest. Disagreements about the ultimate end strength of Soviet conventional forces and the share of Soviet GDP devoted to the military pervaded the Pentagon right up until the end of the Cold War. Yet in the end, this campaign, with all of its messiness in the middle, was just "right enough" to win America's next major war—even if no one could predict it would take place in the desert of Iraq rather than central Europe.
In today's U.S.-China technology competition, making bets about which technologies will most readily serve U.S. economic, political, and security interests is even more fraught. The Pentagon is currently racing to invest in no fewer than 14 critical technologies to bolster national security. Almost all of them, with perhaps the exception of hypersonic weapons and directed energy, are "dual use," meaning they could have applications for civilians and the private sector as well. Adding to the confusion, these digital-age technologies may be most powerful when they are developed in certain combinations, yet each comprises a handful of subdisciplines, with different ways to measure progress across them. As a result, it's hard to say who is "winning," or why that would matter to the United States.
To maintain an advantage in the U.S.-China tech competition, the United States should embrace the Cold War example: Failure has to be a natural and accepted part of the process of discovery. Scientific methods build knowledge by falsifying alternative hypotheses through experimentation. What are accepted as scientific "facts"—whether it's "China is ahead on AI," or "the Soviets have more bombers than we do"—are just hypotheses that researchers have not yet proven wrong. The power of this approach is that it leaves room for error and correction, and in doing so it gives the United States the space to test new ideas and eventually focus on the right answers.
Scientists and engineers unfettered in developing their ideas have been essential to U.S. military strategy and force development since WWII. Industrialists willing to take big investment risks on untried technologies have been equally important, as has a DoD that has proven capable thus far of discerning better bets among the competing alternatives. As in the Cold War, a systematic campaign of analysis, centered on scientific discovery and engineering innovation, will play a central role in determining the outcome of the U.S.-China competition. This approach, characterized by open debate, risk-taking, and the acceptance of some failure, is strongly embedded in Western culture and can be leveraged as a decisive strength in the U.S.-China tech competition. The values embedded in the campaign of analysis—initiative, risk-taking, and freedom to debate—are as important to U.S., ally, and partner security today as they were in 1948.
The Cold War between the United States and the Soviet Union is not a perfect analogy for the U.S.-China technology contest. But it does provide a lens that highlights the importance of understanding the other side as a competitor and engaging in a systematic scientific process to identify technologies that might allow the United States to gain an advantage. Layers of discovery will build on each other, starting with a better understanding of each country's current approach to technological innovation, and gradually focusing on specific military, economic, and political problems that each side might employ technology to solve. As the contours of these problems become clearer, the United States will increasingly be able to identify and exploit asymmetries between U.S. and Chinese strategies and technologies to gain an advantage in the contest for global influence.
Caitlin Lee is a political scientist and director of the Acquisition and Technology Policy Program at the nonprofit, nonpartisan RAND Corporation.
Commentary gives RAND researchers a platform to convey insights based on their professional expertise and often on their peer-reviewed research and analysis.