1 TECHNOLOGY AND NATIONAL POWER

In 1927 Mao Zedong, China’s first communist dictator, said, “Political power grows out of the barrel of a gun.”1 He knew what he was talking about. Twenty-two years later his Chinese Communist Party (CCP) won China’s civil war and Mao became the supreme leader of the world’s most populous country. His current autocratic successor, Xi Jinping, would probably update Mao’s aphorism to “civilian and military power is generated by the semiconductor chip,” in response to the embargo certain democracies have placed on the export of semiconductor chips (SCs) to China. A few years ago, Vladimir Putin (Russia’s current autocrat) said that whoever masters artificial intelligence (AI) will rule the world.2 Mao and Putin were correct, and Xi would be right to emphasize microelectronics, because today and for the duration of Cold War 2.0, technology innovation is and will remain the primary source of national economic, military, and soft power.

To fully grasp what Mao, Xi, and Putin were getting at, and thereby to come to grips with the civilian and military global geopolitical power dynamics at play in Cold War 2.0, it is imperative to understand the mechanics of innovation, industrial capacity, hyper-innovation, digital innovation, and the important role played by “competitive displacement.” Some consideration, though, must also be given to the cost of, and value derived from, military innovation, lest an ever-growing defense budget end up devouring the whole of government, like a huge black hole. Finally, Japan’s experience with economic and military power over the past 150 years, derived largely from technology innovation, is instructive. Could it usefully serve as a model for China?

CIVILIAN TECHNOLOGY INNOVATION; COMPETITIVE DISPLACEMENT

The Sony Discman, later renamed the CD Walkman, debuted forty years ago. It was revolutionary. It could play a compact disc while the user was walking outside, with the Walkman slung over the shoulder. It skipped a bit (which was annoying) if the user walked too fast, or jogged, but that was a small price to pay for music mobility. Then in 2001 (twenty-two years after the audiocassette-playing Walkman first appeared), Apple released the iPod. It was one of those wonderful moments in modern tech history when an innovation scratches an itch perfectly. The iPod was small. It was fully digital. And it conferred a magical power on the user—they could make up, and mix up, their own music playlists. Miraculously, it didn’t skip, even when it was taken jogging or skiing. About 450 million people fell hopelessly in love with the iPod, and with its dazzling migration of recorded music from a mechanical format to a digital one.

Not surprisingly, soon after its initial commercial release the iPod competitively displaced the CD Walkman. This happens in economies with open markets, especially in democracies, when a new and better invention completely replaces an incumbent device, machine, process, or technology. Apple, the creator of the iPod, has a special knack for competitively displacing other products. Canadians were quite miffed when the Apple iPhone, around 2011, displaced the made-in-Canada BlackBerry mobile phone. Eventually, though, even Canadians got over it—superior technology has a way of generating infidelity toward one’s national tech icons. The iPhone even made the iPod redundant, given that the iPhone came with a music app. Love of the iPod got transferred to the iPhone.

Competitive displacement operating in an open market produces incredible wealth. Apple is the most valuable tech company on earth, with a market capitalization hovering in the range of $2.5–$3 trillion,3 and a brand that oozes leadership, cachet, and tech smarts. Not surprisingly, the open market has produced competitors to the iPhone (more evidence of attempted competitive displacement), especially South Korea’s Samsung Galaxy. These two smartphones, and some less expensive ones from Chinese suppliers Xiaomi, Oppo, and Vivo, have also triggered a tsunami of wealth creation among their billions of users. Every tile on the homepage of a smartphone represents another business that has been competitively displaced (who buys paper maps anymore?), but each such e-business has also created thousands of new jobs, and hundreds of further opportunities (without the smartphone there would be no Uber, no DiDi in Asia, and so on). Smartphones have made Apple and Samsung shareholders a lot of money, but they have brought enormous opportunity and wealth to their users as well.

Consider the self-employed sesame seed farmer in Sudan (the world’s leading country for sesame seed production). Before the smartphone, when she harvested her sesame seeds, she would take her crop into town on her bicycle and sell it on terms dictated by the monopoly wholesale buyer. It’s not hard to figure out who got the better end of that trade. Today, the farmer uses her smartphone to check the prices paid by wholesalers for sesame seeds in three other towns. The smartphone has given her alternatives and opportunities. Competition, produced by her smartphone, is working for her. The extra income generated by her smartphone has allowed her to upgrade her pedal bike—her only form of transport—to an e-bike. She’s planning on planting more sesame plants and hiring her first employee.

The core building block of the smartphone is its high-end semiconductor chips. These SCs choreograph the magic dance of the applications that delight the user of the smartphone. The SC does likewise in literally every electronic device today, including the automobile, the espresso maker, the microwave oven, and the dishwasher. It is not an exaggeration to say that national economic power increasingly derives from the circuitry of the SC. Moreover, the SC is also the key to unlocking the full potential of artificial intelligence, including the Siri assistant on the iPhone. Suffice it to say at this point that a country’s modern national economy depends on digital technology built on top of SCs. States that are unwilling or unable to embrace the requirements of an innovating high-technology culture will be left in the economic dust of the analog world.

It is important to understand the broader impact that Apple and other tech companies like it have on their economic environment. Innovation-obsessed companies like Apple, in addition to achieving high levels of consumer satisfaction, generate (through productivity-enhancing competitive displacement) an enormous amount of new economic activity. In democracies where there are open markets and robust competitive displacement, there is material economic growth, and the distribution of that growth is not a zero-sum game—there can be more wealth for virtually everyone simultaneously (so long as competitive displacement is allowed to run its course). To use the vernacular, not only is the pie constantly getting bigger, but everyone’s particular slice of the pie is growing as well. Certainly some participants in the economy earn more income, and accumulate more wealth, than others, but generally there is more inequality in autocracies, especially China, than in democracies (especially countries like Canada, Germany, and Sweden, but even the United States), in part because most democracies offer better access to education for their lower classes than autocracies do.4

For purposes of understanding competitive displacement and its critical role in Cold War 2.0, the key lesson is that the open market operating in a democracy drives tech-induced, compounding economic power. In 1960, the size of the United States economy was $0.5 trillion; in 2022 it was $25.46 trillion,5 reflecting growth of 4,560 percent, even though population growth over that period was only 86 percent. The same happened in other countries around the world, but especially in those where technology innovation was unleashed by an open market. The size of the global economy in 1970 was $3.4 trillion; in 2022 it was $100.22 trillion.6 Not all the net new growth is attributable to innovation, but an awful lot of it is. This core economic reality is central to so many aspects of Cold War 2.0, including the simple proposition that a nation’s military muscle is largely funded by its civilian economic strength. It is because the United States has an economy of $25 trillion that it can afford a military budget of $880 billion, which dwarfs that of every other country. And the US has an economy of $25 trillion because it has an open market in which technology innovation, compounded by competitive displacement, drives inexorable economic growth.
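The compounding at work in these figures can be made concrete with a few lines of arithmetic. The sketch below (a purely illustrative Python calculation) takes the rounded GDP figures cited above and computes the total growth and the implied compound annual growth rate; because the 1960 figure is rounded to $0.5 trillion, the computed total differs somewhat from the cited 4,560 percent.

```python
# Rounded figures from the text: US GDP of ~$0.5 trillion in 1960
# and $25.46 trillion in 2022 (nominal dollars).
gdp_1960 = 0.5e12
gdp_2022 = 25.46e12
years = 2022 - 1960  # 62 years

# Total growth over the period, as a percentage.
total_growth_pct = (gdp_2022 / gdp_1960 - 1) * 100

# Implied compound annual growth rate (CAGR).
cagr = (gdp_2022 / gdp_1960) ** (1 / years) - 1

print(f"Total growth: {total_growth_pct:,.0f}%")  # ~4,992% with the rounded 1960 figure
print(f"Implied CAGR: {cagr:.1%} per year")       # ~6.5% nominal per year
```

The striking thing is how modest the annual rate looks: roughly 6.5 percent nominal growth per year, compounded over six decades, multiplies an economy about fiftyfold. That arithmetic is the engine behind the claim that open-market innovation drives inexorable growth.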

MILITARY TECHNOLOGY INNOVATION

Constant technological innovation of the kind that drives competitive displacement in the civilian sphere also propels forward steady improvement in military power. In the opening scene of Stanley Kubrick’s iconic 1968 film 2001: A Space Odyssey, some chimps learn the benefits of technology innovation, using the femur bone from a dead water buffalo to beat the life out of a chimp from another group vying for domination of a watering hole.7 Kubrick’s point? Game-changing technology makes the innovator invincible in arenas where violent confrontations get played out. Put in a more modern vernacular—don’t bring a knife to a gunfight.

There is no direct evidence that chimps were smart enough to pick up and use as clubs the femur bones of water buffalo. Human burial sites, though, are full of mass graves in locations where ancient people clearly fought battles with spears, swords, and knives. Their bones have the scars and cuts to prove the violent cause of death, and the weapons used to inflict the wounds are themselves buried in the same mass graves. Periodically, a group of humans would invent, and then perfect through improvement, a fundamentally new technology that produced stunning results on the battlefield (the new device was, as it were, the military iPod of its day). The new weapon system allowed its inventor to competitively displace others in the contest for leadership of a region, or even a continent if the new innovation was earth-shattering enough.

In the early part of the 13th century, the Mongols conquered China (and eventually much of Russia, most of Eastern Europe, and everything in between) by using a novel weapon system: an archer perched on a smallish horse.8 The Mongol horse was fast, nimble, and perfect for allowing the archer to shoot with his bow while at full gallop (utilizing yet another new innovation, the stirrup). The victims of this form of warfare tried to respond in kind, but their own lumbering equines raised in Europe simply couldn’t match the speed and dexterity of the animals the Mongols had carefully bred for generations on the steppes of Asia. This is an early example of bioengineering driving competitive displacement and paying huge dividends on the battlefield.

In a similar vein, in 1415 the English were able to defeat the French at the consequential Battle of Agincourt largely because the English had invented the “longbow,” and its use at this battle devastated the French army. The “regular” bow and arrow, invented some 60,000 years earlier, was itself a huge improvement over the spear and sword, because kinetic energy could be temporarily stored and channeled through the bow, and directed into the flight of the arrow, delivering it much farther and with greater force than a thrown spear. With a bow and arrow a serious amount of war fighting and enemy killing could be effected at some distance. The longbow then increased that distance yet again, such that the English archers at Agincourt were killing French archers well before the French could even get within range and do any harm to the English.9 The English longbow was the HIMARS rocket launcher of its day: it competitively displaced the French bow and arrow, allowed the English to win the battle, and ultimately enabled them to take over Normandy.

MILITARY-INDUSTRIAL CAPACITY

With the advent of the Industrial Revolution, civilian technological progress and military power became tightly fused. What also became important was industrial capacity. It is one thing to invent and improve a technologically sophisticated device, machine, or process; it is another thing altogether to be able to produce it quickly on a large scale. To answer this challenge, even before the industrial age, Venice in the 1100s built its “Arsenale,” a sprawling 110-acre facility that was the largest fabrication plant in the world before the Industrial Revolution. Here, Europe’s then leading maritime power produced its standard-setting war-fighting ships. After expanding the facility in 1320, the Arsenale also built large merchant vessels (at the astounding rate of one per day, courtesy of 16,000 workers and hyper-standardized production processes). To this day, many shipbuilders construct civilian and naval vessels side by side in their enormous dry docks. In addition, as the Industrial Revolution hit its peak in the mid- to late 1800s, there was also a material increase in “dual-use” technologies, such as the railroad, where the same technology made its mark simultaneously in the civilian and military domains.

A good example of the importance of industrial capacity in warfare came in the 1850s, when Britain and France sent forces to the Black Sea to dislodge Russia from the Crimean Peninsula. British and French troops laid siege to Sevastopol for eleven months to oust the Russian Black Sea fleet from this important port and military base. The Russians ultimately lost this war because they lacked sufficient railroad connectivity to the region with which to resupply their beleaguered troops (the British and French also had superior warships and better rifles). Overall, the Crimean War of 1853 to 1856 was a disaster for the Russians because their military and civilian technologies were inferior to those of the French and British. As noted in the introduction, history never repeats itself exactly, but it does rhyme from time to time, in this case in Crimea. A similar lesson was learned by the Confederate states a decade later, during the American Civil War of 1861 to 1865. While initially the South fought effectively against the armies of the North, eventually the more advanced civilian and military technologies invented, manufactured on a large scale, and effectively deployed by the North through superior logistics networks allowed President Abraham Lincoln to achieve his war aim of keeping the United States intact.

The consequences of being able to translate civilian technological progress (including effective industrial capacity) into military supremacy can be seen starkly in the “long 19th century.” In the late 1790s, China, India, and Europe each had about one-third of the world’s industrial base, in terms of factories, foundries, looms, and the like. In 1900 the split was Japan 2 percent, Russia 8 percent, and Western Europe 70 percent, with the United States the leading individual country at 20 percent. This enormous shift in economic heft then translated into geopolitical military power in the following manner: by 1913 about 80 percent of the landmass of the Earth belonged to the European powers, with the British alone holding 23 percent (and even the US getting into the act of empire when it took over the Philippines and Cuba in 1898). India and China, which both failed to adopt the latest technology from the democracies, ended up being utterly humiliated in the second half of the 1800s. This is a status and outcome they are both working hard to avoid in Cold War 2.0.

The importance of industrial military capacity, now itself much dependent upon high-tech processes, can be seen in the current Russo-Ukrainian War. Again, it is not enough to invent, or enhance, a modern weapon system or its various components; units of it also have to be produced on a large scale. In the war in Ukraine, at the peak of the Battle of Bakhmut, the Ukrainian army was firing about 6,000 artillery shells a day (not HIMARS-type guided rockets, but “regular” unguided howitzer shells). These shells are sourced from several NATO member countries. The United States, though, Ukraine’s largest supplier of artillery, currently has only one factory capable of producing these shells. It makes about 460 a day, nowhere near enough to sustain the rate of fire Ukraine would like to maintain. Other NATO countries are making up the shortfall, but by early 2023 it was clear that net production of new shells would have to be ramped up if Ukraine was to continue its fight with effective artillery support. Cold War 2.0 planners can never forget that manufacturing physical items still matters a great deal, even if the most intriguing trends in both civilian and military innovation are on the digital front.

There is another very important dimension to military-industrial capacity. It is not enough to have built up stockpiles of military kit in peacetime. The key to a robust industrial capacity is the ability to replace systems, like jet fighters, tanks, missiles, or now drones, that are lost in combat. In peacetime, ministers of defense, and often even generals, forget this crucial dimension of military-industrial capacity. It is not only a question of how many tanks a military starts with, but of how quickly losses—both the tanks and their fully trained crews—can be replaced. This is another lesson (re-)learned from the Russo-Ukrainian War. Hopefully war planners in Taiwan are absorbing this wisdom from what they are seeing in Europe’s largest war since World War II.

The previous point in turn raises a critical consideration about training, especially of new human crews. As weapons like the HIMARS are depleted on the battlefield of a major war, not only must new units of the weapon system be built in record time, but individuals need to be trained very quickly so the new HIMARS units (or tanks, or F-16s, let alone F-35s) can be deployed promptly to the conflict zone. Again, digital technology, with significant AI elements, running on high-performance SCs (and eventually on quantum computers, or QCs) will be central to cutting down current training times. It is a testament to their military savvy and base knowledge that Ukrainian HIMARS crews were trained in only three months, but even this pace will eventually, with the help of technology, need to come down to three weeks. The toughest nut to crack will be pilots, on hyper-complex machines such as the F-16.

Once the weapon or other military asset is made, there is also the question of logistics, the requirement to get the right item to the right place at the right time. As powerful as the Mongols were in their heyday, their leader Kublai Khan was unable to successfully attack Japan because the assault ships he used were river boats unsuitable for crossing the ocean waters between China (by then conquered and occupied by the Mongols) and Japan. His ships didn’t have a deep enough keel, and they had to return to port. Japan was saved, and left unconquered by the Mongols, because of a logistics deficiency on the part of the would-be attacking force. Presumably the current regime in China is reviewing this history carefully in preparing for its planned invasion of the island nation of Taiwan. It is interesting that the US military is addressing the massive challenge of logistics by leaning heavily on civilian transport companies like FedEx and United Parcel Service (UPS) to help move soldiers and matériel, especially in major conflict scenarios where large volumes need to be transported urgently. To do this well, both the civilian companies and the US military need to share ongoing access to the same logistics computer systems. This is another compelling example of the need to have seamless civilian-military fusion. Cold War 2.0 will be won by the side that can bring a technologically expert “whole-of-society” effort to the conflict.

It’s important to understand one additional nuance in the technology underpinning military power today. Offensive weapons systems prompt the creation of defensive weapons systems. With apologies to Isaac Newton, to each offensive military technology advancement there is an equal and opposite defensive military technology reaction. In ancient and medieval times, castles with high walls, built atop hills, were a classic example of defense responding to offense. The massive walls of Constantinople (present-day Istanbul), first built in the 300s, successfully kept invaders at bay for over 1,000 years. In Malta today some of the most extensive walled defenses in the world are still standing, dating from the 1500s, especially in Mdina and most of all in the capital, Valletta. Since World War II such walls have rarely been built for defense because aircraft and missiles can simply fly over them. Instead, air defense systems (ADS) are currently being turned out in large numbers. The most advanced of these in the world, the US ground-based Patriot and naval-based Aegis, are a testament to defense technologies responding in kind to offensive ones. This is all part of the “action/offense-reaction/defense” cadence of modern arms races.

This offense/defense rhythm matters, more on the military side than the civilian. While the dynamic of competitive displacement is technologically much the same in the civilian and military domains, the consequences of failure in each sphere can be quite different. When Sony was competitively displaced in the personal music device space by Apple’s iPod, Sony still had lots of other businesses to fall back on, and to this day its gaming console is by far the leader in the marketplace. Losing the Walkman battle to Apple’s iPod was not the end of Sony. In military terms, if the Patriot and Aegis ADS technologies don’t prevail in a future Battle of the Taiwan Strait, Taiwan’s democracy loses its freedom to autocratic China. The stakes for competitive displacement are simply higher in the military domain. The consequence of losing the competitive displacement contest in the civilian domain is financial; losing it in the military domain is often existential. The democracies forget this key message of Cold War 2.0 at their peril.

To recap: inventing new military technologies that give significant competitive advantage to the more innovative nation is a key means for increasing hard power relative to peer nations that don’t have equivalent technology. What history teaches, though, is that often it is not the initial invention that counts, but rather the subsequent modification, enhancement, and improvement of the machine, device, or process. This is particularly true for civilian inventions that are subsequently modified for use in military applications.

INNOVATION = INVENTION + IMPROVEMENT

Innovation consists of two activities: the initial invention of a device, and then its subsequent improvement. Surprisingly, improvement often proves to be more valuable than invention, in both the civilian and military domains. Most people who have only a vague sense of the Industrial Revolution will think that James Watt invented the steam engine, but that’s wrong. It was Thomas Newcomen who (in 1712) actually invented it, but what he came up with didn’t work that well. James Watt perfected it (in 1765), so that it could be used to power factories, and eventually trains, ships, tractors, and the other monumental machines of the Industrial Revolution. Invention is important, but improvement is usually more so.

This principle will be important in Cold War 2.0 as the democracies and the autocracies constantly leapfrog one another’s technology. Recall that before Google search, there were Lycos, AltaVista, and about a dozen other serious search engines. Clearly improvement also plays a central role in competitive displacement in today’s technologically oriented economy. It’s been like this for some 150 years. Thomas Edison is generally thought to have “invented” the light bulb, and to this day a light bulb turning on is the very symbol of invention. In fact, though, Edison took an existing light bulb concept, and then tried 600 different metals and substances as filaments before he painstakingly improved his way to the right one.

In the military realm, gunpowder is the classic example of the importance of improvement. In the 800s the Chinese came up with the idea of mixing together saltpeter, sulfur, and charcoal to form a very combustible substance, but initially they used it by throwing the actual flaming mass at an enemy. This was a somewhat scary war-fighting tactic, but not very effective on a battlefield. The Byzantines improved the delivery of the fire by hurling it from a tube, but still it was a fairly exotic weapon, and not quite mainstream. The Chinese were on to something when they experimented with early cannons in their armories, but these were very rudimentary, and still not very effective.

Gunpowder came into its own as serious ordnance, and eventually the defining weapon of its age (by the 1500s), when the West Europeans greatly improved upon the gunpowder munitions used at the time by the Chinese and the Byzantines. The Europeans exploded a small amount of gunpowder in the breech of a rifle or cannon. This controlled explosion then launched the real point of the exercise, by forcing a bullet out of the barrel of a gun or pistol, or launching a cannonball out of a cannon. These projectiles then did heavy damage to the enemy. Using gunpowder to launch a metal projectile against an enemy was a revolutionary moment in military affairs, and it changed the dynamic of national power forever.

The following comparison makes clear the value of this innovative repurposing of gunpowder. A typical bow would impart about 80 joules of energy to an arrow in flight, while an early musket could deliver 1,500 joules in the small metal ball shot out of its barrel. A mature musket (a predecessor of the rifle) could get the count up to 3,000 joules. This is why metal suits of armor could protect against arrows, but not against shots from muskets, let alone modern rifles. Not until the bulletproof vest of the 1890s was there even rudimentary defense against rifles and pistols, and really effective protection against bullets had to wait until the 1970s and the invention of Kevlar.
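The joule figures above follow directly from the kinetic energy formula E = ½mv². The short Python sketch below uses illustrative, assumed projectile masses and velocities (they are not from the text) chosen to show how the cited energy range arises: the musket ball is no heavier than a war arrow, but it travels far faster, and velocity enters the formula squared.

```python
def kinetic_energy(mass_kg: float, velocity_ms: float) -> float:
    """Kinetic energy in joules: E = 1/2 * m * v^2."""
    return 0.5 * mass_kg * velocity_ms ** 2

# Assumed, illustrative values (not from the text):
arrow = kinetic_energy(0.060, 52)           # 60 g heavy war arrow at ~52 m/s  -> ~81 J
early_musket = kinetic_energy(0.030, 316)   # 30 g lead ball at ~316 m/s       -> ~1,500 J
mature_musket = kinetic_energy(0.030, 450)  # 30 g lead ball at ~450 m/s       -> ~3,000 J

print(round(arrow), round(early_musket), round(mature_musket))
```

Because velocity is squared, raising the ball’s speed by about 42 percent (from 316 to 450 m/s in this sketch) doubles the delivered energy, which is why incremental improvement of the musket paid off so handsomely.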

The fundamentally different way in which Europeans perfected the use of gunpowder allowed them to conquer the non-European world by force of arms. It also allowed the Christian naval fleet assembled by Pope Pius V to defeat the Ottoman fleet at the Battle of Lepanto in 1571. This improvement around gunpowder also ushered in the great age of West European exploration. On a more sobering note, it also fueled incessant wars between the European nations for 450 years, causing untold numbers of deaths and mass maiming. No one said that military competitive displacement would be a picnic.

LEARNING INNOVATION AND DEMOCRACY—THE CASE OF ASIA

Innovation, both invention and especially improvement, can be learned. Consider the respective geopolitical reactions of the Chinese and Japanese when in the mid-1800s modern warships from Europe and the United States sailed into their harbors and rivers demanding that the reclusive Asian states open up their countries to trade. These entreaties were successful because they were backed up by gunpowder-fueled cannon and rifles the likes of which the Chinese and the Japanese had never seen. Indeed, so powerful were the British that the Chinese were forced to accept opium in payment for their goods. The British cultivated the opium (from the poppy plant) in India, and eventually this nefarious British export to China caused some 90 million Chinese (30 percent of the population) to become addicted to opium. The Chinese would not fully recover from these Western indignities until, more than 100 years later, Deng Xiaoping kicked off the amazing economic growth story of modern China. Even today, though, the current Chinese leader, Xi Jinping, talks of China’s “century of humiliation,” referring to what the British did to China in the 1800s. Ironically, China today is exacting a degree of revenge on the democracies by supplying the two largest drug cartels in Mexico with the precursor chemicals for making illicit fentanyl, which the cartels then smuggle into the United States. These illegal drugs contribute to roughly 100,000 American overdose deaths a year, most of them involving fentanyl. Sometimes the rhyming of history can be pretty loud.

In the 1800s, though, the Chinese response to West European predation was weak, hesitant, and supine. The Japanese, on the other hand, responded immediately and with vigor. Determined to do something meaningful about their lack of modern technology soon after the American humiliation, between 1871 and 1873 the Japanese government sent emissaries around the world to learn the latest and best techniques for innovating, manufacturing, and deploying modern technology. They ultimately adopted an educational program based on the one used in France, they instituted an industrial production system cloned from the British and the Americans, and they created a military that mimicked the German model. In the tech world there are clear advantages to being the “first mover,” but equally there are benefits to being the “second comer.” In the history of technology innovation, the Japanese became the second comer par excellence.

The Meiji regime’s efforts to adopt new technologies and techniques, and to apply them to military ends, proved a huge success in Japan’s goal of building an autocratic empire. Between 1904 and 1905, in an absolutely shocking development at the time, the Japanese military started and won a war with Russia. This effort included, among other maneuvers, a sneak attack on the Russian navy based at Port Arthur (which had been leased to the Russians by the Chinese) to kick off the war. Again, history never repeats itself exactly, but the Japanese sneak attack on Pearl Harbor some thirty-seven years later certainly rhymes with this event. As in the Crimean War some fifty years before, Russia would learn through this defeat at the hands of the Japanese that technology (both specific innovations, and their manufacture and logistical deployment) is the key determinant of national power. Of course, a modern war-fighting doctrine, healthy morale among the troops, a strong officer corps, and other factors are necessary as well, but without the deadly technology the other ingredients would be insufficient. In effect, relative to Cold War 2.0, a country’s current military power can be assessed by looking at its stockpile of state-of-the-art weapons, but its power in twenty years’ time is best predicted today by how well its brightest current high school students are learning new concepts in math, physics, biology, and chemistry.

Using European and American industrial technologies, Japan became the regional hegemon in Asia after the Russo-Japanese War, and it was only Japan’s defeat in World War II that finally cleaved off its colonial lands. Soon after World War II the Japanese resumed their economic industriousness, but this time (with much coaching from the Americans) they made a deep commitment to democracy as their political system. By the 1960s they were an economic powerhouse. In 1964 Tokyo hosted the Olympic Games and showcased to the world a modern, peaceful nation firmly embedded in the global community of democracies. By the 1970s Japan even seemed to threaten American economic dominance; Japanese cars were competitively displacing American ones, Japanese televisions were competitively displacing American ones, and so on across many different industrial and consumer goods. In effect, the Japanese had learned that, unlike before World War II, they didn’t need to fight their way into foreign markets with guns blazing. For many decades they kept a standing army of only 100,000 soldiers, while today’s figure hovers around 275,000, due largely to the tense situation in Asia caused by Cold War 2.0. The Japanese have learned that to secure economic prosperity it is sufficient to produce well-made products, given that the rules-based international order (especially its trading rules and regulations) allows them to sell these goods all around the world.

South Korea and Taiwan, on a lesser scale, replicated Japan’s success. South Korea in particular suffered tremendous trauma and destruction during its time as a colony of Japan (1910 to 1945) and during the Korean War (1950 to 1953). As a result, South Korea’s rise up the industrial league tables, starting in the late 1950s and continuing to the present, is nothing short of spectacular. In the context of Cold War 2.0, the massive success of South Korea, with its democracy and relatively open market economy, can be contrasted with the abysmal living conditions (and repeated famines) of North Korea, the result of the “hermit kingdom” clinging to a dual autocratic system of politics and economics. Interestingly, the superior performance of West Germany over East Germany between 1945 and 1989 proves the same point: in these two side-by-side real-world experiments, the democracy far surpassed the autocracy on every metric imaginable.

For its part, Taiwan’s muscular entry into the production of semiconductor chips in the 1980s—and its competitive displacement of American SCs by the 1990s—was gutsy and brilliant in the extreme. (South Korea also entered the SC production market at this point, choosing to focus on memory SCs.) Then, to have both South Korea and Taiwan, like Japan, commit to democracy since the 1980s is a wonderful testament to the patience, maturity, and judgment of their leaders and citizens over the past five decades. All three countries resolutely refused to believe the propaganda coming out of Beijing and Pyongyang (North Korea) that Asian people were predisposed to autocracy and not at all suited to democracy. Japan, South Korea, and Taiwan have bought into the rules-based international order, and they and their people have benefited greatly as a result.

China would do well to learn lessons from these three Asian exemplars of economic competitive displacement and political democracy as Cold War 2.0 unfolds. China is currently following the path taken by Japan in the 1885–1945 era, when Japan devoted all its effort to becoming the imperial hegemon of Asia by force of arms. China’s behavior over the last twenty years—snuffing out personal liberty in Hong Kong, conducting a massive buildup of the PLA (its armed forces and nuclear weapon stockpile), brazenly building islands and military bases across the South China Sea, oppressing its people (especially the Uighurs) through sophisticated surveillance and control technologies, and tightening Xi Jinping’s autocratic hold on politics—rhymes dangerously with Japan’s pre–World War II behavior. Rather than emulate this version of Japan, which ended in disaster (for Japan, the region, and, ironically, China as well), China should instead consider the path blazed by Japan after World War II, by truly committing to the rules-based international order and becoming a responsible, valued, and prosperous leading member of the international community. Put another way, the supreme irony of Asian geopolitics today is that China wants to absorb Taiwan in order to make Taiwan more like China, when in fact China should become more like Taiwan. That’s a tall order, but wonders never cease. There is also that ancient Chinese proverb, typically meant as a warning, but in this case useful as an admonition to China: “You become what you oppose.”

HYPER-INNOVATION

Starting in the late 1800s, the industrialized countries of Western Europe and North America, increasingly joined by Japan, experienced hyper-progress in science and technology innovation. Breakthroughs in multiple scientific disciplines and technology domains were achieved with accelerating frequency. In medicine, the Canadian-born and -educated William Osler moved to Baltimore in 1888 to lead the Johns Hopkins Hospital. In 1892 he wrote The Principles and Practice of Medicine, the last book intended to cover the entire spectrum of modern medicine. The explosion in medical knowledge and technique that followed simply couldn’t be contained in a single volume thereafter, as the age of specialization in medicine was beginning. (Today, for example, at the University Health Network in downtown Toronto, one of the top five medical centers in the world,10 the surgery department alone has fourteen subspecialties!)

Around the same time, the chemical industry (particularly in Germany) advanced by leaps and bounds in the thirty years leading up to World War I (1914 to 1918). The result was significant civilian advances, such as the invention of nitrogen-based synthetic fertilizer. At the same time, though, both the Germans and the Allies used chemical warfare agents, such as mustard gas, on the battlefields of northern France and Belgium during World War I, illustrating again the potentially nefarious “dual-use” nature that lurks within virtually all technological innovation. Tens of thousands of soldiers in the Great War were killed, poisoned, or blinded by gas attacks.

Around this time the steel industry got a huge boost from the widespread adoption of the Bessemer steel-making process. Skeletons of steel beams drove buildings ever taller, reaching twenty-five stories in some American cities, especially Chicago and New York, by the outbreak of World War I. The invention of the safety elevator by American Elisha Otis facilitated the verticalization of urban life. In yet another example of dual-use technology, the new types of steel used for the frames of tall buildings were also used to build submarines, a revolutionary threat to, and deadly competitive displacer of, navies and merchant fleets composed only of surface ships. War fighting on land became more mobile with the invention of the tank. World War I tanks were derived from the civilian automobile invented in the late 1800s but were outfitted with steel armor. Completing the new, innovative, dual-use deadly trio, the Wright brothers made the first powered airplane flight in 1903. Their flimsy craft, equipped with a small engine, flew only about thirty-seven meters and just a couple of meters off the ground. A dozen years later, the much-improved airplane would be flying reconnaissance missions over the battlefield and sometimes even dropping small bombs on enemy soldiers.

As noted earlier, when it comes to innovation, improvement will almost always outshine invention. The three tentative war-machine inventions of World War I, through a constant and successful program of improvement, eventually became full-scale, industrial-strength monsters of destruction by World War II (1939 to 1945). The submarine, a fairly rudimentary device in World War I, became a sleek, powerful menace some twenty-five years later. German U-boats sank a total of 2,825 Allied merchant and 175 Allied naval vessels, and almost brought Britain to its knees in the early stage of the war, when it stood virtually alone against the Nazis. British prime minister Winston Churchill would comment that of all the inventions of the war, the one he really feared was the submarine.

As for the airplane, the small handmade bombs thrown from World War I biplanes made of wood, fabric, and wires were superseded by gargantuan four-engine bombers made of aluminum and glass, purpose-built to deliver a huge tonnage of high-explosive bombs on each flight over enemy territory. Breathtaking technology innovation brought total war, as bombers dropped their deadly ordnance on civilians in cities. Of the 70 million dead in World War II, about 50 million were civilians. As for the tank, it became the bedrock weapon system for all land armies, and it was key to delivering early German victories in Poland and France. German tanks almost managed to defeat their Russian counterparts on the steppes of Russia, but that proved an objective too far. It helped enormously that Stalin cannily had Russian tank factories relocated east of the Ural Mountains, so that they could continue to build tanks while the main fighting with the Germans went on west of the Urals—once again highlighting the crucial importance of military-industrial capacity.

The development histories of the submarine, the airplane, and the tank over the last 125 years drive home the importance of improvement in the innovation equation. These three weapons systems are still core to militaries around the world. They will continue to be critical weapons systems for the coming decades of Cold War 2.0. They will undergo, though, massive further modifications during those years, in particular incorporating more and more AI and high-end SCs into their mechanical systems. They will also use quantum computers (QCs) to assist with complex targeting calculations. In Cold War 2.0 lots of new digital wine will be poured into old bottles when it comes to weapons development.

During World War II, scientists, engineers, and technologists pursued the most difficult science project undertaken by mankind to that point in history: the design, development, and deployment of the atomic bomb. In the 1930s theoretical physicists calculated that certain types of atoms could be manipulated so as to release great amounts of energy, but bridging the gap between scientific theory and applied engineering was no simple feat. Nevertheless, overcoming great technological adversity under intense time pressure, American scientists (ably assisted by colleagues from Britain and Canada) at Los Alamos in the New Mexico desert spent over $2.2 billion (a whopping $37 billion in today’s dollars) making the first atomic bomb. At its peak, the project employed well over 100,000 people. In 1945, the Americans dropped two atomic bombs on Japanese cities to avoid the even more numerous American (and Japanese) casualties that amphibious assaults on the Japanese home islands would have produced.

Releasing the power of the atom is the epitome of the dual-use innovation. Within a few years after the war the Americans built the world’s first civilian nuclear power plant, followed by Britain, Canada, France, and a few others. Nuclear isotopes also came to be used in medicine, particularly in cancer therapy. In a similar vein, the massive bomber planes of World War II gave birth to a commercial aircraft industry after the war, first powered by propeller engines, but improved to use jet engines by the 1960s. A huge new industry was created once the hundreds of new airports and air cargo facilities were factored into the economic equation. This is one of the many new industries that has contributed to the massive global economic growth over the past eighty years, referred to earlier in this chapter.

Meanwhile, farther up in the atmosphere, the Russians were the first to launch a small satellite, called Sputnik, in 1957. The democracies were stunned, as autocratic communist regimes weren’t supposed to do innovation very well. Generally that was true, and continues to be true today, but it turns out that an autocratic regime can pour huge sums of money into a focused technological effort, and eventually large quantity takes on a quality of its own. There was more shock and awe when the Soviet Russians were the first to send a human (Yuri Gagarin) into orbit in 1961. The Americans countered through NASA with an ambitious program to put a man on the moon by 1970. At its peak, the US space program was consuming annually an amount equal to about 4 percent of the entire US federal budget. During that period the US was also fighting a major (and very expensive) war in Vietnam. Nevertheless, America’s juggernaut economy was sufficiently buoyant throughout the 1960s that the world’s leading democracy could afford (just barely) these two massive expenditures, right up until the Apollo mission put men on the moon in 1969 and for several years following. Ironically, it was Soviet Russia that ran out of money and abandoned its plan to put humans on the moon. Later, its entire autocratic economy fell apart, causing a political collapse as well.

INNOVATIVE DIGITAL WEAPONS

The Apollo moon mission made extensive use of computers. Indeed, no other innovation better exemplified American technological supremacy in the 1960s and for decades afterward (and to this day) than the computer industry. Early ancestors of the computer were built during World War II to crack the encrypted codes used by the Germans to scramble their long-distance secret army messages. These first computers used vacuum tubes, not that different from the ones in the first radios of the 1920s and television sets of the 1930s. By the late 1950s and early 1960s, though, a new technology had emerged from the research labs of several American companies (primarily Bell Labs, Shockley Semiconductor, and Fairchild Semiconductor): solid-state semiconductor devices that performed better than a vacuum tube, used much less power, and took up a fraction of the space. By 1961, digital computers based on semiconductor chips were being built in America; an early chip carried only four transistors. Today, some sixty years later (in the span of a single lifetime), the most powerful SC is the size of a fingernail and holds about 50 billion transistors. Hyper-innovation indeed!
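The growth from four transistors to roughly 50 billion over about sixty years implies a startlingly steady doubling rate, broadly consistent with the famous Moore’s law cadence of roughly two years. A quick back-of-the-envelope calculation, using only the round figures quoted above (the 2021 endpoint is an assumption for illustration):

```python
import math

# How fast did chips grow from 4 transistors (1961) to ~50 billion (today)?
# Implied doubling time, using the chapter's round figures.
t0, n0 = 1961, 4
t1, n1 = 2021, 50_000_000_000   # assumed endpoint year for illustration

doublings = math.log2(n1 / n0)          # ~33.5 doublings
doubling_time = (t1 - t0) / doublings   # years per doubling

print(f"{doublings:.1f} doublings in {t1 - t0} years")
print(f"one doubling every {doubling_time:.1f} years")   # ~1.8 years
```

Transistor counts doubling every couple of years for six straight decades is the quantitative heart of what this chapter calls hyper-innovation.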

The computer, its various components (especially the SC), and software have insinuated their way into virtually every modern device on earth that uses electronics. Computers are another exemplar of dual-use technology, to the point where modern weapons systems, like submarines, missiles, fighter aircraft, surface naval vessels, tanks, and drones, are really just computers that deliver an explosive warhead. Military smart bombs use the same GPS system as civilian smartphones. The computer-based “fly-by-wire” controls on a helicopter used in the army look and perform an awful lot like the digital control system used in a helicopter operated by a mining company in the remote outback somewhere. The world has reached “Cold War 2.0 singularity,” namely that civilian and military innovation are essentially one and the same.

The steady computerization of the military and its weapons systems has a number of important ramifications. Just as in the civilian tech sector, the software in weapons systems now requires constant updates. Smartphones are updated every 12–16 months, and so it is with smart bombs, missiles, radar, Patriot, Aegis, and the like. This adds material cost to the procurement and sustainment of modern, digitally enhanced weapons systems. It also affects ongoing training requirements. If a user cannot figure out the latest features on their smartphone, there is no harm done. If a HIMARS crew doesn’t thoroughly learn its latest software release, it could put itself and its mission in jeopardy. Training on digital military systems is not optional. Indeed, during Cold War 2.0 lifelong technical learning for human users has come to the military before it has fully taken hold in the civilian economy.

Computerization of weapons also leads to increased lethality. This trend is not new. Several hundred thousand years ago the first weapons wielded by humans were simple clubs. Then, as humans developed metallurgy, axes and knives were invented, as were bronze (and later iron) swords. The axe is another thoroughly dual-use tool, equally effective for cutting off a tree limb as for cutting off a person’s limb. These weapons required “hand-to-hand” combat, though, so the radius of human harm was greatly limited. As noted above, the bow and arrow represented a step-change escalation in lethality. While hand-held weapons relied solely on the muscles of the human wielding the club, axe, knife, or sword, the archer also harnessed the elastic energy stored in a flexed bow or in the windup of a crossbow. Gunpowder-based weapons increased lethality by yet another order of magnitude by taking advantage of chemistry—and nuclear weapons by harnessing physics.

Digital weapons add to lethality by achieving greater precision. Artillery of the 1500s had a range of several hundred meters, and cannonballs rarely actually hit the intended target, so many volleys had to be fired. By World War I, the general range of artillery was 8–14 kilometers, but targeting was still largely driven by happenstance and gravity. Today’s HIMARS and similar systems are an altogether different proposition. These weapons can deliver their payload over a distance of 50–80 miles (roughly 80–130 kilometers), with the warhead landing within several feet of the intended target, thanks to clever digitally based targeting and guidance features that use GPS, the same service that runs “maps” on a smartphone. Then there are the numerous guided missiles (such as “cruise missiles”) first widely used in the Gulf War of 1991, where they constituted only 20 percent of the munitions used by the allied forces but inflicted 75 percent of the damage. They are more lethal, but at the same time they can better avoid civilian casualties. These are the new paradoxes generated by innovative digital weapons in Cold War 2.0.

THE FINANCIALS OF INNOVATION

The full impact of the English longbow wasn’t merely its superior functionality. Its indirect benefit, part of the secret sauce of competitive displacement from the “supply side,” was that an archer with a longbow was much cheaper to put on the battlefield than a medieval knight, the previously dominant weapon system. A knight was very expensive to outfit, and there were only a limited number of them, given the strict rules of noble inheritance at the time. Archers, on the other hand, cost a fraction of what a knight did to equip, billet, and board. An English king (like Henry V at Agincourt) who switched out knights in favor of longbow archers could field many more of them, or simply keep the savings, depending on how he viewed the enemy. In effect, the financial parameters of military power are important and always have been. Kings at first, and prime ministers and presidents in our own day, worry seriously about the cost of battlefield technology because they cannot ignore budget deficits and government debt. This will never change, regardless of how much surplus wealth the civilian economy is able to produce through the Cold War 2.0 period.

This is actually a weakness of democracies relative to autocracies, especially as the intensity of Cold War 2.0 ramps up in places like Ukraine and Taiwan. The cost of a country’s military can be quite controversial, particularly in democracies, where the budgeting process is open and subject to debate and approval by elected representatives. There was a time in the 1800s when defense-related expenditures represented 90 percent, or even more, of a government’s budget. This is somewhat misleading, because governments in the early 1800s didn’t spend anything on (nonexistent) social programs. Therefore, a king, president, or prime minister who went to the legislature for an increase in government funds (effected through a rise in taxes) generally did so in order to wage war somewhere.

Since World War II, with the steady decline in the number of wars around the world and the massive increase in popular demands on governments in democracies to fund social programs (like healthcare, education, and unemployment benefits), spending on defense in virtually all democracies has declined dramatically as a percentage of the public budget. Since Putin’s invasion of Ukraine in 2014, there has been plenty of pressure on the democracies to spend more on defense; since his full-scale invasion of 2022, that pressure has become intense indeed. Still, the social priorities for spending haven’t gone away. Huge deficit financing to pay for all the social and defense demands is usually the answer, meaning that some unlucky future generation will have to pay the full bill when it comes due.

Autocracies are less buffeted by this particular budgetary dynamic than the democracies. The autocrat can direct the funds of the national budget somewhat at will, while a leader in a democracy cannot. The autocracy, though, suffers from the scourge of corruption, which has a deleterious effect similar to chronic underfunding of the military. When Russian armies invaded Ukraine in February 2022, many tanks, armored personnel carriers, and other vehicles (such as supply trucks) broke down because they had been poorly maintained in the years and months leading up to the invasion. Tires on trucks, for instance, simply fell apart because they were much too old and worn out, and should have been replaced months if not years before. They weren’t, because corrupt senior military officers had skimmed off for themselves the money that should have gone to maintenance. At least when a military in a democracy is denied funding, the money goes to another government department (to educate children or to heal the sick) and not into the illegal bank accounts of army officers. This corruption-bred weakness in the autocracies will pay dividends to the democracies throughout Cold War 2.0.

On the other hand, the democracies (and the autocracies) suffer from another financial burden: the incredibly high cost of modern high-tech kit. Precision-guided munitions are very effective, but their cost is also very significant. A single cruise missile runs in the range of $2 million. Then there are the really expensive military weapons systems. A single F-35 fifth-generation fighter—just one of them—costs $85 million. That’s just to buy it. Running and maintaining it for fifteen years costs, all in, about another $130 million. And that’s for just one plane. One. Surely governments have to start asking whether an F-35 really is ten times better than a very sophisticated $20 million drone. Is the drone the modern equivalent of the longbow, and the F-35 the medieval knight? Moreover, can reusable drones firing Hellfire missiles (costing “only” $150,000 each) replace cruise missiles ($2 million apiece) in most operations requiring firepower? Bottom line: Should commanders in the field be thinking about cost when they are taking fire in the middle of the fog, pressure, and high-stakes risk of war?
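The arithmetic behind the “ten times” question can be laid out explicitly. A minimal sketch, using only the round figures quoted in this section (real procurement costing is far messier, and the drone’s all-in figure is an assumption):

```python
# Back-of-the-envelope weapons cost comparison.
# All figures are the round numbers quoted in the text, in US dollars.
F35_PURCHASE = 85_000_000        # buy one F-35
F35_SUSTAINMENT = 130_000_000    # run and maintain it for ~15 years
DRONE_ALL_IN = 20_000_000        # one sophisticated drone (assumed all-in)

CRUISE_MISSILE = 2_000_000       # one cruise missile
HELLFIRE = 150_000               # one Hellfire missile

f35_all_in = F35_PURCHASE + F35_SUSTAINMENT
print(f"F-35 all-in over 15 years: ${f35_all_in:,}")                 # $215,000,000
print(f"F-35 vs drone: {f35_all_in / DRONE_ALL_IN:.1f}x")            # 10.8x
print(f"Cruise missile vs Hellfire: {CRUISE_MISSILE / HELLFIRE:.1f}x")  # 13.3x
```

On these assumptions the all-in F-35 really does cost roughly ten of the drones, which is exactly why the longbow-versus-knight analogy has bite.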

Another question that has to be asked regularly is: Where can savings in the military budget come from? A few years ago, when the United States contemplated upgrading its “triad” system of nuclear deterrence, the US government ultimately decided that deterrence still required a three-legged stool: air force bombers carrying nuclear-tipped cruise missiles, the navy’s submarines carrying nuclear ballistic missiles, and land-based nuclear missiles sitting in hardened silos across the Midwest US. Renewing the entire triad is going to cost a whopping $1.5 trillion over the next thirty years. Had the US dropped one leg of the triad, it could have saved roughly a third of that cost, about $500 billion. That would make a lot of sense, and wouldn’t appreciably increase the risk to military preparedness, but no service is willing to step up and say that it would take the reduction of responsibility (and funding). It is spending and strategic mistakes such as these that put the US defense budget (and the overall American federal government budget) in jeopardy. Cold War 2.0 is playing out at a time when the US, and most other democracies, are experiencing unprecedented levels of public borrowing. The democracies must be smart about military expenditures. The alternative is a devastating crash against a fiscal wall when financial markets finally start to refuse to buy American treasury bonds. When that day comes, the autocrats will have won Cold War 2.0.
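The triad arithmetic above is simple enough to sketch. The even three-way split across the legs is an assumption for illustration only; in reality the legs cost different amounts:

```python
# Rough nuclear-triad renewal arithmetic, using the chapter's round figures.
TRIAD_TOTAL = 1_500_000_000_000   # $1.5 trillion over thirty years
YEARS = 30
LEGS = 3                          # bombers, submarines, land-based missiles

annual = TRIAD_TOTAL / YEARS
one_leg = TRIAD_TOTAL / LEGS      # assumes each leg costs the same

print(f"Average annual cost: ${annual / 1e9:.0f} billion")             # $50 billion
print(f"Savings from dropping one leg: ${one_leg / 1e9:.0f} billion")  # $500 billion
```

Even on this crude accounting, a single dropped leg frees up roughly a half-trillion dollars over the program’s life, which is the scale of the trade-off legislators are being asked to weigh.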

CONSTRAINTS ON MODERN WEAPONS

The concern about the high cost of weapons today can be answered, at least partly, by the general aphorism “you get what you pay for.” From this perspective, one major benefit of digitally targeted munitions is that they cause far less civilian and other collateral damage, or at least they can be used with this goal in mind. This is key for leaders of democracies, particularly when their militaries are involved in operations that will generate all variety of images and videos posted to social media. In these circumstances, because leaders of a democracy will have to face voters in the next election, limiting the lethality of weapons solely to the armed forces of the opponent is critical.

Sadly, autocrats typically want to achieve quite the opposite. In Ukraine, commencing in October 2022, the Russian military directed hundreds of missile and drone attacks specifically against civilian infrastructure (such as power plants and other utilities) and against civilians themselves (targeting an urban marketplace, say, or a residential apartment block) to try to break the spirit of the Ukrainian people, so that they would pressure the leadership of their democracy to sue for peace on terms favorable to the Russian invaders. In so doing, the Russians committed hundreds of war crimes, as targeting civilians in this way is clearly prohibited by several Geneva Conventions. This heinous strategy of the Russian autocrat Putin was not an oversight. In democracies the commission of a specific war crime is typically a bug in their military culture (and something to be investigated and dealt with responsibly by legal process). In stark contrast, for an autocracy the commission of war crimes is a central feature of its system of war fighting. In relation to Russia’s unjustified war in Ukraine (itself a war crime), Putin has signed into law a measure giving carte blanche advance immunity from Russian prosecution to any soldier who commits a war crime in Ukraine, so long as it was done to further the goals of the Russian army. This is a trenchant example of the vast moral and legal chasm between the autocracies and the democracies, and why a Cold War 2.0 exists at all.

Digital features in modern weapons systems have also increased the tempo of military activity, particularly in respect of defensive operations. Enemy missiles, drones, and modern fighter-bomber aircraft fly at very high speeds (hypersonic missiles travel at five times the speed of sound, about 6,200 kilometers an hour) and can be over urban or strategic targets within minutes of entering the opponent’s airspace. To defeat these attacks, the defender must quickly gather all the relevant sensor information about what is coming at it, exactly where, and at what speed, and then countermeasures must be launched virtually immediately, either by scrambling fighters or by launching antiaircraft missiles to intercept the incoming enemy ones. Individual humans cannot carry out this function well, if at all, when more than two or three missiles are incoming, as they have difficulty processing all the relevant information in the short time allotted (see the scenario at the beginning of the introduction). As with so many civilian technologies in modern life, current military technology moves at a pace much faster than humans; when it comes to air defense, humans will participate less and less as Cold War 2.0 progresses.

Not surprisingly, therefore, in antiair and antimissile defense, computers, software, and especially AI have been pressed into service to help collect the relevant information, process it, and then, critically, launch the countermeasures or intercept missiles—often automatically, because there is simply insufficient time for humans to insert themselves into the workflow required to respond to the enemy threat. This heralds a new age of digital warfare, one that raises a number of important questions for democracies and autocracies alike. Most of these questions center on what degree of human oversight should be retained over these digital systems. Hyper-innovation in military affairs will present thorny ethical and moral conundrums in Cold War 2.0.

In some respects, this is not an entirely new dilemma. For about 120 years humans have developed weapons that a large number of countries, both democratic and autocratic, have agreed are so murderous that they must be banned, or at least very tightly constrained. In World War I, both sides used chemical weapons on the battlefield, lobbing canisters of chlorine and mustard gas into the opponent’s trenches and even across the no-man’s-land between the opposing forces. The resulting horrendous deaths caused revulsion among even the toughest generals, and in 1925 thirty-eight countries (146 today) agreed to ban the first use of such chemical weapons in warfare (though countries could still develop them, and use them to retaliate against another country that used them first).

In 1972, the Biological Weapons Convention extended these prohibitions to biological agents, and in 1993 the Chemical Weapons Convention added further restrictions on chemical weapons.11 An important impetus for this set of international conventions was the practical problem that, at the time, chemical and biological agents could not be used effectively without the gas or pathogen blowing back onto the troops of the country that launched them, due to unpredictable wind patterns. Today that impediment has been largely overcome with agents like anthrax, where the country deploying the biological agent can inoculate its own troops against it, thereby opening up more opportunities for its use. This sort of dangerous dilemma will become more common in Cold War 2.0 as humans conclude (sometimes erroneously) that they have sufficiently mastered the relevant technology.

The other shortcoming of the chemical and biological weapons conventions has been the reluctance of countries, mainly the autocracies but some democracies as well, to allow the United Nations to conduct effective inspections of sites to verify that the rules are being followed. Without a supervisory system that all parties agree to adhere to, confidence—and success—in the overall effort to control these gruesome substances will be low. Here, surely, the technologies of the 21st century can help, including through the deployment of visual and other sensors in the relevant labs and factories, and AI to help track all inputs being consumed by the global military-industrial complex.

CIVILIAN-MILITARY INNOVATION FUSION

Nations have reached the point, in the third decade of the 21st century, where economic preeminence and military expertise both rely heavily on each country’s digital innovation capabilities. Computer engineers, data scientists, software programmers, and their colleagues in the digital supply chains design, develop, and deploy the modern systems that bring added digital value to all workflows in the economy. Their digital cousins then perform the same work at defense contractors and within the armed forces, building weapons systems designed to deter any aggressor or, if deterrence fails, to soundly defeat it. One immediate concern raised by this reality is that the democracies will face severe shortages of these skill sets, with civilian and military employers drawing from the same talent pool in Cold War 2.0. (A cursory survey of job postings on LinkedIn clearly exhibits the high demand for these skill sets in both the civilian and military domains.) One advantage the democracies will enjoy in this talent war, though, is that students graduating from universities in the various nonaligned countries will be more inclined to emigrate to democracies than to autocracies, which will somewhat alleviate the democracies’ tight labor markets for skills much in demand in a Cold War 2.0 world.

The complete fusion of the civilian and the military when it comes to modern technology is well illustrated by the stunning results achieved by Ukraine in the eighteen months after Russia’s full-scale invasion of that country in February 2022, in the face of seemingly insurmountable odds. Suffice it to say that Russia had an army eight times larger than Ukraine’s, and Russia’s defense spending was ten times Ukraine’s. When Russia launched its multiple army groups across the Ukrainian border on February 24, 2022, literally no one gave Ukraine any chance of holding out for more than a couple of weeks. The United States immediately offered transport to Ukrainian president Zelenskyy so he could be flown to safety somewhere in Europe, where presumably he could try to rally an insurgency from some remote location. Zelenskyy refused “the ride,” asking for weapons instead.

At every turn in the violent conflict, the Ukrainians have proven more technologically capable than the Russian invaders. Their improvised use of modified drones to achieve multiple objectives on the battlefield is now the stuff of legend. It’s no coincidence that before the full-scale war some 300,000 Ukrainians worked in civilian information technology businesses, often serving foreign tech clients in Europe or North America. In 2021 Ukraine’s 5,000 tech companies exported some $6.8 billion in IT services, comprising 37 percent of the country’s total exports. The value of this significant talent base has become manifest in the war against the Russians. The Ukrainians were able to take the modern weapons donated to them by forty-five democracies, promptly learn their high-tech features, and in very little time begin to push back the Russian enemy with them. Six months into the full-scale war, Ukraine was throwing back the Russians in two important sectors of the fighting, retaking thousands of acres of land that just months before had been Russian-occupied territory. From the vantage point of 2023 it is too early to tell how this war will end, but so far it can be concluded that Ukraine has been a cauldron of military innovation.

The Russo-Ukrainian War will have a wider resonance for Cold War 2.0 as well. Asia, and particularly China, is watching the European conflict closely, especially vis-à-vis Taiwan, and learning the lesson that if Taiwan hunkers down and presents a credible “hedgehog defense,” China will have to pay an enormous cost if it wishes to take Taiwan by force. Equally, though, the democracies are learning, much to the horror of their citizens, that ruthless autocrats are willing to pay enormous costs for military conquest, especially when the currency they pay in is young men from rural districts, ethnic minorities, and prison populations. At the same time, Taiwan’s hundreds of thousands of tech workers are also, nervously, absorbing important lessons about innovation, technology, and national power from the war in Ukraine. It will be interesting to see whether young Taiwanese have increased their job searches in the US and Europe since the advent of the full-scale Russian invasion of Ukraine.

The foregoing analysis about technology and national economic and military power teaches a few prime lessons for the democracies in Cold War 2.0. Keep markets open and let competitive displacement work its magic. Regulation of particularly sensitive or dangerous innovation is fine, but it should not come so early that it stifles innovation or unduly benefits the larger incumbents in the technology domain. Another important, perhaps critical, insight: autocracies have great difficulty allowing competitive displacement to operate in their economies and technology domains because the autocrat and his enablers are too deeply invested in the incumbent monopolies and oligopolies that run the economy. As a result, technological innovation will never be as robust and vigorous in the autocracies as in the democracies. This gives a huge advantage to the democracies in contesting the autocracies in Cold War 2.0, so long as the democracies don’t waste this upper hand.

Principal advice to the democracies: don’t fritter away the technological advantage you enjoy courtesy of competitive displacement by allowing would-be autocrats within the democracies to come to power. Put another way, the autocracies will never be technologically capable enough to shoot down the democracies, so the democracies should avoid shooting themselves in the foot.

The other advice to the democracies: don’t give the autocracies the weapons they can use to shoot down the democracies. Put in the vernacular: don’t give the autocracies the rope they will then use to hang the democracies. Therefore, the democracies must implement an effective program to stop the flow of advanced technologies to the autocracies, as this will materially stunt their economic prospects, reducing their growth by 9 percent over ten years.12 Given the degree of civilian-military fusion, it will also handicap the autocracies’ efforts to modernize their weapons systems, and therefore make China, in particular, less confident of prevailing in an attack on Taiwan. The sanctions program around high-end SCs launched by the Biden administration in October 2022 therefore makes a lot of sense, but it is only a start. It should be followed by similar initiatives in other technology domains, including AI, QCs, and biotechnology. And it is not enough simply to hope that the rest of the world will comply with these sanctions; the democracies must become much more muscular in enforcing them.