
From nuclear missiles to AI: a history of the computer chip

Computer chips, the core of new consumer products from trucks to toasters, had their origins in weaponry. Now their makers have reached the top of the corporate stack, writes Stephen Williams.

By Stephen Williams

Image: Mandel NGAN/AFP

The microprocessor has become the world’s most critical resource. In his book Chip War, Chris Miller tells the story of how virtually everything in the modern world runs on chips: cars, mobile phones, stock markets, even healthcare and power grids all rely on computers and microprocessors.

Yet while the microprocessor is today central to the modern world, underpinning much of what consumers desire, the early days of microchip technology were driven by the demands of Cold War rocketry and weapons systems and the US government’s determination to win the space race. For example, in late 1962 the US Air Force wanted a computer to guide its Minuteman II intercontinental ballistic missiles (ICBMs). The Minuteman I missile’s computer was so heavy that it could barely meet its objective – to guide a nuclear warhead payload to targets in the USSR and deliver awful destructive power. Pat Haggerty of Texas Instruments promised the US Air Force that his company could deliver integrated circuits that would perform twice the computations of existing guidance systems at half their weight. As Miller comments: “No one had a larger budget to buy technology than the Pentagon.”

But it would not be too long before another US institution would also be spending big to meet the challenge of the USSR. That institution was the National Aeronautics and Space Administration (NASA), which led the US government’s race into space but also had military objectives. It is worth noting that the computing power of the guidance computers that NASA’s 1969 Apollo capsule and moon lander relied on was far, far less than what today’s pocket-sized smartphone holds: each had just 74 kilobytes of read-only memory (ROM) to store its programs and 4 kilobytes of random-access memory (RAM) in which to run them. A low-end smartphone now has 8 gigabytes of RAM – two million times as much. This illustrates the working of Moore’s Law: since Gordon Moore, the co-founder of chip-makers Fairchild Semiconductor and Intel, proposed it in 1965, the number of components on a computer chip has grown at an exponential pace, doubling every two years.
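
For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The 4-kilobyte, 8-gigabyte and two-year figures are taken from the paragraph above; the binary (1,024-based) unit conversions and the 1969–2024 span are illustrative assumptions, not figures from the article.

    # Rough check of the Apollo-vs-smartphone memory comparison and of the
    # "doubling every two years" rate described above. Figures come from the
    # article; binary (1,024-based) unit conversions are assumed for illustration.

    apollo_ram_bytes = 4 * 1024            # ~4 kilobytes of RAM in the Apollo guidance computer
    smartphone_ram_bytes = 8 * 1024**3     # 8 gigabytes of RAM in a low-end smartphone

    ratio = smartphone_ram_bytes / apollo_ram_bytes
    print(f"Smartphone RAM is about {ratio:,.0f} times the Apollo computer's RAM")
    # -> about 2,097,152 times, i.e. roughly two million, as the article says

    # Moore's Law as stated here: component counts double every two years.
    years = 2024 - 1969                    # from the Apollo landing to today (assumed span)
    doublings = years / 2
    print(f"{years} years allows about {doublings:.0f} doublings, "
          f"a factor of roughly {2**doublings:,.0f}")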

Soviet space shocker

It would be difficult to overstate the shock that the Soviet Union’s launch of its Sputnik satellite in 1957 caused in the West, or the consternation that its success in sending cosmonaut Yuri Gagarin into space on 12 April 1961 caused in the US. President John F. Kennedy pledged to Congress on 25 May 1961 that the US would land a man on the moon before the end of the decade.

Determined to close the gap in this space race, the US government poured money into the electronics industry. The Soviet response was to try to outpace the US. Soviet leader Nikita Khrushchev had been persuaded in 1958 to fund a new “city of science” (Zelenograd, literally “green city”) that could produce semiconductors.

But the Soviet attempt to replicate US chips was almost inevitably doomed to fail. Simply copying US components meant that the Soviets were always playing catch-up with the Americans. Miller comments: “The Soviet Union’s rockets were as powerful as ever. It had the world’s largest nuclear arsenal. But its semiconductor production couldn’t keep up, its computers fell behind, its communications and surveillance technologies lagged and the military consequences were disastrous.”

China opts out

Nor did the Soviet Union’s “little brother” China have much time for developing a semiconductor industry. In fact, Miller tells us, Chairman Mao wasn’t simply sceptical that consumer goods had a place in the socialist utopia of his vision: he downplayed the importance of electronics in general. While China descended into revolutionary chaos, Intel invented the microprocessor and Japan secured much of the global market in DRAM (dynamic random-access memory) chips. Little wonder that a 1979 study reported that China had hardly any commercially viable semiconductor production and only 1,500 computers in the entire country. Asia’s electronic revolution passed China by while neighbours such as Hong Kong, Taiwan, Malaysia, South Korea and Singapore hosted production lines employing thousands of workers (often ethnic Chinese) to manufacture huge quantities of semiconductors, mainly for US parent companies. Today, says Miller, East Asia produces 90% of all memory chips, 75% of all processor (logic) chips and 80% of all silicon wafers. The Covid pandemic’s impact on the chip industry in the early 2020s, particularly the shortage of the microprocessors built into most modern cars, highlighted how disruptions in the chip supply chain can lead to severe disturbances in other economic sectors.

Dire straits

There is another disruption that could affect the supply of the electronic components that are central to modern life: the possibility that China might decide to carry out its long-threatened invasion of Taiwan, to reunite with the island that it has always considered part of its territory. Taiwan produces 41% of all processor chips worldwide and more than 90% of the most advanced chips.

By 2022 China’s share of total semiconductor manufacturing capacity had grown to just 7%, according to Statista. For now, it is the real threat of an invasion of Taiwan that most exercises the US in the chip war, though since Miller wrote, Washington has also moved to curtail China’s access to chip-making equipment.

Boom time

Worldwide, today’s reality is that the microprocessor industry is booming. According to figures released earlier this year, chip-maker Nvidia Corporation has become one of the world’s most valuable companies. Nvidia was founded in Santa Clara, California, in 1993 to produce graphics processing units (GPUs). Initially used for video gaming, GPUs turned out to be ideal for fast machine-learning (“artificial intelligence”, or AI) systems. With a current market capitalisation of more than $2.1 trillion – up considerably since Miller wrote – Nvidia now rivals better-known technology behemoths such as Microsoft and Google.


Stephen Williams

Stephen is a freelance journalist based in London who specialises in African affairs.