The Guardian: Rise of The Invisible Computer

Posted January 30, 2017

Moore's Law is invoked a lot when discussing computer processing speed and size. Strictly speaking, it says that the number of transistors that can be crammed onto a chip doubles roughly every two years (Moore originally said every year), which in practice has meant steadily faster, smaller, and cheaper chips. Since Moore made the observation in 1965, it has held basically true. The question remains, though: will this ever stop?
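
To make that doubling concrete, here is a rough back-of-the-envelope sketch in Python (our own illustration, not from the article): it projects transistor counts forward from the 4004's 2,300 transistors in 1971 under a strict two-year doubling.

```python
# Back-of-the-envelope Moore's Law sketch (illustrative assumptions only):
# start from the Intel 4004's ~2,300 transistors in 1971 and double the
# count every two years, the commonly cited form of the law.

START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> int:
    """Rough transistor-count projection under a strict two-year doubling."""
    doublings = (year - START_YEAR) // DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1985, 2000, 2015):
    print(year, f"{projected_transistors(year):,}")

# 2015 comes out around 9.6 billion -- within an order of magnitude of the
# 1.5-2 billion transistors the article attributes to Skylake, which is why
# the law is usually stated as doubling "roughly" every two years.
```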

This progression in chip size and speed has had a drastic impact on our everyday lives, leading to devices such as smartphones (which we talk about a lot here at TheVJ.com) and the Internet.

This piece in The Guardian takes a fascinating look at what happens when computers get so small they are invisible:

In 1971, Intel, then an obscure firm in what would only later come to be known as Silicon Valley, released a chip called the 4004. It was the world’s first commercially available microprocessor, which meant it sported all the electronic circuits necessary for advanced number-crunching in a single, tiny package. It was a marvel of its time, built from 2,300 tiny transistors, each around 10,000 nanometres (or billionths of a metre) across – about the size of a red blood cell. A transistor is an electronic switch that, by flipping between “on” and “off”, provides a physical representation of the 1s and 0s that are the fundamental particles of information.

In 2015 Intel, by then the world’s leading chipmaker, with revenues of more than $55bn that year, released its Skylake chips. The firm no longer publishes exact numbers, but the best guess is that they have about 1.5bn–2bn transistors apiece. Spaced 14 nanometres apart, each is so tiny as to be literally invisible, for they are more than an order of magnitude smaller than the wavelengths of light that humans use to see.

Everyone knows that modern computers are better than old ones. But it is hard to convey just how much better, for no other consumer technology has improved at anything approaching a similar pace. The standard analogy is with cars: if the car from 1971 had improved at the same rate as computer chips, then by 2015 new models would have had top speeds of about 420 million miles per hour. That is roughly two-thirds the speed of light, or fast enough to drive round the world in less than a fifth of a second. If that is still too slow, then before the end of 2017 models that can go twice as fast again will begin arriving in showrooms.

This blistering progress is a consequence of an observation first made in 1965 by one of Intel’s founders, Gordon Moore. Moore noted that the number of components that could be crammed onto an integrated circuit was doubling every year. Later amended to every two years, “Moore’s law” has become a self-fulfilling prophecy that sets the pace for the entire computing industry. Each year, firms such as Intel and the Taiwan Semiconductor Manufacturing Company spend billions of dollars figuring out how to keep shrinking the components that go into computer chips. Along the way, Moore’s law has helped to build a world in which chips are built in to everything from kettles to cars (which can, increasingly, drive themselves), where millions of people relax in virtual worlds, financial markets are played by algorithms and pundits worry that artificial intelligence will soon take all the jobs.

But it is also a force that is nearly spent. Shrinking a chip’s components gets harder each time you do it, and with modern transistors having features measured in mere dozens of atoms, engineers are simply running out of room. There have been roughly 22 ticks of Moore’s law since the launch of the 4004 in 1971 through to mid-2016. For the law to hold until 2050 means there will have to be 17 more, in which case those engineers would have to figure out how to build computers from components smaller than an atom of hydrogen, the smallest element there is. That, as far as anyone knows, is impossible.

Yet business will kill Moore’s law before physics does, for the benefits of shrinking transistors are not what they used to be. Moore’s law was given teeth by a related phenomenon called “Dennard scaling” (named for Robert Dennard, an IBM engineer who first formalised the idea in 1974), which states that shrinking a chip’s components makes that chip faster, less power-hungry and cheaper to produce. Chips with smaller components, in other words, are better chips, which is why the computing industry has been able to persuade consumers to shell out for the latest models every few years. But the old magic is fading.

Shrinking chips no longer makes them faster or more efficient in the way that it used to. At the same time, the rising cost of the ultra-sophisticated equipment needed to make the chips is eroding the financial gains. Moore’s second law, more light-hearted than his first, states that the cost of a “foundry”, as such factories are called, doubles every four years. A modern one leaves little change from $10bn. Even for Intel, that is a lot of money.

The result is a consensus among Silicon Valley’s experts that Moore’s law is near its end. “From an economic standpoint, Moore’s law is dead,” says Linley Gwennap, who runs a Silicon Valley analysis firm. Dario Gil, IBM’s head of research and development, is similarly frank: “I would say categorically that the future of computing cannot just be Moore’s law any more.” Bob Colwell, a former chip designer at Intel, thinks the industry may be able to get down to chips whose components are just five nanometres apart by the early 2020s – “but you’ll struggle to persuade me that they’ll get much further than that”.

Read the full article.
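
The "smaller than an atom of hydrogen" claim in the excerpt is easy to sanity-check. Here is a minimal sketch (our own arithmetic, assuming each remaining tick of Moore's law doubles transistor density and therefore shrinks linear feature sizes by roughly a factor of the square root of two):

```python
import math

# Rough arithmetic behind the "smaller than a hydrogen atom" claim in the
# excerpt above. Assumption (ours, not the article's): each "tick" doubles
# transistor density, so linear feature size shrinks by a factor of sqrt(2).

FEATURE_SIZE_NM = 14.0          # Skylake-era spacing cited in the article
REMAINING_TICKS = 17            # ticks needed to keep the law alive to 2050
HYDROGEN_DIAMETER_NM = 0.1      # approximate diameter of a hydrogen atom

size_in_2050 = FEATURE_SIZE_NM / math.sqrt(2) ** REMAINING_TICKS
print(f"Implied feature size in 2050: {size_in_2050:.4f} nm")
print(f"Hydrogen atom diameter:       {HYDROGEN_DIAMETER_NM} nm")
```

Seventeen more ticks from 14 nanometres lands at roughly 0.04 nanometres, a few times smaller than a hydrogen atom, which is the physical wall the article describes.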