Moore’s Law Turns 50 as Computing Power Doubles Every Year

The Associated Press

Just as the dot-com bubble was popping in May 2000, the highly respected MIT Technology Review published an article, “The End of Moore’s Law?”, arguing that computing power could not continue to double each year because engineers would no longer be able to “cram an ever-increasing number of electronic devices onto microchips.” But after 50 years of unabated annual doublings of computing power on chips, there is still no sign that Silicon Valley innovation is slowing or that Moore’s Law is about to expire.

As the 36-year-old head of research and development at Fairchild Semiconductor, Gordon Moore published a prediction on April 19, 1965, in Electronics, a struggling magazine with a modest following: the number of components (that is, transistors) on a single computer chip would continue to double every year, while the cost per chip would remain constant.
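
Moore’s claim is a simple exponential: a chip holding some base number of components should hold twice as many after each doubling period (one year, in the 1965 paper). The short Python sketch below is only an illustration of that arithmetic; the 64-transistor starting point and the one-year doubling period come from this article, and the function name is our own.

```python
def projected_transistors(base_count, base_year, year, doubling_period=1.0):
    """Project a chip's transistor count under Moore's Law.

    base_count      -- transistors on the reference chip (e.g., 64 in 1965)
    base_year       -- year of the reference chip
    year            -- year to project to
    doubling_period -- years per doubling (1.0 per Moore's 1965 paper)
    """
    return base_count * 2 ** ((year - base_year) / doubling_period)

# Starting from Moore's own 1965 chip, the world's most complex at 64 transistors:
for y in (1965, 1975, 1985, 1995):
    print(y, round(projected_transistors(64, 1965, y)))
# 1965 -> 64; 1975 -> 65,536; 1985 -> ~67 million; 1995 -> ~69 billion
```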

Moore wrote, “Integrated circuits will lead to such wonders as home computers–or at least terminals connected to a central computer–automatic controls for automobiles, and personal portable communications equipment.”

Few paid any attention to Moore’s prediction, and he later admitted he was “just trying to get across the idea this was a technology that had a future.” The term “Moore’s Law” was actually coined a decade later by Caltech professor Carver Mead. In 1965, Moore had just designed the world’s most complex chip: it had 64 transistors.

Within a short time of the prediction becoming “Moore’s Law,” it became the core business model of chip makers and fabricators, and the foundational logic driving long-term strategy throughout the semiconductor industry.

The MIT Technology Review in 2000 thought Intel’s Pentium III chip, containing 28 million transistors, would mark the peak of exponential annual processing growth. At the time, doubling a chip’s transistor count translated directly into extra cache, additional CPU cores, and an on-die memory controller.

Although Intel’s 14 nm process technology for 2015 can put on the order of 10 billion transistors on a single chip, packing more transistors into a smaller space no longer necessarily translates into higher performance. Additional transistors might now be spent on sophisticated power-gating logic and on added blocks within the “system on chip” (SoC) that don’t directly affect traditional performance metrics. Or they might be spent on capabilities like “big.LITTLE,” ARM’s method of improving SoC power efficiency by mixing low-power and high-power cores. Newer chip design strategies also push for higher burst frequencies.
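
Run in reverse, the same arithmetic yields the doubling period implied by any two data points: T = (t2 - t1) / log2(N2 / N1). Below is a minimal sketch (the function name is our own) applied to the two counts cited in this article, the 28-million-transistor Pentium III of 2000 and a roughly 10-billion-transistor chip of 2015:

```python
import math

def implied_doubling_period(n1, year1, n2, year2):
    """Doubling period, in years, implied by two transistor counts."""
    return (year2 - year1) / math.log2(n2 / n1)

# Figures cited in this article: Pentium III (2000) vs. a 2015-era chip.
t = implied_doubling_period(28e6, 2000, 10e9, 2015)
print(f"{t:.2f} years per doubling")  # roughly 1.8 years
```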

At 86 years old, Gordon Moore looks back on his career and observes that the only industry “remotely comparable” to microchips in its rate of growth is the printing industry. “Individual characters were once painstakingly carved out of stone,” he commented; “now they’re whooshed out by the billions at next to no cost.”

Moore believes that, like microchips, printing utterly transformed society. That is why he compares himself to printing press inventor Johannes Gutenberg, who in 1455 could never have imagined how much his invention would change the planet.

The reliability with which Moore’s Law has made technology more powerful and cheaper each year has shaped consumers’ expectations. Customers now expect “stuff” like cloud computing, the internet, social media, search, streaming video, and more to become faster, cheaper, and more compact in step with Moore’s Law.

Dan Hutcheson, head of the chip market research firm VLSI Research, told Wired magazine that the combined market value of companies across the spectrum of technologies beholden to Moore’s Law amounted to a whopping $13 trillion in 2014, or 20 percent of the asset value of the entire world’s economy.

Steve Brown, a strategist with Intel, sees no slowdown in the exponential growth of computing power. He looks forward to Moore’s Law-like transformations in industries such as healthcare, pharmaceuticals, and genetics, and expects that continued growth in computing power will eventually make it possible for new drugs to be fully designed and tested “in the minds of computers.”
