Moore's law

The observation that the number of transistors in a dense integrated circuit doubles about every two years.

Moore's law is the observation that the number of transistors in a dense integrated circuit doubles about every two years. The observation is named after Gordon Moore, the co-founder of Fairchild Semiconductor and later CEO of Intel, whose 1965 paper described a doubling every year in the number of components per integrated circuit and projected that this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years, a compound annual growth rate (CAGR) of about 41%.
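The revised forecast's growth rate follows directly from the doubling period: a quantity that doubles every two years grows by a factor of 2^(1/2) ≈ 1.41 per year. A minimal sketch of the arithmetic (the function name is illustrative, not from the source):

```python
def doubling_cagr(period_years: float) -> float:
    """Annual growth rate implied by a doubling every `period_years` years:
    (1 + r) ** period_years = 2, so r = 2 ** (1 / period_years) - 1."""
    return 2 ** (1 / period_years) - 1

print(f"Doubling every year:    {doubling_cagr(1):.0%}")   # Moore's 1965 rate: 100%
print(f"Doubling every 2 years: {doubling_cagr(2):.1%}")   # 1975 revision: ~41.4%
```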

The doubling period is often misquoted as 18 months because of a prediction by Moore's colleague, Intel executive David House. In 1975, House noted that Moore's revised law of doubling transistor count every two years implied that computer chip performance would roughly double every 18 months, with no increase in power consumption. Moore's law is closely related to MOSFET scaling, as the rapid scaling and miniaturization of metal–oxide–silicon field-effect transistors (MOSFETs, or MOS transistors) is the key driving force behind it.
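House's 18-month figure combines two exponentials: transistor count doubling every 24 months, plus per-transistor speed gains on top of it. The annual speed contribution implied by his prediction can be backed out as follows (this decomposition is an illustrative sketch, not taken from the original sources):

```python
count_growth = 2 ** (12 / 24)   # yearly factor from transistor doubling every 24 months
perf_growth = 2 ** (12 / 18)    # yearly factor from performance doubling every 18 months
speed_growth = perf_growth / count_growth  # residual attributed to faster transistors

print(f"transistor count:         x{count_growth:.3f} per year")
print(f"chip performance:         x{perf_growth:.3f} per year")
print(f"implied transistor speed: x{speed_growth:.3f} per year (~{speed_growth - 1:.0%})")
```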

Moore's prediction proved accurate for several decades and has been used in the semiconductor industry to guide long-term planning and to set targets for research and development (R&D). Advancements in digital electronics are strongly linked to Moore's law: quality-adjusted microprocessor prices, memory capacity (RAM and flash), sensors, and even the number and size of pixels in digital cameras. Digital electronics have contributed to world economic growth in the late twentieth and early twenty-first centuries. Moore's law describes a driving force of technological and social change, productivity, and economic growth.

Moore's law is an observation and projection of a historical trend. It is an empirical relationship and not a physical or natural law. Although the rate held steady from 1975 until around 2012, the rate was faster during the first decade. In general, it is not logically sound to extrapolate from the historical growth rate into the indefinite future. For example, the 2010 update to the International Technology Roadmap for Semiconductors predicted that growth would slow around 2013, and in 2015, Gordon Moore foresaw that the rate of progress would reach saturation: "I see Moore's law dying here in the next decade or so."

Microprocessor architects report that semiconductor advancement has slowed industry-wide since around 2010, falling below the pace predicted by Moore's law. Brian Krzanich, the former CEO of Intel, announced, "Our cadence today is closer to two and a half years than two." Intel stated in 2015 that improvements in MOSFET devices had slowed, starting at the 22 nm feature width around 2012 and continuing at 14 nm. Krzanich cited Moore's 1975 revision as a precedent for the current deceleration, which results from technical challenges and is "a natural part of the history of Moore's law". The leading semiconductor manufacturers, TSMC and Samsung Electronics, have 10 nm and 7 nm FinFET nodes in mass production and 5 nm nodes in risk production.

Technological change is a combination of more and of better technology. A 2011 study in the journal Science showed that the rate of change of the world's capacity to compute information peaked in 1998, when the world's technological capacity to compute information on general-purpose computers grew at 88% per year. Since then, technological change has clearly slowed. In recent years, each new year has allowed humans to carry out roughly 60% more computation than could possibly have been executed by all existing general-purpose computers the year before. This growth is still exponential, but it shows that the rate of technological change varies over time.
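The difference between the two growth rates is easiest to see as doubling times. A quick comparison, assuming smooth exponential growth:

```python
import math

def doubling_time(annual_growth: float) -> float:
    """Years for a quantity to double at the given annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth)

print(f"88% per year -> doubles every {doubling_time(0.88):.2f} years")  # 1998 peak
print(f"60% per year -> doubles every {doubling_time(0.60):.2f} years")  # recent pace
```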

The primary driving force of economic growth is the growth of productivity, and Moore's law factors into productivity. Moore (1995) expected that "the rate of technological progress is going to be controlled from financial realities". The reverse could and did occur around the late 1990s, however, with economists reporting that "Productivity growth is the key economic indicator of innovation."

An acceleration in the rate of semiconductor progress contributed to a surge in U.S. productivity growth, which reached 3.4% per year in 1997–2004, outpacing the 1.6% per year during both 1972–1996 and 2005–2013. As economist Richard G. Anderson notes, "Numerous studies have traced the cause of the productivity acceleration to technological innovations in the production of semiconductors that sharply reduced the prices of such components and of the products that contain them (as well as expanding the capabilities of such products)."
[Figure: log–log plot comparing gate length to node size]

An alternative source of improved performance is in microarchitecture techniques exploiting the growth of available transistor count. Out-of-order execution and on-chip caching and prefetching reduce the memory latency bottleneck at the expense of using more transistors and increasing the processor complexity. These increases are described empirically by Pollack's Rule, which states that performance increases due to microarchitecture techniques approximate the square root of the complexity (number of transistors or the area) of a processor.
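Pollack's Rule can be stated in a few lines. Under it, doubling a single core's transistor budget buys only about a 41% performance gain from microarchitecture, which helps explain why the industry turned extra transistors toward additional cores instead. A minimal sketch:

```python
import math

def pollack_speedup(complexity_ratio: float) -> float:
    """Pollack's Rule: performance gains from microarchitecture techniques
    grow roughly as the square root of processor complexity
    (transistor count or die area)."""
    return math.sqrt(complexity_ratio)

print(f"2x transistors -> ~{pollack_speedup(2):.2f}x performance")
print(f"4x transistors -> ~{pollack_speedup(4):.2f}x performance")
```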

For years, processor makers delivered increases in clock rates and instruction-level parallelism, so that single-threaded code executed faster on newer processors with no modification. Now, to manage CPU power dissipation, processor makers favor multi-core chip designs, and software has to be written in a multi-threaded manner to take full advantage of the hardware. Many multi-threaded development paradigms introduce overhead and do not see a speedup linear in the number of processors. This is particularly true while accessing shared or dependent resources, due to lock contention, and the effect becomes more noticeable as the number of processors increases. There are cases where a roughly 45% increase in processor transistors has translated to only a roughly 10–20% increase in processing power.
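The saturation described above is commonly modeled with Amdahl's law (not named in the text, but the standard model for this effect): if a fixed fraction of the work is serialized, for example behind a contended lock, speedup flattens out well below the core count.

```python
def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Amdahl's-law speedup: the parallel fraction of the work scales
    across cores, while the serial remainder (e.g. lock-protected
    sections) runs at the same speed regardless of core count."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With just 5% of the work serialized, 64 cores deliver only ~15x:
for n in (2, 4, 8, 16, 64):
    print(f"{n:3d} cores -> {amdahl_speedup(n, 0.95):.2f}x speedup")
```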

On the other hand, manufacturers are adding specialized processing units to deal with features such as graphics, video, and cryptography. For example, Intel's Parallel JavaScript extension adds support not only for multiple cores but also for the other non-general processing features of their chips, as part of the migration in client-side scripting toward HTML5.

A negative implication of Moore's law is obsolescence, that is, as technologies continue to rapidly "improve", these improvements may be significant enough to render predecessor technologies obsolete rapidly. In situations in which security and survivability of hardware or data are paramount, or in which resources are limited, rapid obsolescence may pose obstacles to smooth or continued operations.

Because of the toxic materials used in the production of modern computers, obsolescence, if not properly managed, may lead to harmful environmental impacts. On the other hand, obsolescence may sometimes be desirable to a company that can profit immensely from the regular purchase of what is often expensive new equipment instead of retaining one device for a longer period of time. Those in the industry are well aware of this and may utilize planned obsolescence as a method of increasing profits.

Moore's law has affected the performance of other technologies significantly: Michael S. Malone wrote of Moore's War following the apparent success of shock and awe in the early days of the Iraq War. Progress in the development of guided weapons depends on electronic technology. Improvements in circuit density and low-power operation associated with Moore's law also have contributed to the development of technologies including mobile telephones and 3-D printing.
