19 Apr 2015
Moore’s Law at 50
On the fiftieth anniversary of the publication of Gordon Moore’s paper that gave rise to Moore’s Law, our editor, Ian Poole, looks at how it has affected the electronics industry and what the future holds.
When Gordon Moore published his paper on the development of integrated circuit technology, little did he realise that fifty years later people would still be using his observation as a guide to the development of technology.
The paper, entitled “Cramming more components onto integrated circuits”, was published in Electronics on 19 April 1965. It was the distillation of an internal paper that Moore had written.
What is Moore’s Law?
Moore’s Law is widely known in the integrated circuit and computing industry.
"Moore's Law" is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit has doubled approximately every two years.
The observation is named after Gordon E. Moore, co-founder of Fairchild Semiconductor and the Intel Corporation, whose 1965 paper investigated the growth in the number of transistors in integrated circuits.
To this day, Moore’s Law remains a guideline that has broadly held true, and it is now seeing its 50th anniversary.
The 50th anniversary is a major milestone, because trends and extrapolations rarely hold for long in such a fast-changing industry.
Moore’s Law - background
Gordon Moore was born in San Francisco, California, and received his bachelor’s degree in chemistry in 1950 from the University of California, Berkeley, after which he moved to the California Institute of Technology, gaining his PhD in 1954. His next move was to the Applied Physics Laboratory at Johns Hopkins University, where he undertook post-doctoral research between 1953 and 1956.
After leaving the Applied Physics Laboratory, Moore joined William Shockley at the Shockley Semiconductor Laboratory division of Beckman Instruments. However, he left when he and seven others were backed by Sherman Fairchild to create the Fairchild Semiconductor Corporation.
As Director of Research and Development, Moore had been investigating the trends in the growth of integration levels within ICs. For some time he had been developing his ideas, and in an internal paper he extrapolated a line through five points on a graph of the number of components per integrated circuit, at minimum cost per component, for circuits developed between 1959 and 1964.
In his internal paper entitled: “The Future of Integrated Electronics” he attempted to look at the development of integrated electronic circuits for a period of about the following ten years.
In his predictions, Moore projected that the number of components on an integrated circuit would double every twelve months, reaching 65,000 by 1975.
Gordon Moore was later asked to write up his findings in an article for Electronics magazine. The article, entitled “Cramming more components onto integrated circuits”, was published on 19 April 1965.
Later, in 1975, Gordon Moore, by then a director at Intel, noted that his prediction had been realised, but he also slowed the rate at which he predicted growth would continue. Noting that processors were less dense than memories and that a wide mix of ICs had to be accommodated, he adjusted his prediction to a doubling in complexity every two years rather than every year.
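The arithmetic behind both projections is plain exponential growth, and can be sketched in a few lines of Python. Note that the 1965 starting figure of roughly 64 components is an assumption read from Moore’s graph, not a number stated in this article:

```python
# Sketch of Moore's projections as simple exponential doubling.
# Assumption: roughly 64 components per IC in 1965 (read from Moore's
# graph; the figure is not stated in the article above).

def projected_components(start_year, start_count, year, doubling_period=1.0):
    """Project the component count, assuming one doubling per period (years)."""
    periods = (year - start_year) / doubling_period
    return start_count * 2 ** periods

# Moore's original 1965 prediction: doubling every twelve months
print(projected_components(1965, 64, 1975, doubling_period=1))   # 65536.0, i.e. ~65,000

# His revised 1975 prediction: doubling every two years
print(projected_components(1975, 65_536, 1985, doubling_period=2))  # 2097152.0
```

The revision matters more than it might look: at one doubling a year the count grows roughly a thousandfold in a decade, while at one doubling every two years it grows only 32-fold.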
Moore’s Law has been one of the guiding principles for the electronics industry for many years. It has tended to set the expectations not only for the IC design and fabrication industry, but also those designing equipment and even software.
Those within the industry have held onto Moore’s Law using it as a guiding principle for the past fifty years. But what are their views today?
Mike Salas, VP Marketing Ambiq Micro commented: “Moore’s Law is perhaps the most well-known guiding principle in the history of the semiconductor industry. However, it was the breakdown of a much lesser known law – Dennard scaling – that really puts into question the long-term viability of Moore’s Law.”
He explained: “Dennard scaling essentially stated that as transistors got smaller, their power density remained constant. Put another way, if there was a 2x reduction in transistor size (which would enable a doubling of the number of transistors on a chip as per Moore’s Law), Dennard scaling correspondingly called for the transistor power to be reduced by 4x (with both voltage and current being halved). Unfortunately, about 10 years ago this relationship broke down. Therefore, while it is still possible to double transistor count every two years, it is no longer possible to keep running these transistors at higher speeds due to the inability to correspondingly drop the voltage and the current they need to operate reliably.”
Salas continued: “The breakdown in this relationship has led to an interesting dichotomy. On one hand, since the portion of Moore’s Law that enables a doubling of transistor counts continues unabated, the industry still benefits immensely from the ability to develop increasingly smaller products. However, the inability to maintain the Dennard scaling relationship has led us to a point where energy consumption concerns have put an increasing amount of pressure on these product designs – which is only compounded as form factors continue to shrink. Thus, it is our belief that reducing energy consumption has actually replaced performance as the foremost challenge in electronic design since without addressing these energy problems, it is simply not possible to continue to rely on Moore’s Law. We believe that our unique subthreshold voltage technology offers one way out of this dilemma since by setting a goal of reducing the power consumption by at least half (if not more) every two years, we can at least work to narrow the ever-widening gap that exists between Moore’s Law and Dennard scaling.”
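The relationship Salas describes can be checked with a little ideal-scaling arithmetic. The sketch below is the classic textbook formulation of Dennard scaling, not Ambiq’s model: shrink every linear dimension, the supply voltage, and the current by a factor k, and the power per transistor falls by k² while transistor density rises by k², leaving power density constant.

```python
# Ideal (classic) Dennard scaling step by a linear factor k.
# Voltage and current each scale by 1/k, so per-transistor power
# (P = V * I) falls by k**2, while transistors per unit area rise by k**2.

def dennard_step(voltage, current, density, k):
    """Apply one ideal Dennard scaling step; returns the scaled values."""
    v = voltage / k
    i = current / k
    power = v * i                        # per-transistor power: down by k**2
    new_density = density * k ** 2       # transistors per unit area: up by k**2
    power_density = power * new_density  # unchanged: the two factors cancel
    return v, i, power, new_density, power_density

# k = 2: voltage and current halved, per-transistor power cut 4x,
# power density (watts per unit area) unchanged
v, i, p, d, pd = dennard_step(1.0, 1.0, 1.0, 2)
print(p, pd)   # 0.25 1.0
```

When the voltage can no longer be dropped with each shrink, the power term stops falling while density keeps rising, so power density climbs instead of staying flat, which is exactly the breakdown Salas describes.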
One of the issues that Gordon Moore had to accommodate in his predictions was the wide variety of ICs being developed. Although Moore commented specifically on the difference between microprocessors and memories when he revised his estimates in 1975, today there is an even greater variety, with devices such as mixed-signal ICs.
Alessandro Piovaccari, Senior Vice President and Chief Technology Officer, Silicon Labs said: “Mixed-signal ICs, defined as semiconductor devices that integrate significant analog and digital functionality, account for about one tenth of the global semiconductor market. Their success is driven by the fact that high levels of mixed-signal integration reduce overall system cost and greatly simplify the engineering required by system manufacturers, enabling them to focus on their core applications and get to market faster.”
Piovaccari continued: “Moore’s Law, which has been remarkably consistent for digital circuit design, doubling the number of transistors in a given area every two years, does not generally apply as well to analog circuits, where process technology scaling is mainly driven by the quality improvements of the process itself. Instead of relying on analog scaling, a more effective technique is to use a digital-centric approach leveraging the powerful capabilities of digital processing in fine-line digital CMOS processes to calibrate and compensate for analog imperfections and mitigate unwanted interactions. This approach improves the speed, precision, power consumption, and ultimately the cost and usability of mixed-signal devices.”
Moore’s Law effect on equipment and applications
Moore’s Law has not only been used as a guiding principle for the IC developers themselves. Those using the equipment and developing products have also looked to Moore’s Law to be a guiding principle about the levels of processing and integration they can expect.
Dr Truchard, Co-founder, President and CEO of National Instruments, said: “Over the last 50 years, Moore’s Law has driven significant advances in the technology that has gone into first PCs and now smart mobile devices and many other electronics. These advances have disrupted many traditional solutions. In each case, the role of software has grown in importance, with software-defined hardware platforms replacing the functionality of solutions that were once accomplished with dedicated and single-function hardware.”
Truchard continued: “This is also true in test and measurement, where a combination of ever improving CPU, FPGA, and ADC/DAC technologies and sophisticated application software is enabling the replacement of traditional test and measurement equipment. The result is higher performance and lower cost instrumentation. In fact, today’s PC-based, software-defined instruments not only achieve better analog performance than traditional instruments of the past – but consolidate multiple instrument functions into a single device.”
“Further, a software-based approach enables engineers to transparently upgrade systems to higher-performance hardware as Moore’s law advances. This allows them to preserve their software investment while gaining access to the latest measurement technology.”
Truchard further explained: “As a practical example, we routinely see customers improve the measurement speed of automated test systems by upgrading them to use the latest multicore CPU. Although the CPU is a small and relatively inexpensive element of the test system – the ability of an engineer to improve the performance of a measurement system while preserving their software investment is tremendously powerful. In fact, some system design software tools such as LabVIEW are inherently designed for parallel execution and immediately benefit from an increasing number of processing cores.”
“Just as we once saw the transistor and microprocessor obsolete vacuum tube-based instrumentation in the 1960s, we now see a similar technology transformation happening today. In the new transformation, the flexibility and raw performance of software-defined instruments are causing the obsolescence of traditional, fixed-function test and measurement equipment. Riding the wave of technologies that Moore’s law provides, software is driving a new era of instrumentation with higher performance and lower cost.”
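Truchard’s multicore point can be illustrated with a short sketch, here in Python rather than LabVIEW: a CPU-bound workload written as independent tasks spreads across however many cores the machine has, with no change to the task code itself. The `measure` function is a hypothetical stand-in for a real measurement computation.

```python
from concurrent.futures import ProcessPoolExecutor
import math
import os
import time

def measure(sample_count):
    # Hypothetical stand-in for a CPU-bound measurement computation
    return sum(math.sqrt(x) for x in range(sample_count))

if __name__ == "__main__":
    tasks = [200_000] * 16
    start = time.perf_counter()
    # ProcessPoolExecutor defaults to one worker per available core,
    # so the identical code runs faster on machines with more cores
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(measure, tasks))
    elapsed = time.perf_counter() - start
    print(f"{len(results)} tasks on {os.cpu_count()} cores in {elapsed:.2f} s")
```

Upgrading the CPU to one with more cores shortens the elapsed time without touching the program, which is the "preserve the software investment" effect described above.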
Future for Moore’s Law
For many years people have been predicting the demise of Moore’s Law. ICs cannot be scaled down in size forever, and in many ways they have already reached physical limits, having gone well beyond the bounds thought possible many years ago.
For the future, Moore’s Law will still be as influential as ever. The pure physics may not apply in the same way that it has in the past, but the underlying economics will remain a fundamental cornerstone of the electronics industry, driving innovation forwards and increasing equipment functionality. With chips moving from planar technologies towards 3D interconnected stacks, and with other new materials and techniques coming online, the spirit of Moore’s Law will remain at the centre of the electronics industry.
About the author
Ian Poole is the editor of Radio-Electronics.com. Having studied at University College London to gain his degree, he went on to a career in electronic development, working for companies including Racal. He became the hardware development manager at Racal Instruments, where he was in charge of the hardware development activities within the company. Later moving into freelance work as a consultant, he developed Radio-Electronics.com into one of the leading publications for professional electronics engineers. He is a Fellow of the Institution of Engineering and Technology and the author of over 20 books.