Beyond Moore computing

By Stephanie Holinka

Photography By Randy J. Montoya

The Beyond Moore Computing Research Challenge examines what happens when the techno-economic trend that has held true for decades, and driven the electronics revolution, comes to an end.

Moore’s Law is named for Intel co-founder Gordon Moore, who predicted in a 1965 paper that the number of transistors per square inch of integrated circuit would double every year. He said at the time that the growth rate would continue for the foreseeable future.

For decades, Moore’s Law steadily increased microelectronic chip performance, integrating video recorders, personal digital assistants, computers, radios, clocks, GPS and more into smaller and smaller devices, at relatively low consumer cost and with no increase in the devices’ power needs. It enabled far more than that; it produced an electronics golden age.

Moore’s Law is more an economic theory than a technological one, since it describes the innovation cycle that brings increasing computing power to the world. “Right now, companies fund increasingly large investments in R&D, leading to a technical leap in chip performance, which can then sell for a lot of money, which funds the next investment and the next technical leap,” says Research Challenge senior manager Rick McCormick. “After dozens of these cycles over the last five decades, the leaps are getting much harder and orders of magnitude more expensive, so it’s increasingly difficult.”

A growing information economy

Sandia researchers are studying the brain, including these green fluorescent protein-labeled neurons in a mouse neocortex, with the aim of developing neuro-inspired computing systems.

Consumers tend to view the technology progression as a “law” and want more and more. Computing is woven into almost all aspects of business and personal life, and the growth in consumption is exponential. “Throughout the first couple dozen innovation cycles, we got the performance increase using the same power, and at little increase in cost to the consumer. Who could resist that? It has enabled an information age and our growing information economy,” McCormick says.

But over the past decade, the $300 billion semiconductor industry has run into both economic and technical speed bumps on the information highway. Dennard scaling, which kept the power density on a chip constant while the number of transistors and performance increased, began failing around 2006. That failure stopped the increases in clock speeds that marked the prior cycles. Chip performance improvements have continued, but now come mainly from adding more transistors and from clever chip architectures (more processing “cores”). Transistor scaling and novel micro-architectures could continue current performance improvement trends for another six to 10 years, though fabrication costs may slow the cycles. But projected trends in “computing consumption” may outstrip this progress.
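As a back-of-the-envelope sketch (not from the article; the scaling factor is a made-up example), classic Dennard scaling works roughly like this: shrinking linear feature size by a factor k lets voltage and current each scale down by 1/k, so power per transistor drops by 1/k² while transistor density rises by k², leaving power density roughly constant:

```python
# Illustrative arithmetic only, not a device model.
k = 1.4  # assumed linear scaling factor per process generation

# Voltage and current each scale by 1/k, so power per transistor ~ 1/k^2
power_per_transistor = (1 / k) * (1 / k)

# Transistors per unit area scale by k^2
density = k * k

# Power density (watts per unit area) stays constant under ideal scaling
power_density = power_per_transistor * density
print(round(power_density, 6))  # → 1.0
```

When the voltage scaling in this idealized picture stopped (voltages could no longer drop with each generation), shrinking transistors no longer kept power density flat, which is why clock speeds plateaued.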

“Without something new, in about 15 years, information and communications technology could go from using 3.5 percent of the world’s electricity to nearly 30 percent,” McCormick says.
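The projection above implies a steep compound growth rate. As a quick sanity check (my own arithmetic, using only the figures quoted in the article):

```python
# The article's projection: ICT could grow from 3.5% to ~30% of world
# electricity use in about 15 years. What annual growth rate does that imply?
start_share = 3.5   # percent of world electricity today (from the article)
end_share = 30.0    # projected percent in ~15 years (from the article)
years = 15

# Implied compound annual growth rate of ICT's share
cagr = (end_share / start_share) ** (1 / years) - 1
print(f"Implied annual growth in ICT's electricity share: {cagr:.1%}")
# → about 15% per year
```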

“We need a new path to scaling, to find a new way to support steady performance and energy improvements, as we’ve grown accustomed to. That’s a lofty goal,” McCormick says. “It took us about 60 years to get here. We can’t afford another 60 years to find the new approach, or suffer the economic and national security risks associated with getting left behind.”

Computing beyond Moore’s Law in the 2030s will take revolutionary new devices, manufacturing processes, architectures and algorithms, in addition to the projected progress of current approaches. A great deal of research is underway worldwide on the problem, including at Sandia.

Tough, revolutionary innovations

Kathye Chavez works on Sandia’s Red Sky supercomputer, one of the fastest computers in the world.

“It’ll take a lot of research — deep material science, device physics, circuits, chips, packaging, component qualification, architectures, and Sandia does all of that at a fairly impressive level,” McCormick says. “But Sandia really excels at taking tough, revolutionary innovations and making them practical. At our heart, we’re an engineering lab. We do a lot of science, but where we excel is at transitioning fundamental science into applications, things like figuring out how to manufacture, even at low volume, and figuring out how to make things reliable and resilient. We do a better job at coupling together the pieces, and that’s unique.”

The Research Challenge has held two town hall meetings, and a team of several dozen Sandia scientists is looking to develop a method to unify a number of disparate research efforts. The aim is to make the research more efficient by systematically validating early-stage work before projects undergo years of research. Researchers would then know early in the process whether a new device proposal would have a large energy impact on a particular architecture running a particular application. Sandia has broad and deep expertise at all levels of this framework and a proven ability to merge modeling and experiment to deliver the sort of multi-scale design codes that could accelerate research in the area.

The Research Challenge has expanded rapidly, including an opportunity to help lead a national lab team that presented the energy challenge at the Energy Department’s Big Idea Summit last spring. The idea was well received, and continued collaboration between the labs has led to discussions with industry, academia and other government agencies about potential initiatives to address this critical issue.