Richard (Rich) Murphy (1422) has been identified as a “person to watch,” not by the CIA but by the relatively venerable online computing magazine HPCwire, which each year names a handful of researchers its editors believe to be doing the world’s most interesting work in supercomputing.
Rich is principal investigator for Sandia’s X-caliber project, a DARPA-funded high-performance computing effort to radically lower the power usage of computer systems at all scales by 2018.
“If we don’t solve the power problem,” Rich says, “we’ll have to stop building bigger, faster supercomputers, or they’ll become resources that cost as much to use as superconducting supercolliders, which will really limit their impact.”
Rich also led the launch this year of the newly created Graph500 test, an internationally used benchmark tool that offers an alternative to the Linpack benchmark in measuring the ability of computers to manipulate large-scale data sets (Lab News, Nov. 3, 2010, and Dec. 6, 2010).
“In the past, we designed supercomputers to do physics — that’s why FLOPS are so important — but this new kind of test measures memory access and the ability to marshal huge data sets efficiently,” he says. “Graph500 is a test for a totally new area.” Such areas can be found, for example, in following the huge number of barrels of oil in transit around the world today in ships, or keeping track of the medical records of every patient in the US.
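The core kernel that Graph500 times is a breadth-first search across a very large synthetic graph, with performance reported in traversed edges per second (TEPS) rather than FLOPS. As an illustration only, here is a minimal Python sketch of that kind of kernel, run on a small toy adjacency list rather than the benchmark’s enormous generated graphs:

```python
from collections import deque

def bfs(adjacency, source):
    """Return the BFS parent of every vertex reachable from source.

    This is the shape of the Graph500 search kernel: the work is
    dominated by irregular memory accesses into the adjacency data,
    not by floating-point arithmetic.
    """
    parent = {source: source}
    frontier = deque([source])
    while frontier:
        v = frontier.popleft()
        for neighbor in adjacency.get(v, ()):
            if neighbor not in parent:   # first visit: record parent, expand frontier
                parent[neighbor] = v
                frontier.append(neighbor)
    return parent

# Toy undirected graph with edges 0-1, 0-2, 1-3, 2-3, 3-4
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(bfs(graph, 0))
```

On a machine being benchmarked, the interesting question is not whether this search finishes but how fast the memory system can chase the pointers it generates, which is why the test stresses memory access rather than arithmetic.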
In a sense, Rich says, the goal is to go from using supercomputers to simulate a hypothesis to building supercomputers capable of generating a hypothesis.
“In the example of medical informatics, we know that genetics plays a role in how certain drugs or courses of treatment work. When moving these things from clinical trials to much larger populations, these techniques could be used to figure out how to personalize courses of treatment based on genetic or environmental factors. We could use knowledge discovery to figure out in very specific populations how effective a new medicine is, and actually recommend courses of action.”
The transition to exascale will be challenging, Rich says. “Unlike the tera-to-petascale transition, we know we can’t just scale commodity architectures: the barriers have to do with fundamental physics. Perhaps even more significantly, the tasks we want the computer to achieve are changing. It’s not just 3-D physics anymore. This changes the computer’s architectural requirements and how we design the system.
“But I think we have to have this capability to maintain our national competitiveness.”
To come up with such thoughts, it helps to have had a nerdish childhood. Rich built his first network protocol and programming language in high school, and had an early idea of building a three-dimensional online world with the goal of selling stuff on it — “think Second Life crossed with Amazon before Second Life existed”— but that bubble burst before he could implement it, he says.
He’s one of the few people in the 168-year history of Notre Dame to hold four degrees from that institution — a Bachelor of Science in computer science, a Bachelor of Arts in government, and a master’s and doctorate, both in computer science and engineering.