FOR IMMEDIATE RELEASE June 5, 1996
ALBUQUERQUE, NM -- A computer program revised to run up to 300 times faster, the better to simulate radiation effects of nuclear explosions, has improved the prospects for more precise radiation treatment of cancer patients, say scientists at Sandia National Laboratories.
The sped-up program, which makes possible the use of a complex statistical method, also should improve analyses of nuclear reactor safety and the vulnerability of orbiting satellites to radiation from the Van Allen radiation belts in outer space, as well as aid in the detection of military landmines.
Electron-beam welding and high-temperature ceramic joining are among other processes that should benefit from the advance.
Sandia computational scientist Greg Valdez modified a Sandia program intended to run on a single computer so that it distributes work among a number of computers working simultaneously toward a solution, a method called parallel processing.
The program he adapted is the Integrated Tiger Series (ITS), available publicly since 1985.
The complex statistical method is called Monte Carlo electron-photon transport, the most effective tool to simulate the interaction of radiation -- composed of billions upon billions of particles -- with complex material objects.
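The idea behind Monte Carlo particle transport can be illustrated with a toy sketch (this is not the ITS code, and the geometry and coefficient values below are invented for illustration): the free path a photon travels in a uniform absorber is exponentially distributed, so simulating many random photon histories estimates how much radiation penetrates a slab of material.

```python
import math
import random

def transmitted_fraction(mu, thickness, n_histories, seed=0):
    """Estimate the fraction of photons crossing a uniform slab.

    mu          -- attenuation coefficient (per cm); illustrative value
    thickness   -- slab thickness (cm)
    n_histories -- number of random photon histories to simulate

    Each history samples a free path length -ln(U)/mu from the
    exponential distribution; the photon escapes if that path exceeds
    the slab thickness.  Scattering is ignored in this toy model.
    """
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_histories):
        # 1 - random() lies in (0, 1], so the logarithm is always defined.
        path = -math.log(1.0 - rng.random()) / mu
        if path > thickness:
            escaped += 1
    return escaped / n_histories

# For this simplified problem the analytic answer is exp(-mu * t),
# so the estimate converges toward it as histories accumulate --
# which is also why the statistical uncertainty shrinks only slowly.
estimate = transmitted_fraction(mu=0.2, thickness=5.0, n_histories=100_000)
print(estimate, math.exp(-0.2 * 5.0))
```

The slow convergence is the point of the quote that follows: halving the statistical uncertainty requires roughly four times as many histories, which is what makes raw computing speed decisive.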
Faster simulations of the effects of nuclear explosions lessen the need for underground tests forbidden by international treaty.
“These are merely a few of the more important examples from among many simulations made more feasible that otherwise would have been difficult, if not impossible, without the advance in computational speed,” says Sandia physicist John Halbleib.
The Monte Carlo method in principle can be used to maximize the dose to cancerous tissue while minimizing the dose to healthy tissue. But in clinical settings, at current computational speeds, it takes a “prohibitive” amount of computing time to lessen “the statistical uncertainties inherent in the method,” says Halbleib.
Says Valdez, “Tumors change shape and position over a waiting time of weeks or even days. So their treatment by medical therapists must employ less accurate, but faster, alternatives that combine Monte Carlo data obtained from much simpler geometries.
“Our new code system will permit hospitals to do a more detailed and accurate analysis at a speed closer to real-time, when the code becomes publicly available,” says Valdez.
Valdez's parallelization program, in a series of tests at the Maui High Performance Computing Center in early 1996, was able to increase the speed of computation nearly linearly with each processor activated on an IBM SP2 until approximately 180 processors were included. The code then solved problems 165 times faster than a single computer. Adding still more processors -- the machine has 400 -- increased the computational speed further, but at a slower rate. For example, the process, which was 92 percent efficient at 180 processors, dropped to only 77 percent efficiency with 360 processors. However, the increase in computational speed was still dramatic -- 278 times faster than one computer working alone.
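The efficiency figures quoted above follow from a standard definition: parallel efficiency is the achieved speedup divided by the number of processors (the ideal speedup). A quick check, using only the numbers in the paragraph:

```python
def parallel_efficiency(speedup, n_processors):
    """Parallel efficiency: achieved speedup / ideal (linear) speedup."""
    return speedup / n_processors

# Figures quoted in the release:
print(round(100 * parallel_efficiency(165, 180)))  # 92 percent at 180 processors
print(round(100 * parallel_efficiency(278, 360)))  # 77 percent at 360 processors
```

Both values match the percentages reported from the Maui tests.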
The code's answers check out to eight decimal places with the simpler, single-computer TIGER code used as a benchmark.
The codes are being modified further for defense purposes to run on still-faster computers that reach teraflop speeds “to take advantage of the hardware developed for the Department of Energy's Accelerated Strategic Computing Initiative,” said James R. Lee, manager of Sandia's simulation technology research department.
The earlier code is used by at least several hundred institutions worldwide. It is distributed by the Department of Energy's computer code center at Oak Ridge National Laboratory and the NEA Data Bank in France.
“The largest number of users are concerned with weapon radiation effects work or medical physics for cancer treatment,” says Halbleib, an original developer of the ITS system. “One of our objectives is to make this system more practical in a clinical setting.”
The parallel program will run on any combination of UNIX workstations that use TCP/IP (Transmission Control Protocol/Internet Protocol) with a publicly available program called PVM. This creates a virtual parallel-processing computer of any number of computational units, or nodes. The computational units need not all be located at the hospital or clinic, but may be joined from different locations via telephone lines.
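Monte Carlo histories are statistically independent, which is why the work divides almost perfectly across nodes. A minimal sketch of the master/worker pattern, using Python's multiprocessing module purely as a stand-in for a PVM virtual machine (the slab problem and its parameters are invented for illustration):

```python
import math
import random
from multiprocessing import Pool

def run_histories(args):
    """Worker: simulate one node's batch of photon histories through a
    uniform slab, returning (escaped, total) tallies for that batch."""
    n, mu, thickness, seed = args
    rng = random.Random(seed)
    escaped = sum(
        1 for _ in range(n)
        if -math.log(1.0 - rng.random()) / mu > thickness
    )
    return escaped, n

def parallel_estimate(n_total, n_nodes, mu=0.2, thickness=5.0):
    """Master: split n_total independent histories evenly across
    n_nodes workers, then merge the tallies -- the same
    split-and-combine pattern a virtual parallel machine uses
    across networked workstations."""
    batch = n_total // n_nodes
    jobs = [(batch, mu, thickness, seed) for seed in range(n_nodes)]
    with Pool(n_nodes) as pool:
        tallies = pool.map(run_histories, jobs)
    escaped = sum(e for e, _ in tallies)
    simulated = sum(n for _, n in tallies)
    return escaped / simulated

if __name__ == "__main__":
    # Combined estimate approaches the analytic exp(-mu * thickness).
    print(parallel_estimate(100_000, 4))
```

Because each worker only reports a small tally back to the master, communication costs stay low relative to computation, which is what allows near-linear scaling until coordination overhead catches up.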
The cooperation of the Department of Defense, the Air Force's Phillips Laboratory, and the University of New Mexico in permitting use of the massively parallel Maui machine was essential, says Valdez: “The efforts of the entire staff at the Maui High Performance Computing Center, especially Dr. Lon Waters, an applied mathematician, were crucial.”
Sandia National Laboratories is a multiprogram national laboratory operated by a subsidiary of Lockheed Martin Corporation for the U.S. Department of Energy. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has broad-based research and development programs contributing to national defense, energy and environmental technologies, and economic competitiveness.
Media Contact: Neal Singer, 505-845-7078; internet: email@example.com
Technical Contact: Greg Valdez, 505-845-7798, firstname.lastname@example.org; John Halbleib, 505-845-7064, email@example.com; Ron Kensek, 505-845-7642, firstname.lastname@example.org
Illustrative graphics available upon request.