News

October 4, 2013

Stronglinks: Mechanisms that help ensure nuclear weapons remain safe

INSPECTION — Ray Ely, left, and Dennis Kuchar (both 2613) inspect launch accelerometer hardware before assembling units for critical flight tests. Their work is part of Sandia’s long-time effort on stronglinks. (Photo by Randy Montoya)

by Sue Major Holmes

Three engineers lean over a workbench in Sandia’s Integrated Surety Mechanisms lab, adjusting levers on a scale model as they check a design. In this case, the plastic model is several times larger than the real thing — tiny weapon parts that are hand-assembled, sometimes using a microscope and tweezers.

NNSA’s Kansas City Plant has built its 1,000th stronglink for the W76-1’s arming, fuzing, and firing (AF&F) system. A stronglink is a mechanism aimed at improving the safety of nuclear weapons without compromising their reliability. The first stronglinks were designed at Sandia in the 1960s, based on the premise that weapons must always work when needed, but never work otherwise.

“It’s all about never having a nuclear detonation unless the president has released the weapon for war,” says Integrated Surety Systems manager Marcus Craig (2613).

Weapons in the US stockpile are a deterrent “because our adversaries know they’ll work,” he says. “If they thought we were launching a weapon and there was any chance it wouldn’t work, they’d plan differently.”

The W76-1/Mk4A warhead, deployed on the Trident II D5 Strategic Weapon System on Ohio-class submarines, uses stronglinks in the AF&F, which provides the arming, fuzing, and firing functions of a nuclear weapon. Stronglinks will also be included in modernization programs for the W88 Alt and the B61 Life Extension Program (LEP).

Incorporating safety components

An AF&F consists of an impact fuze, an arming subsystem that includes the radar, a firing subsystem, and a thermal battery that powers the system. The firing subsystem, which provides energy to the detonators, incorporates three safety components: a launch accelerometer and two stronglinks.

The missile’s boost phase causes the launch accelerometer to close multiple circuits, but the accelerometer won’t close them if the missile doesn’t perform as required. Once the circuits close, the missile sends a signal, which then provides power to wake up and operate the system.

Stronglinks are in place in case something goes awry before or after launch, such as an accident that bypasses the launch accelerometer or other parts of the system and mistakenly supplies power that could set off the detonators, Marcus says.

He likens the firing system to a vault. Each stronglink receives a unique signal that opens the vault doors — shutters inside the weapon, one electrical and the other magnetic. Opening the shutters allows energy from the thermal battery to flow to a capacitor, which stores the energy capable of initiating the detonators. “That does not happen unless we get the right very, very unique signals that allow the shutter to open,” Marcus says.
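
The vault analogy can be sketched as simple gating logic. The toy model below is only an illustration of the always/never idea described in the article; the component names, signal values, and structure are invented for this sketch and bear no relation to the actual stronglink design.

```python
# Toy model of the "vault" logic described above. All names, signal values,
# and structure are illustrative inventions for this article; they do not
# reflect the actual stronglink design.

ACCEL_PATTERN = "boost-profile-ok"         # hypothetical launch-environment signal
ELECTRICAL_KEY = "unique-electrical-code"  # hypothetical unique stronglink signal
MAGNETIC_KEY = "unique-magnetic-code"      # hypothetical unique stronglink signal


def capacitor_may_charge(accel_signal, electrical_signal, magnetic_signal):
    """Energy flows from the thermal battery to the firing capacitor only if
    the launch accelerometer has closed its circuits AND both stronglink
    'shutters' have received their own unique signals."""
    accel_closed = accel_signal == ACCEL_PATTERN
    electrical_open = electrical_signal == ELECTRICAL_KEY
    magnetic_open = magnetic_signal == MAGNETIC_KEY
    return accel_closed and electrical_open and magnetic_open


# Normal launch: every signal matches, so the vault "doors" open.
print(capacitor_may_charge("boost-profile-ok",
                           "unique-electrical-code",
                           "unique-magnetic-code"))   # True

# Accident that mistakenly supplies power but not the unique signals: stays closed.
print(capacitor_may_charge("boost-profile-ok",
                           "stray-voltage",
                           "stray-voltage"))          # False
```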

These safety components use mechanical parts because such parts are highly predictable in an accident. But as with any system, the more that goes into it, the more opportunity there is for something to go wrong.

“So our challenge is trying to provide this required safety without affecting reliability,” Marcus says.

Effort involves mechanical engineering

The work touches many mechanical engineering disciplines. “We are very broad across disciplines, and because it’s a safety component we have to be deep in all these areas, too,” Marcus says.

It’s hard to imagine a better situation for mechanical design engineers, he says.

“We use fluid mechanics, we use strength of materials, we use materials science,” he says. “We have to understand the effects of manufacturing processes: welding, machining, soldering, encapsulation. We do a lot of complex tolerance analysis, we get into dynamics. As a mechanical engineer, all those classes you take in mechanical design, we practice most of them.”

With one generation of weaponeers retiring, the new guard takes over significant responsibilities, Marcus says.

To illustrate, he picks up a glass jar of jelly beans from a table in his office, describing it as a widget that includes many green, red, white, yellow, and multicolor parts, plus the glass that encases them, a sticker, and the lid. All of it requires a profound understanding of the individual pieces and how they work together as a final product.

“Some of these parts are nuclear safety-critical, and the engineers have to identify those parts and ensure that we rigorously understand their design and the materials and processes used in manufacturing the design. With this knowledge, we are confident the white jelly bean with a bit of yellow is exactly what it’s supposed to be. Then when the final assembly passes the rigorous tests we have in place, our product is ready to be integrated into the weapon that becomes part of the nation’s stockpile,” he says.

As an example, consider the jelly bean jar as a stand-in for the electrically activated stronglink, and think about an accident that crushes part of the weapon and burns away parts of its exterior. “What are we going to do to make sure that our safety components continue to keep the vault doors closed, even when there’s an accident? It takes a lot of analysis, knowledge, and engineering judgment,” Marcus says.

His young staff must design against the threat and ensure that the component’s function is not compromised — even in an accident — while meeting budgets and delivery schedules, he says.

Reliability requirements

Development work to modernize the W76-1 began about 2000. Production began in 2006 and will continue for years. Manufacturing the AF&F is complicated by the fact that reliability is paramount and every part must meet stringent performance and safety requirements.

“The ability to produce a design that’s manufacturable, repeatable, doesn’t affect reliability, etc., it’s pretty challenging,” Marcus says.

“We’re building hundreds and hundreds of these things and they all have to be identical and they all have to work,” he says. “It adds another layer of complexity to the design process.”

Every step is done by humans, not robots, from cleaning and testing parts to welding and assembling parts to encapsulating, soldering, and machining parts, Marcus says.

Improving production, design systems

The W76-1 program office holds weekly videoconferences with its Kansas City Plant counterparts and monthly reviews at Kansas City to address the current production status and issues. Studying the production process provides important insights that allow Sandia designers to improve their new designs for programs like the W88 Alt and the B61 LEP.

“We recently had a special review that asked the question, How is our production going, where are we seeing the challenges? Where are we finding ourselves revisiting manufacturing issues over and over again? Is there a recurring theme?” Marcus says. 

Production experience and reviews have shown the Sandia team where designs can be difficult to manufacture, and they help identify ways to improve those designs in the future, Marcus says. “We’re learning that there are opportunities with development programs now to really flesh out precisely what we’re measuring, what features in the design affect those measurements, and where small changes can make the design easier to manufacture and really provide benefit throughout the production cycle.”

The reviews have concluded the electrical stronglink design, with 180 unique parts and 250 separate pieces, is robust and manufacturable, and that small changes can make the next design even better.

-- Sue Major Holmes


Seeking solutions to Southwest water problems

N.M. Sen. Tom Udall, in opening remarks at the water roundtable, called for better coordination among large-scale water users, better sensors to detect water leaks as they happen or even before, and more research to determine the limits to Southwestern population growth. (Photo by Randy Montoya)

by Neal Singer

The day-long talks and panel sessions at the “Transformational Solutions for Water in the West” roundtable, sponsored by Sandia and two other agencies, did not come up with a dramatic, save-the-day answer to the persistent problem of overuse of a diminishing supply of potable water in the American Southwest. But the talks, moderated by Marianne Walck, director of Geoscience, Climate and Consequence Effects Center 6900, and Rob Leland, 1000 director, did reveal a number of partial solutions already in use that, if widely adopted, could make a significant dent in the widely recognized problem.

“There is no silver bullet, but there is silver buckshot,” commented Carlee Brown, policy manager of the Western Governors’ Association on Water and Wildlife.

The scope of the issue was delineated by Howard Passell (6926) in an opening talk.

 “The question is,” he summarized, “are there game-changing transformational solutions that don't just shift the [water-supply] gap; or is it a zero sum game involving an increasing population consuming ever-larger amounts of a shrinking resource?”

Conservation’s role

Presenting a calming view was John Entsminger of the Southern Nevada Water Authority, who said it was clear that increased population growth “doesn’t necessarily track with increased water use.” He cited a 33 percent drop in water use in some Southwestern areas, due to active (though costly) conservation.

Further words of comfort came from Ben Ruddell, a professor at Arizona State University. Answering audience suggestions that water pipes be constructed parallel to oil pipes entering the US from Alberta, Canada, or extended to the Southwest from the Mississippi, Ruddell said that “virtual water” transfer was already in effect in the form of fruits and vegetables grown elsewhere and sent to the Southwest. In this virtual pipeline, water is pre-used to grow plants that are then shipped elsewhere, so the national highway system is, in effect, already transferring water from wet to dry places. “Crops are the biggest source of virtual water,” he said, “and virtual water flows uphill to money.”

The new accounting, though thought-provoking, doesn’t solve the obvious problem of sinking water levels in Southwestern reservoirs.

A less palatable measure was reported to be putting more water in the taps of several localities: the purification of waste water for reuse as drinking water. The city of Albuquerque already purifies waste water for use on golf courses, but other communities have gone further, said a number of speakers. “People in New Orleans are drinking water estimated to have already passed through eight sets of kidneys,” said an audience member.

As unappetizing as that sounds, young people have already accepted the idea of water re-use far more easily than their parents did when it was proposed two decades earlier, said John Stomp, chief operating officer of the Albuquerque/Bernalillo County Water Utility Authority. (A fictional handheld device used to distill potable water from urine in the 1995 movie Waterworld elicited groans of disgust from an Albuquerque movie audience back then as actor Kevin Costner drank the result.) Stomp also mentioned that “the aquifer was rising under Albuquerque, through three years of the worst drought we’ve ever seen,” because of conservation and diversion efforts.

In a major conservation possibility, Vince Tidwell (6926) said that municipal waste water and brackish groundwater could be substituted for drinking water by retrofitting electrical power generators. His analysis found that more than half the nation’s power plants could be retrofitted, increasing power generation costs by less than 10 percent. “Many of these plants are located in the arid western US,” he said.

Others preached the benefits of cooperation among water agencies, rather than relying on what Max Yeh, principal researcher of the Percha/Animas Watershed Association, described as a “totally unique system of distributing water”— the Southwest water rights system. The privileges granted to the earliest users offer no incentive to cooperate or even to accept new users.

Agricultural use ‘part of our culture’

But Carlee Brown said, “It’s not a simple matter to move agricultural water to satisfy higher-value demand.” Agriculture and its attendant water use, she agreed, are 75 to 90 percent of water withdrawals, but “are part of our culture.”

The most cost-effective method to deal with a water-shortage problem, she said, involved meeting new demands through water transfers. This involves the voluntary sale, lease, or donation of intrastate supplies.

“A water bank can sell or lease water rights for a period of time,” she said. “It’s already occurring in a voluntary, market-based framework. It’s one tool in the tool kit.”

Other solutions, she said, lay in finding new supplies, conservation, re-use, or desalination.

One person pointed out that relying on the social mechanism of raising water prices to cut use would mean that poor people drank less water.

Jesse Roach (6926) agreed that employing a number of strategies “helps spread the pain,” but he pointed out that “evaporative demands rise with temperature,” so that increased evaporation from reservoirs, agriculture, and woodland and streamside areas meant less water available. “Simulations show reservoir levels dropping throughout the system as the impacts of climate change set in.”

More water for drier times?

He did offer the hope that, given the trend toward bigger storms, better controls in reservoirs could mean more water stored for later use in dry times.

Audience member Bill Turner, an Albuquerque businessman, told Lab News he had proposed an actual game-changing design to store Elephant Butte water underground, forestalling evaporation, but the plan was rejected by government agencies.

Mike Hightower (6114) said that “the number, size, and severity of forest fires have grown significantly in the US over the past four decades, and winter ablation in burned areas reduces snowpacks by 50 percent.” Therefore, he said, “A 10 percent reduction in precipitation equals a 20 percent reduction in runoff in the Southwest.”
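
One way to read those figures is as runoff responding, in percentage terms, roughly twice as strongly as precipitation changes. The relation below is only an illustrative reading of the quoted numbers, not a hydrological model presented at the roundtable.

```latex
% Illustrative reading only: runoff change scaling at roughly twice the precipitation change
\[
  \frac{\Delta R}{R} \;\approx\; 2\,\frac{\Delta P}{P}
  \qquad\Longrightarrow\qquad
  \frac{\Delta P}{P} = -10\% \;\;\Rightarrow\;\; \frac{\Delta R}{R} \approx -20\% .
\]
```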

He pointed out that costs to thin forests can be less than firefighting and damage costs. “If we don’t act, we may lose mountain watersheds and have no national forests in 50 years. But we have the technologies to do the necessary forest thinning.” 

Remarks by Sen. Udall

“The Rio Grande is the only river I’ve ever seen that needs irrigation,” opened keynote speaker Sen. Tom Udall, D-N.M., quoting a line often attributed to Will Rogers. Describing the big river as at times the “Rio Sand,”  the senator — an ardent conservationist — quoted a study to the effect that the current drought — the worst since the 1950s — combined with climate change could cost the nation a trillion dollars in economic losses. He talked about the need for better coordination among large-scale water users, better sensors to detect water leaks as they happen or even before, and more research to determine the limits to Southwestern population growth. While, he said, most observers believe the era of big government investments in dam building is over, he was co-sponsoring a bill to create a “smart water infrastructure” that would, among other benefits, better monitor stream flow and meter water used by irrigators.

-- Neal Singer


A better benchmark for supercomputer performance

A new benchmark to more accurately measure the capabilities of modern supercomputers has been crafted by Sandia researcher Mike Heroux (1426), in collaboration with Jack Dongarra, creator of the widely used LINPACK benchmark, and his colleagues at the University of Tennessee and Oak Ridge National Laboratory. (Photo by Randy Montoya)

by Neal Singer

A new benchmark to more accurately measure the capabilities of modern supercomputers has been crafted by Sandia researcher Mike Heroux (1426), in collaboration with Jack Dongarra, creator of the widely used LINPACK benchmark, and his colleagues at the University of Tennessee and Oak Ridge National Laboratory.

The new test — a relatively small program called HPCG, for “high performance conjugate gradient” — is being field-tested on a number of NNSA supercomputers. It is expected to be released at SC13, the 2013 supercomputing conference, in Denver in November.

Says Mike, “We have known for quite a few years that LINPACK was not a good performance proxy for complex modern applications. But we could still design a cost-effective machine that satisfied two increasingly distinct design points: real application performance and LINPACK performance. Thus we got improvements for our application base and still posted a good TOP500 number [that certified the new machine was one of the 500 fastest in the world].” 

But, he says, the two goals have diverged far enough that, like classroom teachers rebelling against “teaching for the test” rather than improving overall knowledge, “computer designers feel that designing for both is no longer appropriate.” 

The National Nuclear Security Administration has supported work on the new test because, Mike says, “while NNSA realizes it needs to invest in new supercomputers over the coming decades, it is unwilling to spend public money to develop architecture solely to do well on LINPACK. NNSA wants a more meaningful measure.”

LINPACK’s influential semi-annual TOP500 listing of the 500 fastest machines has been noted worldwide for two decades, initially because it was considered a simple and accurate metric, readily understood and appreciated by non-experts.

“The TOP500 was and continues to be the best advertising supercomputing gets,” Mike says. “Twice a year when the new rankings come out, we get articles in media around the world. My 6-year-old can appreciate what it means.” 

However, in recent years the gap between LINPACK performance and real applications performance has grown dramatically. 

In the early years of supercomputing, applications and problems were simpler, better matching the algorithms and data structures of LINPACK. Since then, applications and problems have become much more complex, demanding a broader collection of capabilities from the computer system than LINPACK.

“The specifications of the LINPACK benchmark are like telling race car designers to build the fastest car for a completely flat, open terrain,” Mike says. “In that setting the car has to satisfy only a single design goal. It does not need brakes, a steering wheel, or other control features, making it impractical for real driving situations.”

The LINPACK benchmark pushes computer designers to build systems that have lots of arithmetic units but very weak data networks and primitive execution models.

“Because modern applications cannot use all the arithmetic units without better access to data and more flexible execution models,” Mike says, “the extra arithmetic units are useless.” 

Mike led development of the new benchmark, starting with a teaching code he wrote to instruct students and junior staff members on how to develop parallel applications. This code later became the first “miniapp” in the Mantevo project, which recently won a 2013 R&D 100 Award.

The technical challenge of HPCG is to develop a very small program that captures as much of the essential performance of a large application as possible without being too complicated to use. “We created a program with 4,000 lines that behaves a lot like a real code of 1 million lines but is much simpler,” Mike says. “If we run HPCG on a simulator or new system and modify the code or computer design so that the code runs faster, we can make the same changes to make the real code run faster. The beauty of the approach is that it really works.”

HPCG generates a large collection of algebraic equations that must be satisfied simultaneously. The conjugate gradient algorithm used in HPCG to solve these equations is an iterative method. It is the simplest practical method of its kind, so it is both a real algorithm that people care about and one that is not too complicated to implement.
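
For readers curious what such an iterative method looks like, here is a minimal textbook conjugate gradient sketch in Python for a small symmetric positive-definite system. It is illustrative only and is not the HPCG reference implementation.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    """Minimal textbook conjugate gradient for A x = b, with A symmetric
    positive definite. Illustrative only; not the HPCG reference code."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Small example: a symmetric positive-definite system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))  # approximately [0.0909, 0.6364]
```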

One basis of the method’s relevance is that it uses data structures that more closely match real applications. The data structures used by LINPACK are no longer used for large problems in real applications because they require the storage of many zero values. Decades ago, when application problems and computer memory sizes were much smaller, LINPACK data storage techniques were acceptable. Today, problem sizes are so large that data structures must keep track of which values are zero and which are not, which is what HPCG’s data structures do.
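
The storage difference can be seen in a small sketch. The example below uses NumPy and SciPy’s compressed sparse row (CSR) format to contrast dense storage, which keeps every zero, with a sparse layout that keeps only the nonzeros plus index arrays; the matrix is a made-up example, not HPCG’s actual problem.

```python
import numpy as np
from scipy.sparse import csr_matrix

# Dense storage keeps every entry, including the zeros.
dense = np.array([
    [4.0, 0.0, 0.0, 1.0],
    [0.0, 3.0, 0.0, 0.0],
    [0.0, 0.0, 5.0, 2.0],
    [1.0, 0.0, 2.0, 6.0],
])

# Compressed sparse row (CSR) storage keeps only the nonzeros,
# plus index arrays that say where they live.
sparse = csr_matrix(dense)
print(sparse.data)     # the 8 nonzero values
print(sparse.indices)  # column index of each nonzero
print(sparse.indptr)   # where each row starts in the data array

# Sparse matrix-vector products touch only the stored nonzeros.
x = np.ones(4)
print(sparse @ x)      # same result as dense @ x
```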

“By providing a new benchmark, we hope system designers will build hardware and software that will run faster for our very large and complicated problems,” Mike says.

Testing done so far indicates the approach will work. More formal testing by November will show whether Mike, Sandia, and the collaborating labs have a product worth its salt.

The HPCG code tests science and engineering problems involving complex equations, and is not related to another Sandia-led benchmark effort known as Graph 500, which assesses and ranks supercomputers on “big data” problems that search for relationships through graphs.

 

-- Neal Singer

