
Sandia Lab News

Vol. 53, No. 23        Nov. 16, 2001
Sandia National Laboratories

Albuquerque, New Mexico 87185-0165    ||   Livermore, California 94550-0969
Tonopah, Nevada; Nevada Test Site; Amarillo, Texas


Labs experts study building vulnerability
Nanoscience center advances
DAKOTA software to be available on web -- for free
Soil moisture sensor could aid irrigation


Labs experts study building vulnerability to chemical and biological attack


By Chris Burroughs

With anthrax making its deadly appearance on the East Coast, one question keeps emerging in the minds of many Americans: How can we protect our buildings from chemical and biological attacks?

A Sandia team led by Richard Griffith (9117) has developed modeling and simulation tools for assessing the threat and vulnerability of buildings to such assaults. This includes looking at how chemical and biological agents move and deposit inside a building, developing and assessing mitigation strategies, guiding the use of detection methods, and examining the effectiveness of cleanup and decontamination efforts.

Richard began working on the project following the 1995 sarin gas release in Tokyo's subway system, where it became apparent that chemical and biological attacks by terrorists could be a trend of the future. Over the last few years a large team of Sandians, scattered across several divisions and both sites, has worked to develop and apply these modeling and analysis capabilities. Interest from government agencies and others in the work has skyrocketed since Sept. 11.

Using sophisticated Sandia-developed computer modeling and visualization capabilities, the team can simulate how various chemical and biological agents -- such as anthrax, smallpox, sarin, and mustard gas -- flow through a building and deposit on various surfaces.

"We start by mapping out the building and creating a computational model from the electronic AUTOCAD blueprints, including all the rooms and areas served by each air handler and all the air ducts," Richard says. "Then we simulate the release of a chemical or biological agent directly into different parts of the building, or from the outside for exterior releases."

The computer model, known as KCNBC, predicts where the agent will move as a function of time following its release, producing a movie that gives researchers a view of agent transport and concentration. Simulations include a variety of agent release scenarios using real properties for a number of chemical and biological materials.
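
The article doesn't publish KCNBC's internals, but the behavior it describes matches a standard multi-zone transport model: each room is treated as well mixed, HVAC flows carry agent between rooms, and the agent deposits on surfaces at a first-order rate. A minimal sketch of that general technique, with every volume, flow rate, and release rate invented for illustration:

```python
# Minimal multi-zone (well-mixed room) transport sketch. Rooms exchange
# air through HVAC flows; the agent also deposits onto surfaces at a
# first-order rate. All numbers are illustrative, not KCNBC parameters.
import numpy as np

volumes = np.array([200.0, 150.0, 300.0])    # room volumes, m^3
# hvac[i, j] = airflow from room i to room j, m^3/min
hvac = np.array([[ 0.0, 20.0,  5.0],
                 [20.0,  0.0, 10.0],
                 [ 5.0, 10.0,  0.0]])
k_dep = 0.01                                  # deposition rate, 1/min
release = np.array([50.0, 0.0, 0.0])          # mg/min released in room 0

def step(conc, source, flow, dt=0.1):
    """Advance room concentrations (mg/m^3) one explicit-Euler step."""
    inflow = flow.T @ conc                    # mg/min entering each room
    outflow = flow.sum(axis=1) * conc         # mg/min leaving each room
    dconc = (inflow - outflow + source) / volumes - k_dep * conc
    return conc + dt * dconc

conc = np.zeros(3)
for minute in np.arange(0.0, 30.0, 0.1):
    src = release if minute < 5.0 else np.zeros(3)  # 5-minute release
    conc = step(conc, src, hvac)
print("concentrations after 30 min (mg/m^3):", conc.round(3))
```

A production tool such as KCNBC would add duct networks, filters, agent-specific deposition, and finer-grained flow physics; the sketch shows only the basic bookkeeping of where released material goes over time.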

Modeling applied to several facilities

This modeling capability has been applied to several facilities, including an eight-story federal courthouse, a military command and control center, and a large airport terminal building.

And since Sept. 11 Richard has received numerous inquiries from government agencies and others about how to assess the chem/bio threat and the vulnerabilities of their buildings.

The information produced by the computer simulations becomes extremely valuable in determining cost-effective mitigation strategies, deciding where to put agent detection sensors, setting sensor performance requirements, and planning cleanup and decontamination tactics.

In the area of mitigation, for example, the simulations might provide insight into whether some or all of the air handlers should be turned off in a contaminated building, whether it would be effective to purge the contaminated air and pump in fresh air, whether the agent could be contained or isolated using the HVAC (heating, ventilation, air conditioning) system, or whether filters or neutralizers should be used. The benefits of any given mitigation strategy can be assessed to help pick the most useful or cost-effective approaches to protecting the building.
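
Continuing the toy sketch above (it reuses `step`, `hvac`, and `release` from that block), the kind of comparison this paragraph describes -- run the same release under different HVAC responses and score each option -- might use time-integrated concentration as a crude proxy for occupant exposure:

```python
# Score two mitigation options for the same release by each room's
# time-integrated concentration (mg*min/m^3), a crude dose proxy.
def total_dose(flow, minutes=60.0, dt=0.1):
    conc, dose = np.zeros(3), np.zeros(3)
    for t in np.arange(0.0, minutes, dt):
        src = release if t < 5.0 else np.zeros(3)
        conc = step(conc, src, flow, dt)
        dose += conc * dt
    return dose

print("dose, air handlers on: ", total_dose(hvac).round(1))
print("dose, air handlers off:", total_dose(np.zeros_like(hvac)).round(1))
```

With the handlers off, the release room accumulates a high dose but the other rooms stay clean; with them on, the agent spreads but dilutes. Which trade-off is better is exactly the question the real analyses weigh.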

The modeling data could also guide the optimal placement and use of the sensors used to detect chemical and biological agents.

"These new sensors are expensive, and may have significant installation and maintenance costs," Richard says. "If you can have only five or six of them to help protect a large building, you have to figure out the most effective places to put them."

The question then arises of how sensitive the sensors must be and how fast they must respond. For example, asks Richard, would a sensor with a five-minute response time help, and where would fast or slow sensors be appropriate? What sensor sensitivity is needed to effectively protect the building by initiating active responses? The modeling and simulation tools help answer those questions.
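
The article doesn't say how Sandia turns its simulations into placements, but a common formulation is to precompute, for each simulated release scenario, when a sensor in each candidate room would alarm, then greedily pick the rooms that minimize the average time to first detection. A toy sketch with invented detection times:

```python
# Greedy sensor placement: detect_time[s][r] = minutes until a sensor
# in room r would alarm for release scenario s (inf if the local
# concentration never crosses the sensor's threshold). These numbers
# are invented; in practice they would come from transport simulations
# plus an assumed sensor sensitivity and response delay.
import math

detect_time = [
    {0: 2.0, 1: 6.0, 2: math.inf},   # scenario A: release in room 0
    {0: math.inf, 1: 3.0, 2: 4.0},   # scenario B: release in room 1
    {0: 8.0, 1: 9.0, 2: 1.0},        # scenario C: release in room 2
]

def mean_detection(rooms):
    """Average time to first alarm across scenarios for this layout."""
    return sum(min(t[r] for r in rooms) for t in detect_time) / len(detect_time)

def greedy_placement(candidates, k):
    chosen = []
    for _ in range(k):
        best = min((r for r in candidates if r not in chosen),
                   key=lambda r: mean_detection(chosen + [r]))
        chosen.append(best)
    return chosen

print(greedy_placement([0, 1, 2], k=2))   # -> [1, 2] with these numbers
```

The same table answers the response-time and sensitivity questions: slower or less sensitive sensors simply shift the detection times upward, and rerunning the placement shows whether five or six units still cover the building acceptably.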

Cleanup and decontamination efforts could also benefit from computer modeling. The models can predict agent deposition on floors, walls, ceilings, ducts, and other surfaces in every room of the building, telling researchers where the agent could go and what areas of a building could be the most contaminated. The information could allow cleanup efforts to be focused on the most contaminated areas, giving insight about where cleanup efforts should begin and where they are less needed.

"Smart building"

Finally, Richard says, the modeling and analysis tools are critical technologies in creating a "smart building," one that integrates sensor information and observations from human security to "know" what is happening in and around the building, and then uses predictive modeling and decision-making algorithms to choose the most effective responses to protect building occupants. When sensors detect the presence of an agent, a "smart building" would choose the best HVAC system response to create clear evacuation paths or areas of the building where occupants could take shelter, provide real-time instructions to occupants and first responders, and minimize the amount of the building that was contaminated.

"There are a lot of things we can do to make sure that buildings in America are safe from chemical and biological agent attack," Richard says. "These modeling and analysis tools can play a key role."

-- Chris Burroughs


Sandia/Los Alamos Center for Integrated Nanotechnologies moves forward


By Neal Singer

With an urgency not often seen from Washington, the expected date for the conceptual design report for the Sandia/Los Alamos Center for Integrated Nanotechnologies (CINT) has been moved up from January 2002 to this month.

DOE review of the fledgling program is now expected to take place Dec. 10-11, rather than in February, says Terry Michalske (1040), interim CINT director. The speed-up involves "the need to be prepared to accept FY02 capital funds," he says.

Nanotechnologies operate at sizes approximately a thousand times smaller than microtechnologies. (Nano refers to a nanometer, a billionth of a meter.)

Nanotechnologies are of interest not because they can be used to make very small structures, says Terry, but because at these scales the commonly understood properties of ordinary materials change dramatically -- and, it is hoped, usefully -- for those who can integrate these new capabilities into the micro- or even macroworld.

Silicon, for example, in nanosized clumps emits light, offering a new realm of operation for a mainstay semiconductor material widely known for controlling electrons but not photons. The fluidity and friction of apparently well-characterized materials change unpredictably. Gold and copper become as hard as ceramics.

It's like alchemy

"It's like alchemy," says Terry. "You start with the same material and make it do something completely new. We know a lot, yet much of the nanoworld remains a mystery. This is why nanoscience has captured the interest of the scientific community."

Proof of wide interest in the science and the proposed Center came when a CINT planning workshop, held in Albuquerque Sept. 28-29 and expected to attract perhaps 60 participants, attracted applications from more than 200 scientists from 24 states (48 universities, 14 companies, 12 laboratories) as well as a smattering of researchers from abroad. "People we hadn't accepted appeared at the door, apparently under the assumption that if they traveled a long distance and just showed up, we wouldn't turn them away," says Terry. Hotel capacity regulations limited official attendance to 168.

The 80,000-square-foot Center is expected to be located outside Kirtland AFB north of the Eubank gate and west of the Research Park on DOE-owned land. It will be open to researchers -- visiting scholars, postdoctoral associates, graduate research assistants, and undergraduate interns -- from around the world who have received appropriate DOE approval, says Terry.

Sandia and Los Alamos researchers who could not attend the workshop will be briefed at their respective labs in meetings to take place before the holiday break.

$400 million doesn't get you in the door

The amount authorized by DOE to build two facilities -- $45-$50 million in Albuquerque for CINT's core facility, $15-$20 million at Los Alamos for a CINT Gateway -- is not a large amount on the world stage, where European countries and others have committed hundreds of millions of dollars to construct and staff similar centers. What CINT will have that these centers do not is access to already-in-place facilities that, says Terry, dwarf what almost any new center could marshal. "Four hundred million dollars doesn't even get you in the door," he says.

These include, through deliberately constructed "gateways" into the two national labs, access to Sandia's Microelectronics Development Lab and Compound Semiconductor Research Lab, as well as its capabilities in materials synthesis, scanning probes, and theory and simulation computers and personnel. Access through the gateway to Los Alamos means connections with LANL's biosciences resources, its National High Magnetic Fields Lab and Neutron Science Center, its scanning probes, and its theory and simulation centers.

The "gateways" are not mere terms but physical locations -- at Sandia, Bldg. 897, and at Los Alamos, to be built -- at which outside personnel can write proposals and conduct research with CINT staff.

At the core facility off Eubank Blvd., office suites for staff and visitor accommodations will include computer bays and communication links in 15,000 square feet of space. The core facility will also include wings in which to perform nanomaterials synthesis (15,000 sq. ft.), characterization with isolation from vibrations (15,000 sq. ft.), and a Class 1000 clean room with flexible fabrication facilities (14,000 sq. ft.) at which to integrate designs. The core research group will include approximately 40 in-house researchers, 100 researchers from universities, industry, and other locations at the two national labs, 40 postdocs, and 40 undergraduate students.

The road to acceptance of the Sandia/Los Alamos proposal was not without surprises. After the White House Office of Science and Technology Policy initiated the National Nanotechnology Initiative in January 2000, and DOE's Basic Energy Sciences office recommended establishing "nanocenters," five groups submitted construction bids. One, Argonne National Laboratory, had already received a substantial promise of funds -- $37 million -- from its home state of Illinois. Still, when the independent evaluation was completed, Argonne was not rated among the top three, though hopes were high for next year. Brookhaven National Lab also did not make the cut. The other "nanocenter" proposals selected were from Lawrence Berkeley and Oak Ridge national laboratories.

A Sandia/Los Alamos/University of New Mexico bid was accepted -- with the proviso that UNM be dropped from the governing board of the Center, because CINT's charter is to be open impartially to researchers from every university. UNM's position as a governing entity would, politicians from other states decided, give it an unfair advantage. The University of New Mexico agreed to remove itself from its governance position in the CINT proposal, but plans to be an active participant in CINT. The Sandia proposal cadre consisted of Charles Barbour (1112), Jeff Brinker (1846), Bruce Bunker (1140), Terry, and Jerry Simmons (1123).

CINT is expected to have four areas of expertise: photonic lattices and quantum clusters; complex self-assembling nanostructures; the mechanics of behavior at the nanoscale; and importation of biological principles and functions into nano- and microsystems.

While research at such tiny dimensions may seem removed from usefulness in the macroworld, a billion times larger in scale, some clues come from biology. "Living systems use nanodevices all the time," says Terry. "An elephant starts out as a collection of nanomachines. The reason a person can take notes at a lecture is that molecules travel along protein walkways that nature organized into an architectural system that allows you to manipulate a pencil. Nature starts with molecules and chemical pathways, and then integrates molecular machines into larger structures -- cell membranes, mitochondria -- that organize into cells at the micron level, and these cells into tissues."

"Our mission at CINT is not just nanoscience," he says. "It's to lay the scientific groundwork for future devices that will change our lives. We have to be on the boundary where science is converted to technology." Just the same, he says, "we need to explore and develop the basic science. We don't even know what the architecture of these larger systems is or what the principles are that govern it. A little nanowalker can't do anything until integrated into the next level of structure.

"Our challenge is to get nanoscience out of the beaker and into the world around us." -- Neal Singer


DAKOTA software to be available for free download on web


By Chris Burroughs


Weapons designers and analysts often ask themselves questions such as: "What is the best design?" "How safe is my design?" "How much confidence do I have in my answer?" A Sandia-developed software toolkit can help answer these questions and assist engineers in designing anything from components to sophisticated systems.

The toolkit, DAKOTA version 3.0, will soon be on the Web and available for free.

DOE recently granted DAKOTA 3.0 (Design Analysis Kit for Optimization and Terascale Applications) an open-source release under the GNU General Public License. This means any company engineer or university researcher will be able to download DAKOTA and use it to improve a product design or advance their research.

Written in the C++ computer language, DAKOTA provides a flexible interface between the designer's simulation software and the latest algorithms for optimization, uncertainty quantification, parameter estimation, design of experiments, and sensitivity analysis. Interfaces between DAKOTA and user simulation codes can be developed rapidly. To date more than 20 simulator programs have been interfaced with the software.
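
DAKOTA's actual interface conventions (parameter files in, results files out) are covered in its manuals; the underlying pattern the paragraph describes is an optimizer treating the user's simulation as a black box. A rough sketch of that coupling, where `run_simulation` is a made-up stand-in for a real analysis code:

```python
# Loose coupling between an optimizer and a black-box simulation --
# the pattern DAKOTA automates. Here the "simulation" is an in-process
# stand-in; DAKOTA would instead launch an external code and exchange
# parameter/results files with it.
from scipy.optimize import minimize

def run_simulation(design):
    """Made-up simulator: mass and peak stress of a part as functions
    of two design variables (thickness, radius)."""
    thickness, radius = design
    mass = thickness * radius**2              # toy response surface
    stress = 1.0 / (thickness * radius)       # toy response surface
    return mass, stress

def objective(design):
    mass, stress = run_simulation(design)
    penalty = max(0.0, stress - 2.0) ** 2     # keep stress under a limit
    return mass + 100.0 * penalty             # penalized objective

result = minimize(objective, x0=[1.0, 1.0], method="Nelder-Mead")
print("best design:", result.x.round(3), "objective:", round(result.fun, 4))
```

Each optimizer iteration costs one or more full simulation runs, which is why the parallel-evaluation capability described below matters for expensive models.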

One of DAKOTA's key features is its ability to use parallel computing resources. For example, DAKOTA was recently interfaced with the SALINAS structural dynamics computer code as part of DOE's Accelerated Strategic Computing Initiative (ASCI) milepost.

"DAKOTA and SALINAS performed a large weapon component design study on 2,560 ASCI Red processors that accomplished in four days what would have taken more than 10 years to complete on a single workstation," says Mike Eldred (9211), principal investigator.

With its powerful algorithms and ability to manage complex simulations, DAKOTA allows designers to develop virtual prototypes of products that can be modified within the computer to minimize weight, cost, or defects; limit critical temperature, stress, vibration, or other responses; or maximize performance, reliability, agility, and robustness. The result: better designs and reduced dependence on prototypes and testing, which shortens design cycles and lowers development costs.

Sandia researchers have been developing DAKOTA for about eight years. Starting out as a Laboratory Directed Research and Development (LDRD) program, the initial work focused on optimization methods, but has since branched out into uncertainty quantification and other areas. It has been used internally by analysts from Centers 9100, 9200, 15200, 8700, 2100, 2300, and 2500 and externally by researchers at Los Alamos and Lawrence Livermore national laboratories in conjunction with DOE's ASCI program.

In addition, 15 industrial companies and universities have been granted DAKOTA licenses under the old licensing system.

Now this sophisticated and flexible engineering software will be made available free to virtually anyone who wants it without the hassles of custom licenses.

"Some commercial products exist that allow users to do optimization, but there are a variety of features that make DAKOTA unique -- like having the ability to use thousands of processors," Mike says.

Other unique features include support for surrogate-based optimization, optimization under uncertainty, mixed integer nonlinear programming, and simultaneous analysis and design, all of which are useful tools for real-world engineering applications.

There are several reasons for making DAKOTA readily available. It will encourage collaborations between Sandia, universities, and other research organizations. This will help infuse the latest research in optimization and related areas into DAKOTA.

Also, the public release will give the software more exposure and use. That is beneficial because as more people use the software, they will identify problems and contribute enhancements that can be shared with the user community. This expanded use could extend to commercial software companies as well -- several software vendors have expressed interest in using DAKOTA services along with their proprietary software systems.

"The only restriction is that people cannot take the DAKOTA software, change it, and then sell it," Mike says. "They can, however, design products with DAKOTA and sell their products."

The expected positive result will be an improved software toolkit that will be even more efficient and capable for supporting Sandia's mission.

DAKOTA is flexible software that can run both on massively parallel computing platforms with thousands to tens of thousands of processors and on a single workstation or a network of workstations. It has been successfully ported to most common UNIX-based workstations, including Sun, SGI, DEC, and IBM systems, as well as Linux-based PCs.

Technical manuals have been prepared to help walk DAKOTA users through the software. Also, the DAKOTA team has begun giving tutorial workshops and expects to do more as the software is publicly released.

Many engineers and researchers around the country are already awaiting the open source release. These "friendly customers" will be given the first access to the software and will serve as testing sites.

"We'll let them download it and help us work out the kinks in the new process," Mike says. "Then when we feel completely comfortable with it, we'll open it up for total public access."

Download information is available here, and the DAKOTA team can be contacted via e-mail at dakota@sandia.gov. -- Chris Burroughs


Soil moisture sensor could enhance irrigation efficiency


By Nancy Garcia

Two big-ticket items in California, agriculture and water, could both benefit from a soil moisture sensor invented by Ken Condreva of Telemetry Systems Engineering Dept. 8416.

Rainless for months at a stretch, the state carefully monitors and marshals its water supply, which is provided in part by runoff from melting snow.

Ken spent about a month this summer testing his new invention, which he hopes will help growers optimize irrigation efficiency. Ralph Boehmer and Danny Dominguez (both 8512) drilled a five-foot-deep hole to bury the sensor, which is about the size of two coffee cans placed end-to-end. Like an earlier sensor Ken created to measure water in the snowpack (now being commercialized by Canberra Industries), the device detects how water attenuates incoming cosmic radiation. A reference sensor mounted above ground measures the total incoming cosmic radiation (which fluctuates over time).

Ken says he'd thought about applying this approach to soil moisture and wondered if the attenuation of cosmic rays by the soil itself might decrease the sensitivity. In a couple of 10-day-long tests, he found that by averaging the signal over time, he could see differences in moisture amounting to as little as an inch or less of water, applied at the surface with a sprinkler.
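
The article gives no algorithm, but the normalization it implies -- divide the buried detector's count rate by the surface reference to cancel cosmic-ray flux fluctuations, then average over many hours so counting noise shrinks -- can be illustrated with a toy simulation (all counts and attenuation factors are invented):

```python
# Toy illustration of why long averaging reveals small moisture
# changes: hourly counts are noisy at the ~1% level, but averaging
# the buried/reference ratio over a 10-day window shrinks that noise
# enough to resolve a ~2% attenuation change. Numbers are invented.
import random

random.seed(1)

def hourly_counts(mean):
    """Counting noise: roughly Poisson, approximated as Gaussian."""
    return random.gauss(mean, mean ** 0.5)

def mean_ratio(attenuation, hours=240):
    """Average buried/reference count ratio over a 10-day window."""
    total = 0.0
    for _ in range(hours):
        reference = hourly_counts(10000.0)             # surface detector
        buried = hourly_counts(10000.0 * attenuation)  # under 5 ft of soil
        total += buried / reference
    return total / hours

dry = mean_ratio(attenuation=0.50)
wet = mean_ratio(attenuation=0.48)  # added water absorbs slightly more
print(f"dry {dry:.4f}  wet {wet:.4f}  change {dry - wet:.4f}")
```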

He gathered his data on a computer in a small outbuilding provided by Herb Woelffer and Ed Diemer (both of 8511). He envisions the agricultural version will have a battery and transmit data to a collection point, possibly even controlling irrigation automatically.

Several systems exist that detect moisture at or near the surface of the soil. However, none has been developed to detect moisture this far down, a capability Ken hopes will be especially useful for deep-rooted crops in vineyards and orchards.

His summer research project was paid for with $2,500 from a royalty fund that Sandia collects and re-invests in R&D and related activities, such as promotion, training, and cost-sharing. The project has led to a recently filed patent application.

"There has been a long-felt need for a simple, inexpensive, reliable, and practical method for determining water content," says Sandia patent agent Tim Evans (11600). Ken says another advantage of the sensor is that it measures moisture over a relatively wide area, which is proportional to the depth of the buried sensor. At five feet deep, the sensor measures moisture from about a 10-foot-wide circle at the surface.

To study the feasibility of the device, Ken watered the surface with about six inches of water, applied with a garden sprinkler. Over the next few days, he watched as the water shielded incoming cosmic radiation and then slowly evaporated or percolated away, returning the signal to its baseline reading.

Like the automated snowpack water sensor, he says, the new soil moisture device "can contribute to better water management."

This matters in relatively arid areas like parts of California that only receive 15 inches of rain a year (all between November and March). The state Department of Water Resources has a division devoted exclusively to water efficiency, and water-use planning is a high priority in the state's $25 billion annual agricultural industry. Grapes, typically grown in some of those drier regions, are the state's second-leading commodity, produced at more than 300,000 tons a year. Among orchard crops, meanwhile, almonds represent the state's leading export. -- Nancy Garcia
