On Saturday, Sept. 23, a joint team from Sandia, Kirtland Air Force Base (KAFB), and the NNSA's Sandia Site Office (SSO) successfully destroyed a Spartan rocket motor that had become unsafe for future use.
More than a year of planning helped ensure the operation went off without a hitch.
Steven Yesner (5020) tells the story of the Spartan's destruction from his perspective as the Sandia project lead.
By Bill Murphy
Secretary of Energy Samuel Bodman traveled to Sandia last week to announce that the department has selected the Labs to be the new home of the National Laboratory Center for Solid-State Lighting Research and Development. In that capacity, Sandia will conduct vital solid-state lighting research and coordinate related research efforts at several other national laboratories.
DOE will provide funding of $5 million for seven research projects in solid-state lighting, including $2.6 million for four Sandia projects, Bodman said. The funding comes from DOE’s Office of Energy Efficiency and Renewable Energy.
“The research will be conducted at the new nanotechnology centers at our national laboratories,” he said, including the just dedicated Sandia/Los Alamos Center for Integrated Nanotechnologies. “This is part of nearly $20 million we are committing this year to support research and development efforts in this rapidly emerging technology.”
Bodman’s remarks came during an Oct. 5 news conference at Sandia’s International Programs Building. Also in attendance were NNSA Administrator Linton Brooks, Labs Director Tom Hunter (who served as host), Sens. Pete Domenici, R-N.M., and Jeff Bingaman, D-N.M., Rep. Heather Wilson, R-N.M., and LANL Director Michael Anastasio.
Before the announcement, the entourage got a tour and briefing about the CINT facility and its capabilities. On Oct. 4, Bodman addressed an all-hands meeting of Sandians at the Steve Schiff Auditorium. (See “Safety remains a top priority, Secretary Bodman tells Sandians at all-hands meeting” on page 4.)
Bodman laid out the case for investment in solid-state lighting R&D, noting that 18 percent of all US energy generated — valued at some $55 billion — goes to lighting homes, offices, and factories.
“We believe a set of revolutionary new technologies called solid-state lighting,” Bodman said, “offers excellent prospects for meeting our future lighting needs in a less costly, more efficient way than today’s incandescent and even fluorescent fixtures. . . we at the Department of Energy want to see it fully developed as quickly as possible.
“We also believe that solid-state lighting presents an excellent opportunity for the US to assume a leadership role in an emerging industry that can generate thousands of high-paying, high-quality jobs in the years ahead and help maintain the US economy’s strong record of global leadership in growth and jobs creation.”
Regarding job creation, Bingaman in prepared remarks said he hoped the tens of thousands of high-paying jobs that will emerge in this budding industry in the next few years will stay in the US — and in New Mexico — rather than go overseas. Bingaman, an early and ardent supporter of investment in solid-state lighting R&D, said in a news release distributed in conjunction with Bodman’s announcement, “It makes sense that this research will be done right here at Sandia, where scientists are already hard at work developing this technology.”
During the news conference, Bingaman singled out Jerry Simmons, senior manager of Energy Sciences Dept. 1130, for his “many years of working in the vineyards” of solid-state lighting research.
Domenici praised DOE for making the investment in solid-state lighting research. He said there are many ways the nation can address the issue of the overconsumption of crude oil from overseas. He said the investment in solid-state lighting R&D is a tangible demonstration that the current administration cares about the issue and is funding research that can actually help reduce energy use.
Domenici predicted a bright future for solid-state lighting. “There will be spin-offs” from this research, he said, and the labs involved will find “the longest waiting list they’ve ever seen” for partners from industry and academia eager to collaborate on research projects. “It is going to be rather exciting to watch,” he said.
Wilson noted that at just one Albuquerque intersection where the traffic signals have been converted to LEDs, the city saves about $1,000 a year.
Bodman used the occasion of the news conference to announce that DOE is renewing its “Change a Light, Change the World” challenge to Americans to reduce energy consumption by changing out at least one incandescent light bulb in their homes for a more efficient Energy Star-rated alternative.
“This is something simple that every American can do that will not only reduce their utility bills but the nation’s overall use of electricity,” Bodman said.
“If every household in the US replaced the five light bulbs they use the most with Energy Star models, they would save $60 a year on their utility bills and the nation as a whole would save $6 billion in energy costs.”
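The quoted figures imply a rough national scale that can be checked with back-of-envelope arithmetic. The household count below is an inference from the two quoted numbers, not a figure from DOE's announcement:

```python
# Back-of-envelope check of the quoted "Change a Light" savings figures.
savings_per_household = 60          # dollars per year, as quoted
national_savings = 6_000_000_000    # dollars per year, as quoted
bulbs_swapped = 5                   # most-used bulbs per household, as quoted

# Implied number of participating households (an inference, not a DOE figure)
implied_households = national_savings / savings_per_household
print(f"Implied households: {implied_households:,.0f}")   # 100,000,000

# Implied savings per swapped bulb
print(f"Savings per bulb: ${savings_per_household / bulbs_swapped:.2f}/year")
```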
-- Bill Murphy
Eight Sandia computational science projects have been awarded a total of $2.9 million annually over the next five years by the Office of Advanced Scientific Computing Research (ASCR) within the DOE Office of Science. The announcement of the awards was made last month after a competitive, peer-reviewed proposal process.
The Office of Science’s “Scientific Discovery through Advanced Computing” (SciDAC) program is making the funding available to 30 projects, eight of which involve Sandia. The 30 projects bring together 70 institutional partners and hundreds of researchers and students.
All of the projects involve several partners and large-scale collaborations. The projects in which Sandia participates all entail large-scale computer simulations aimed at accelerating research in a wide range of areas, including the design of new materials, developing future energy sources, studying global climate change, and understanding physics from the tiniest particles to the massive explosions of supernovae.
“Among the reasons the Sandia projects were awarded funding is our unique experience using high-performance computers,” says Scott Collis (1414), point of contact for Sandia’s ASCR research. “Our ongoing work in both designing and using state-of-the-art supercomputers such as Red Storm and Thunderbird has provided us expertise in supercomputing that is respected around the country.”
He adds that “this expertise cross cuts Sandia sites in New Mexico and California as does the SciDAC funding.”
SciDAC computational work will be done on new DOE petascale computers that are planned to go into operation at Oak Ridge and Argonne national laboratories by the end of the decade. Petascale computing refers to petaflops, a million billion calculations per second, and petabytes, a million billion bytes of data. This level of computing power will enable researchers to study scientific problems at an unprecedented level of detail. For example, current models allow scientists to design materials with thousands of atoms, while petascale computing will allow models with millions of atoms, yielding more accurate simulations that will promote fundamental scientific discovery.
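The scale jump described above can be made concrete with a quick illustrative calculation. The atom counts are the article's rough orders of magnitude, not benchmarks of any specific code:

```python
# Illustrative scale comparison for "petascale" computing.
PETAFLOP = 10**15   # calculations per second: a million billion
TERAFLOP = 10**12   # calculations per second: a trillion

speedup = PETAFLOP // TERAFLOP
print(f"Petascale vs. terascale: {speedup}x")   # 1000x

# The article's rough orders of magnitude for materials models
atoms_current = 10**3    # "thousands of atoms" on current machines
atoms_petascale = 10**6  # "millions of atoms" expected at petascale
print(f"Model size increase: {atoms_petascale // atoms_current}x")   # 1000x
```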
Sandia projects awarded funding for the petascale computing include:
Scott says an important aspect of these projects is that they will allow Sandia to develop even more collaborations in the high-performance computing world, both in the DOE laboratory complex and throughout academia.
“We’ll gain additional experience and capability in using supercomputers that will have impact far beyond the individual SciDAC projects,” he says. “At the same time, we will be able to pursue cutting-edge collaborative science in a wide range of areas. And the most exciting aspect of this funding is that it will result in new discoveries that we can’t yet predict.”
Mankind is making a tremendous impact on the atmosphere by burning fossil fuels at unprecedented levels, causing carbon dioxide to spew into the air, says Mark Taylor (1433), who is heading up a Sandia project to model climate change. Most scientists believe that these emissions are causing the world to warm to dangerous levels. How that will affect the Earth’s future remains to be seen.
Mark and Bill Spotz (both 1433) will spend the next five years trying to figure out what the future might hold for the Earth’s atmosphere by collaborating on a large-scale climate model. They were recently awarded $300,000 a year over the next five years as part of the DOE Office of Science’s Scientific Discovery through Advanced Computing (SciDAC) program.
Bill and Mark will be working with Oak Ridge National Laboratory and the National Center for Atmospheric Research (NCAR) using a petascale computer at Oak Ridge to extend the capabilities of an atmospheric model. Other participating institutions will work on different climate components, including the ocean, sea ice, and land use/land cover change that together make up the Community Climate System Model (CCSM).
“After the first three years we expect to have an atmospheric model operating on a petascale platform, which means it will have the capability of running a million billion calculations per second,” Mark says. “And at the end of five years we hope to be modeling the carbon cycle at high-fidelity [extremely accurate] rates.”
The CCSM already simulates dynamics, thermodynamics, moisture, chemical, and aerosol processes. Mark and Bill anticipate that over the next five years they can help make this model more efficient on large computers while other researchers will make it more comprehensive by adding biological, ecological, and other processes.
Eventually, the new petascale CCSM will be used to model various atmospheric scenarios. The first would be business as usual — no change in how the world burns fossil fuels. Other models would determine what the world would look like if the burning of fossil fuels is reduced at varying rates.
“We don’t know the answer right now, due to the many complex feedbacks that the petascale model will help us figure out,” Mark says. “For example, plants absorb carbon dioxide through the photosynthesis process, but then release it back to the atmosphere when they decay. What is the net effect of a warming world on the type and amount of vegetation and its ability to absorb carbon dioxide?” -- Chris Burroughs
By Nancy Garcia
Although fire has been successfully exploited for millennia, a mastery of combustion is more critical today than it has ever been. A deep understanding of the science of combustion is vital to maximizing the efficiency of current and future energy production.
New tools, including lasers and computers, have greatly expanded scientists’ ability to unveil new knowledge about the complex phenomena occurring over space and time within a burning flame. Now, additional ways to share that information and those tools with colleagues are ushering in a new era for how combustion science is conducted.
A $10 million effort that involved researchers from nine organizations over five years has created an online prototype portal for data sharing. Its creators’ vision is to speed access to richly detailed observations and simulations, rather than requiring scientists to wait for conferences or journal publications to learn about new results. They can then more quickly build upon what their peers have uncovered.
The team called the project and the online portal the Collaboratory for Multi-scale Chemical Science (CMCS, http://cmcs.org). The term “collaboratory” was coined about 15 years ago to describe a center without walls, where geographically separated collaborators might share information, analysis tools, or applications through high-performance networks.
The idea continues to grow
Nils Hansen’s (8353) research group is part of an international team that is working to identify chemical species in low-pressure flames probed with light from the Advanced Light Source at Lawrence Berkeley National Laboratory. The data are automatically captured and can be analyzed and graphically displayed in a number of ways using CMCS.
“We have so much data that it’s sometimes hard to keep up,” he says. “We thought it would be nice to have this data as part of the user facility and make it widely available to the user community.”
The CMCS work builds upon an earlier effort at the Combustion Research Facility. Software tools were developed for the Diesel Combustion Collaboratory from 1997 to 2000. Larry Rahn (8350) was involved from the start.
“Our plan was to take that to the next level,” Larry says, “so that it would be broader than a particular community and span all chemical science related to combustion. It’s a great vision driven by the broad spectrum of physical scales in combustion, and it could have an impact on the nation’s energy concerns.
“It’s a challenge for scientists to make this information flow better across these scales and their related disciplines. If they can, it channels research into challenges a neighboring discipline faces, and promotes research that is more useful toward the ultimate mission. It provides additional tools and approaches to facilitate a strong coupling among interdisciplinary scientists, allowing them to pursue research in a ‘systems science’ approach.”
The CMCS team included scientists to set requirements and explore prototype tools, as well as researchers to develop and build the infrastructure. Christine Yang (8116, formerly 8964) has been a coprincipal investigator and contributor to this and related projects, creating software frameworks the team members call knowledge grids.
She says scientists sometimes have a surfeit of data. For instance, Jackie Chen (8351) and her postdoctoral researchers perform terascale simulations of turbulent combustion to investigate fundamental turbulence-chemistry interactions in flames. Jackie has employed FDTools, CMCS prototype software developed by Wendy Doyle (8963), to track in space and time localized regions where fluid parcels auto-ignite. Recent simulations in three dimensions have produced tens of terabytes of data. Feature detection and tracking of salient regions of interest would greatly facilitate interpretation of large sets of simulated data.
A treasure trove of data
Rob Barlow (8351) also produces what collaboratory researcher David Leahy (a former Sandian now at Stanford University) calls a “treasure trove of data” through six-laser experiments that measure many complicated flame properties simultaneously.
Christine says that normally researchers work on a cycle of two to five years in which they produce and evaluate information to be presented in peer-reviewed papers and at conferences. She believes they see value in speeding up that process through collaboratories. Doing so takes advance coordination, both in agreeing upon standards, and preparing data by adding information that facilitates its use by others.
“In the past,” adds Leahy, “scientists didn’t think about sharing the data they got in the lab. First, they didn’t have the Internet. Second, they weren’t used to the process of sharing data — they just obtained it and decided what they thought it meant and then shared that, the very tippy-top of the iceberg, the top one percent. The primary goal had been to successfully get peer-reviewed publications. That was the measuring stick for success.”
Larry suggests that another yardstick may emerge, perhaps by counting data downloads in addition to publications and citations. “There’s kind of a data revolution happening in science,” he says, where data are becoming recognized as an end product in their own right.
A new measure of success
Leahy agrees, saying, “If a biologist figures out a structure of a protein and it is downloaded 1,000 times, that could be a new measure of success, a feather in the cap of the biologist.”
He was coprincipal investigator with Carmen Pancerella (8964) on an ongoing project funded by the National Institutes of Health and National Science Foundation, the Data Portal Enabling New Protein Structure Collaboration (Collaboratory for MS3D, http://ms3d.org/), now in its third year.
This project has benefited by using open-source software developed by the CMCS project team. The software framework is called the Knowledge Environment for Collaborative Science (KnECS) and is an enabling glue that knits together software collaboration tools, scientific applications, and data management tools. The framework allows team creation, workflow management, and subscriptions. The primary interface is a web portal enhanced with such interactive features as chat and announcements. The environment integrates data management software, most notably Scientific Annotation Middleware (SAM), for encoding, storing, searching, and controlling access to data. The data are tagged with information through creating metadata in XML, which can be viewed on any computer platform.
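The XML tagging described above can be sketched in a few lines. The element names below (dataset, creator, instrument, probe) are hypothetical illustrations; SAM's actual metadata schema is not reproduced here:

```python
# A minimal sketch of tagging a dataset with XML metadata, in the spirit
# of the approach described above. All element names are hypothetical.
import xml.etree.ElementTree as ET

# Build a metadata record describing an experimental dataset
record = ET.Element("dataset", id="flame-scan-042")
ET.SubElement(record, "creator").text = "example-user"
ET.SubElement(record, "instrument").text = "Advanced Light Source"
ET.SubElement(record, "probe").text = "low-pressure flame"
ET.SubElement(record, "format").text = "HDF5"

# Serialize to a platform-independent XML string
xml_text = ET.tostring(record, encoding="unicode")
print(xml_text)
```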
“The idea is to enable small research communities to tackle innovative ideas by using a shared, integrating infrastructure,” says Larry. An advantage is that the data will be accessible for years, and not become outmoded, like, for example, the magnetic tapes or even chart recordings of the past. He says the application to combustion was important since it is involved in 85 percent of global energy use.
There are also other related collaboratory efforts underway at Sandia. Outside Sandia, researchers who share access to rare instruments, such as telescopes, or produce vast amounts of genomic data, are already far along the path of data-sharing.
“It’s just a matter of time before the ideas are adopted broadly through all corners of the scientific community,” Leahy predicts.
Toward that end, KnECS is being released on SourceForge.net as open-source software for further development. The hope is that enhancements will be added for use in a variety of research communities, Christine says, and that it one day may be adapted for commercial release. -- Nancy Garcia