3-D model more accurately pinpoints source of earthquakes, explosions
During the Cold War, US and international monitoring agencies could spot nuclear tests and focused on measuring their sizes. Today, they’re looking around the globe to pinpoint much smaller explosives tests.
Under the sponsorship of the NNSA’s Office of Defense Nuclear Nonproliferation R&D, Sandia and Los Alamos National Laboratory have partnered to develop a 3-D model of the Earth’s mantle and crust called SALSA3D, or Sandia-Los Alamos 3D. The purpose of this model is to help the US Air Force and the international Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) in Vienna, Austria, more accurately locate all types of explosions.
The model uses a scalable triangular tessellation and seismic tomography to map the Earth’s “compressional wave seismic velocity,” a property of the rocks and other materials inside the Earth that indicates how quickly compressional waves travel through them, Sandia geophysicist Sandy Ballard (5736) says. Mapping that velocity is one way to accurately locate seismic events. Compressional waves — measured first after seismic events — move the particles in rocks and other materials minute distances backward and forward between the location of the event and the station detecting it.
SALSA3D also reduces the uncertainty in its predictions, an important feature for decision-makers who must take action when suspicious activity is detected, he adds.
“When you have an earthquake or nuclear explosion, not only do you need to know where it happened, but also how well you know that. That’s a difficult problem for these big 3-D models. It’s mainly a computational problem,” Sandy says. “The math is not so tough, just getting it done is hard, and we’ve accomplished that.”
A Sandia team has been writing and refining code for the model since 2007 and is now demonstrating that SALSA3D is more accurate than current models. The team members include Andre Encarnacao, Jim Hipp, Brian Kraus, Ben Lawry, and Chris Young (all 5563), Eric Chael (5736), and Mike Begnaud at Los Alamos lab.
In recent tests, SALSA3D was able to predict the source of seismic events over a geographical area 26 percent smaller than the traditional one-dimensional model and 9 percent smaller than a recently developed Regional Seismic Travel Time (RSTT) model used in combination with the one-dimensional model.
GeoTess software release
Sandia recently released SALSA3D’s framework — or GeoTess, the triangular tessellated grid on which the model is built — to other Earth scientists, seismologists, and the public. By standardizing the framework, the seismological research community can more easily share models of the Earth’s structure and global monitoring agencies can better test different models. Both activities are hampered by the plethora of models available today, Sandy says.
“GeoTess makes models compatible and standardizes everything,” he says. “This would really facilitate sharing of different models, if everyone agreed on it.”
When an explosion goes off, the energy travels through the Earth as waves that are picked up by seismometers at US and international ground monitoring stations associated with nuclear explosion monitoring organizations. Scientists use these signals to determine the location.
They first predict the time it takes for the waves to travel from their source through the Earth to each station. To calculate that, they have to know the seismic velocity of the Earth’s materials from the crust to the inner core, Sandy says.
“If you have material that has very high seismic velocity, the waves travel very quickly, but the energy travels less quickly through other kinds of materials, so it takes the signals longer to travel from the source to the receiver,” he says.
For the past 100 years, seismologists have predicted the travel time of seismic energy from source to receiver using one-dimensional models. These models, which are still widely used today, account only for radial variations in seismic velocity and ignore variations in geographic directions. They yield seismic event locations that are reasonably accurate, but not nearly as precise as locations calculated with high-fidelity 3-D models.
Modern 3-D models of the Earth, like SALSA3D, account for distortions of the seismic wavefronts caused by minor lateral differences in the properties of rocks and other materials.
For example, waves are distorted when they move through a geological feature called a subduction zone, such as the one beneath the west coast of South America where one tectonic plate under the Pacific Ocean is diving underneath the Andes Mountains. This happens at about the rate at which fingernails grow, but, geologically speaking, that’s fast, Sandy says.
One-dimensional models, like the widely used ak135 developed in the 1990s, are good at predicting the travel time of waves when the distance from the source to the receiver is large because these waves spend most of their time traveling through the deepest, most homogenous parts of the Earth. They don’t do so well at predicting travel time to nearby events where the waves spend most of their time in the Earth’s crust or the shallowest parts of the mantle, both of which contain a larger variety of materials than the lower mantle and the Earth’s core.
RSTT, a previous model developed jointly by Sandia, Los Alamos, and Lawrence Livermore national laboratories, tried to solve that problem and works best at ranges of about 60-1,200 miles (100-2,000 kilometers).
Still, “the biggest errors we get are close to the surface of the Earth. That’s where the most variability in materials is,” Sandy says.
Seismic tomography gives SALSA3D accuracy
Today, Earth scientists are mapping three dimensions: the radius, latitude, and longitude.
Anyone who’s studied a globe or world atlas knows that the traditional grid of longitude and latitude lines works well near the equator, but at the poles, the lines crowd too close together. For nuclear explosion monitoring, Earth models must accurately characterize these remote polar regions because seismic waves travel under them, Sandy says.
Triangular tessellation solves that problem with nodes, or intersections of triangles, that can be modeled accurately even at the poles. The triangles can be smaller where more detail is needed and larger in areas that require less detail, like the oceans. Plus, the model extends into the Earth like columns of stacked pieces of pie without the rounded crust edges.
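The kind of recursive triangle subdivision behind such a grid can be sketched in a few lines. This is not GeoTess code — the starting triangle and refinement depth below are invented for illustration — but it shows how splitting each spherical triangle into four, and varying the depth by region, yields small triangles where detail is needed and large ones elsewhere, with no crowding at the poles.

```python
import math

def _to_sphere(p):
    """Normalize a 3-D point so it sits on the unit sphere."""
    m = math.sqrt(sum(x * x for x in p))
    return tuple(x / m for x in p)

def _midpoint(p, q):
    """Edge midpoint, pushed back out to the sphere surface."""
    return _to_sphere(tuple(a + b for a, b in zip(p, q)))

def subdivide(tri, depth):
    """Recursively split one spherical triangle into four smaller ones.

    Varying `depth` per region is how a tessellated grid puts small
    triangles where detail is needed and large ones elsewhere.
    """
    if depth == 0:
        return [tri]
    a, b, c = tri
    ab, bc, ca = _midpoint(a, b), _midpoint(b, c), _midpoint(c, a)
    result = []
    for t in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)):
        result.extend(subdivide(t, depth - 1))
    return result

# One octant of the sphere, refined three times: 4**3 = 64 triangles
octant = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
print(len(subdivide(octant, 3)))
```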
The way Sandia calculates the seismic velocities uses the same math used to detect a tumor in an MRI, except on a global, rather than a human, scale.
Sandia uses historical data on 118,000 earthquakes, recorded at 13,000 current and former monitoring stations worldwide and collected in Los Alamos National Laboratory’s Ground Truth catalog.
“We apply a process called seismic tomography where we take millions of observed travel times and invert them for the seismic velocities that would create that data set. It’s mathematically similar to doing linear regression, but on steroids,” Sandy says. Linear regression is a simple mathematical way to model the relationship between a known variable and one or more unknown variables. Because the Sandia team models hundreds of thousands of unknown variables, they apply a mathematical method called least squares to minimize the discrepancies between the data from previous seismic events and the predictions.
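The inversion Sandy describes can be shown at toy scale. This sketch is not SALSA3D code — the two-cell model, path lengths, and observed times are invented — but it does the same thing in miniature: given travel times that are each a sum of (path length × slowness) over the cells a ray crosses, it fits the cell slownesses by least squares via the normal equations.

```python
def invert_slowness(G, d):
    """Least-squares fit of two cell slownesses to observed travel
    times d, where G[k][i] is the ray-path length of observation k
    in cell i (travel time = sum of length * slowness over cells).

    Solved via the normal equations G^T G s = G^T d -- the
    'linear regression on steroids' idea at toy scale.
    """
    rows, n = len(G), len(G[0])
    GtG = [[sum(G[k][i] * G[k][j] for k in range(rows)) for j in range(n)]
           for i in range(n)]
    Gtd = [sum(G[k][i] * d[k] for k in range(rows)) for i in range(n)]
    # 2x2 solve by Cramer's rule (real codes use large sparse solvers)
    det = GtG[0][0] * GtG[1][1] - GtG[0][1] * GtG[1][0]
    return [(Gtd[0] * GtG[1][1] - GtG[0][1] * Gtd[1]) / det,
            (GtG[0][0] * Gtd[1] - GtG[1][0] * Gtd[0]) / det]

# Two cells; three rays with known path lengths (km) and travel times (s)
G = [[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]]
d = [2.0, 2.5, 2.25]
print(invert_slowness(G, d))  # recovers slownesses 0.2 and 0.25 s/km
```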
With 10 million data points, Sandia uses a distributed computing network with about 400 processor cores to characterize the seismic velocity at every node.
Monitoring agencies could use SALSA3D to precompute the travel time from each station in their network to every point on Earth. When a new seismic event must be located in real time, each source-to-receiver travel time can then be retrieved in about a millisecond, allowing the energy’s source to be pinpointed in about a second, he says.
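One simple way such a lookup can drive a fast location, sketched below under invented assumptions (the candidate points, stations, and times are all made up, and this is not how SALSA3D's locator is actually implemented): search a table of precomputed station travel times for the candidate point that best explains the observed arrivals. Since the event's origin time is unknown, the residuals are demeaned before the misfit is computed.

```python
def locate(observed, tt_table):
    """Pick the candidate source point whose precomputed station
    travel times best explain the observed arrival times.

    The event's origin time is unknown, so residuals are demeaned
    before computing the misfit.
    """
    best_point, best_misfit = None, float("inf")
    for point, travel_times in tt_table.items():
        residuals = [o - t for o, t in zip(observed, travel_times)]
        mean = sum(residuals) / len(residuals)
        misfit = sum((r - mean) ** 2 for r in residuals)
        if misfit < best_misfit:
            best_point, best_misfit = point, misfit
    return best_point

# Precomputed travel times (s) from 3 stations to 2 candidate points
table = {"point_A": [3.0, 5.0, 4.0], "point_B": [2.0, 6.0, 5.0]}
# Arrivals from an event at point_A with (unknown) origin time 10 s
print(locate([13.0, 15.0, 14.0], table))  # -> point_A
```

Because the table lookups replace expensive 3-D ray tracing at event time, each candidate is scored in microseconds, which is what makes a second-scale location feasible.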
Uncertainty modeling a SALSA3D feature
But no model is perfect, so Sandia has developed a way to measure the uncertainty in each prediction SALSA3D makes, based on uncertainty in the velocity at each node and how that uncertainty affects the travel time prediction of each wave from a seismic event to each monitoring station.
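A first-order version of that propagation can be written down directly. The sketch below assumes the slowness uncertainties at the nodes a ray samples are independent (SALSA3D's actual treatment is more sophisticated), and the path lengths and sigmas are invented numbers; it simply combines per-node uncertainty into a travel-time sigma.

```python
import math

def travel_time_sigma(path_lengths, slowness_sigma):
    """First-order uncertainty propagation along one ray path.

    If the slowness in each region the ray crosses is uncertain by
    sigma_i (assumed independent here), the travel-time variance is
    sum_i (L_i * sigma_i)**2 for path lengths L_i.
    """
    var = sum((L * s) ** 2 for L, s in zip(path_lengths, slowness_sigma))
    return math.sqrt(var)

# A ray crossing two regions: 10 km and 5 km path segments with
# slowness uncertainties of 0.01 and 0.02 s/km (invented numbers)
print(travel_time_sigma([10.0, 5.0], [0.01, 0.02]))  # about 0.141 s
```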
For users at monitoring stations, SALSA3D estimates both the most likely location of a seismic event and the amount of uncertainty in that answer, helping to inform their decisions.
International test ban treaties require that on-site inspections occur only within a 1,000-square-kilometer (about 385-square-mile) area surrounding a suspected nuclear test site. Today, 3-D Earth models like SALSA3D are helping to meet, and often significantly improve on, that requirement in most parts of the world.
“It’s extremely difficult to do because the problem is so large,” Sandy says. “But we’ve got to know it within 1,000 square kilometers or they might search in the wrong place.”-- Heather Clark
Team gears up for hurricane season
A Sandia team is gearing up for hurricane season, readying analyses to help people in the eye of a storm.
The Department of Homeland Security’s National Infrastructure Simulation and Analysis Center (NISAC), jointly housed at Sandia and Los Alamos national laboratories, studies the interdependency and vulnerability of critical infrastructure and the consequences of having systems disrupted by disasters, including hurricanes.
Hurricane season began June 1 and runs through Nov. 30. It generally peaks in August and September, notwithstanding Superstorm Sandy’s appearance late last October.
NISAC has two jobs relating to hurricanes: conducting annual “hurricane swath” analyses of probable impacts on the Gulf Coast and East Coast and providing quick analyses of crisis response in the face of an imminent hurricane threat to the United States.
Analyses allow preliminary look at storm
A swath analysis looks at how a hurricane might interrupt electricity or water service and at impacts specific to an area, such as petroleum and petrochemical industries in Houston or financial services in New York City. It also looks at such things as the economic impact of the storm or how it could upset food deliveries.
Federal officials pull swath analyses off the shelf when a hurricane seems likely to hit a particular place. They used the New Orleans report a few days before Hurricane Isaac headed toward that city last August.
“While it was too far out for us to do our analysis, they could use the report as a first cut,” says Dan Pless (6924), NISAC program lead at Sandia.
NISAC’s portfolio includes a dozen swath analyses updated every few years, two cities at a time. A team coordinated by Mark Pepple (6632), NISAC fast response lead, this year updated reports for Houston and Corpus Christi, Texas; last year the work focused on Miami and Tampa. Updates keep information from becoming too stale, Dan says.
NISAC decided what to put into the original analyses, but is working on updates with state and local officials and Department of Homeland Security (DHS) agencies, including the Federal Emergency Management Agency (FEMA).
Reports analyze ‘reasonable bad scenario’
Each report uses a “reasonable bad scenario” that would be possible in the particular area, with local officials deciding what scenario would be most useful for disaster planning, say Dan and Mark. For example, a Category 5 hurricane isn’t likely in New York City because colder waters dampen hurricane strength, but a Category 3 is within reason.
“These storms form in the Caribbean, they form in the Gulf. They can get quite strong down there,” Dan says. “They don’t form in the North Atlantic. They have to travel there.”
The analyses — also useful in other natural disasters — consider impacts to the infrastructure, the population, and the economy, Dan says.
“We look at where power outages are likely,” he says. “For Houston, it would examine the possible national impact on petroleum supplies and whether we should worry about that.”
They look at so-called food deserts: urban areas where food deliveries might be interrupted, he says.
NISAC also has found that some local officials want more demographic information. Officials in Florida, with its high retiree population, want to know where the elderly are concentrated, Dan says.
The most difficult part of an analysis is defining a scenario because every place is different and a wide range of agencies must reach consensus, he says.
Team activated for big hurricanes
Once NISAC is activated, the team focuses on exactly what’s in the storm’s projected path.
“Anytime a hurricane is going to make landfall in the US we’re busy at some level. If it’s going to be a Category 3 or higher you can pretty much figure we’re going to go to full activation,” Dan says. The decision whether to activate and to what degree comes from NISAC’s program manager at the DHS, the Homeland Infrastructure Threat and Risk Analysis Center, known as HITRAC, part of the Office of Infrastructure Protection.
Mark helps lead NISAC’s crisis response. When federal officials activate a team, he coordinates with HITRAC and Sandia’s partners at Los Alamos, which has its own team doing analyses. The labs collaborate. For example, Los Alamos models and analyzes the impacts to electricity and metropolitan water systems, and Sandia uses those results to look at impacts to energy such as petroleum and natural gas or sectors such as transportation and banking.
He’s also responsible for getting Sandia’s team together, not just pulling in people, but identifying what expertise or simulation tools are needed. While a crisis response team always needs at least one economist to assess economic impact, a hurricane in Houston would require more analyses of the petrochemical sector than a hurricane in North Carolina, where agriculture could be a larger concern.
NISAC and HITRAC collaborate on how much time the team has before it must lock in a prediction, or storm track, of the hurricane’s path toward land. The National Oceanic and Atmospheric Administration issues regular landfall projections, but at some point NISAC has to commit to one track on which to base its analyses.
Dan says the amount of time for analysis is shrinking. NISAC had 48 hours for Hurricane Gustav, which hit the South in late August and early September 2008.
“They said that’s too much time, the track can change too much in that time,” Dan says.
The team had 24 hours to do its analysis for Hurricane Ike, which hit the Texas, Louisiana, and Mississippi coasts in September 2008. By the time Irene hit the East Coast in August 2011, the deadline had dropped to 12 hours. “We’re roughly around 10 to 11 hours at this point,” Dan says.
The team provides information similar to that in a swath analysis, with less detail but tailored to the hurricane’s actual strength and what’s in its path. While the report generally focuses on the projected track, sometimes the team adds a caveat that damage could be worse if the storm changes course.
Questions spike when hurricane hits
The team also responds to a flurry of questions from DHS just before landfall. For Ike, Dan says, officials wanted to know which large Houston-area water treatment plants were most likely to lose power and could use one of three available FEMA generators.
For Sandy, NISAC’s report identified subways in the storm surge zone and did some power outage modeling.
Questions usually spike after a hurricane hits. That was particularly true for Sandy.
“You had this massive power outage and they were wondering, ‘OK, we have these cell towers and a lot of them have diesel generators for backup. Those last 48 to 72 hours and the power isn’t coming back in 48 to 72 hours. How do we prioritize that? Few of the gas stations have fuel, what’s going on? Is it that they don’t have power or because eight of the nine fuel delivery terminals in New Jersey were down?’” Dan recalls.
Sandy reversed the normal workload.
“Usually we have a lot heavier workload going into the hurricane before landfall and generally have tired people and a lighter workload afterward. On Sandy, we worked the opposite. We had a relatively light workload going in and then it got really busy,” Dan says. “That was because it was that weird perfect storm.”-- Sue Major Holmes
Ready-to-sign licenses put Sandia IP on fast track
by Nancy Salem
See also "What's a Hedgehog" below
Sandia is building a portfolio of intellectual property (IP) that can be licensed by businesses in as little as an hour.
“This is the simplest process possible,” says business development specialist Bob Westervelt (7932), who helped put together the ready-to-sign licensing program, which can be accessed by businesses on the Sandia website. “The language is clear and easy to understand. We can say, ‘Here’s the license, here are the terms. Once you and Sandia have signed it, you can start using the intellectual property.’”
The goal is to get more Sandia IP into the hands of small businesses and entrepreneurs. Sandia has about 1,300 patents available for licensing. Bob says the Licensing group, which works with companies of all sizes, noticed that smaller ones often find the number of patents to search through and the complexity of licensing daunting.
About a year ago the group came up with the idea of creating a standard license for certain IP they identify as being desirable. “We look through the IP for technologies we’re surprised aren’t being used, that need more visibility, and that still have a lot of time left on the patent,” Bob says.
Small businesses might not have the time or manpower to sift through 1,300 patents to see if Sandia has something that might help them be more successful. “If we give them a shorter list and simplify the process, good opportunities are more likely to be noticed,” Bob says.
The ready-to-sign license is uncomplicated, with simplified language and pared-down terms, conditions, and reporting requirements. And it’s lower cost. “We are offering relatively small up-front fees, in the $3,000 range, and low-percentage royalties,” Bob says. “We don’t want to impose a financial burden on a small business that needs cash flow.”
More ready-to-sign licenses in the pipeline
So far eight patents fall under the program, from a drive system for industrial applications that require high torque and low rpm to a compact spectrometer that can detect trace amounts of gases such as carbon monoxide and methane to a vehicle barrier that holds up to a powerful impact. More ready-to-sign licenses are in the pipeline and Bob says the group hopes to assemble a portfolio of about 50.
“We want a manageable number that can have the most impact,” he says. “These are all technologies that no one has licensed in areas where small businesses might be able to get a foothold. A small company could take any of these licenses and run with it.”
The licenses are non-exclusive, so any number of companies can make use of a technology. “It’s not first-come, first-served for the IP,” Bob says. “If five companies are interested in a technology, all five can license it.”
One has been signed so far. Advance Plumbing of Albuquerque licensed the Labs’ Hedgehog water-purification technology (see below). Company President Vincent Sanchez says he signed on because of the simplicity of the licensing process.
“This was really easy to get into,” he says. “I would not have looked at it if finding the technology and doing the agreement was a big process requiring certification and lots of financials and other reporting. I’m not in a position to do all that. If it’s easy, I can say, ‘Why not? Let’s take a look.’”
Pete Atherton, senior manager of Industry Partnerships Dept. 7930, says Sandia is always looking for better ways to transfer technology for the public good. “The ready-to-sign program is a new component of our initiative to make licensing Sandia’s technology easier and faster,” he says.
Ready-to-sign licenses are listed on the Intellectual Property website at https://ip.sandia.gov/readyToSignLicenses.xhtml. A business owner can click on a technology to get information and download a PDF file with the paperwork. “You click on one link and it downloads everything you need,” Bob says.
He says the Licensing group is looking for IP that would fit the ready-to-sign program. “We would like Sandians to suggest ideas,” he says. “You might know of a patented technology that would have broader uses. There is so much good IP out there.”
What’s a Hedgehog
The Hedgehog water purification system was developed at Sandia after the US Environmental Protection Agency in 2006 lowered the allowable amount of arsenic in drinking water from 50 parts per billion to 10. New Mexico has about 100 communities with arsenic contamination in ground water.
“It wasn’t a problem until the standard was revised, then all of a sudden a bunch of communities by law did have a problem,” says Pat Brady (6910), lead researcher on the Hedgehog, the first Sandia technology to be licensed under the new ready-to-sign licensing program. “A lot of them were at around 15 parts per billion, just within sight of being in the clear.”
Pat says a city the size of Albuquerque can afford the infrastructure and manpower to remove arsenic from water, which can cost into the hundreds of thousands of dollars. So Sandia focused on how small communities that get their water from the ground can affordably reduce arsenic. “We did an analysis that found the big costs are the infrastructure — building filter galleries — and operators,” he says. “The Hedgehog is an attempt to solve the arsenic problem with no infrastructure and no people.”
Hedgehog is a submersible recirculation pump with an attached filter bed that sits in the bottom of a water tower. It runs continuously, grabbing arsenic and other organic and inorganic contaminants as water runs through the filter. The Sandia-designed filter is replaced every few months.
“It’s a simple idea,” Pat says. “What made it cool for small communities is that most of them pump water from a well into a tower and add chlorine. The water flows by gravity into pipes, so the only infrastructure to work with is a well or water tower. By putting Hedgehog into the tower there’s no need to build big filter galleries that have to be watched all the time. It works with what they have.”
Pat says the Hedgehog, licensed by Advance Plumbing of Albuquerque, was designed specifically for small communities with arsenic-contaminated water. “This is the least expensive way to fix it,” he says. “It gets the count down to 10 parts per billion and gets the towns out of trouble.”-- Nancy Salem