
SANDIA LAB NEWS

August 27, 2010

RapTOR seeks to quantify ‘unknown unknowns’

By Patti Koning

See also: “RapTOR for algae: Understanding pond collapse”

The world of bioterrorism is filled with scary stuff — anthrax, smallpox, and ricin, to name a few. And those are just the agents we know about. There is also a whole other realm of “unknown unknowns,” lethal agents that could be weaponized from ordinary viruses or disguised to look harmless.


Stan Langevin (8623) optimizes DNA hybridization conditions to suppress high-abundance host DNA using time-consuming standard benchtop methods. Microfluidic-based normalization will eliminate such tedious benchtop protocols by combining DNA hybridization and hydroxyapatite chromatography on a single platform. (Photo by Randy Wong)


The RapTOR (Rapid Threat Organism Recognition) Grand Challenge seeks to solve the “unknown unknowns” problem by developing a tool to rapidly characterize a biological organism with no pre-existing knowledge.

“We’ve been thinking about this threat space for years,” says Todd Lane (8623), the project’s principal investigator. “Taking advantage of rapidly evolving molecular biology technology and the advent of ultra-high-throughput DNA sequencing, we are re-engineering time-intensive benchtop methods to be faster, easier, and automated.”

Todd divides the biological threat spectrum into three categories: traditional agents such as anthrax; enhanced agents that have been genetically manipulated for increased virulence, drug resistance, or to evade detection; and advanced agents — the unknown unknowns. “An advanced agent could start with a benign organism that we’d have no interest in from a national security perspective and manipulate it into a virulent pathogen that is difficult to detect with current systems,” he says.

Unlike known threats such as anthrax or smallpox, detection of an advanced agent would only occur when people begin showing symptoms. Every day that treatment is delayed, the lethality of the attack goes up exponentially. “If a novel attack occurs and our detection systems fail, we have limited time in which to identify and characterize the organism to be able to offer effective treatment,” says Todd.

History shows that identifying and characterizing a naturally occurring unknown organism is very difficult. The 1970s outbreak of Legionnaires’ disease took six months to characterize; nearly 30 years later, it still took weeks to characterize Severe Acute Respiratory Syndrome (SARS). Conventional DNA sequence-based detection systems failed to identify a recent outbreak of Ebola in Uganda because the virus had changed so much it was unrecognizable.

Lowering the bar to bioweapons

The same advances that make the RapTOR concept feasible also have lowered the technical bar for creating a bioweapon. “The research I did in graduate school for my dissertation is now being taught in high school,” says Todd. “It is now possible to completely synthesize bacterial genomes. Bioweapons have become potentially low-cost weapons of mass destruction, and it’s a very risky situation.”

Sequencing the human genome took 10 years and hundreds of millions of dollars. With ultra-high-throughput sequencing, that same work can be accomplished for about $10,000 in a week or less.

But ultra-high-throughput sequencing only addresses part of the problem. As Todd explains, in an outbreak scenario there would be a large number of samples from people manifesting symptoms of the disease and from the worried well.

“The more samples you can sequence, the better chance you have at identifying and then characterizing the organism. But sequencing a clinical sample provides a lot of information that is not of interest,” he says.

For example, 99 percent of the DNA in a blood sample is the human genome. DNA in a nasal swab is 90 percent human-derived and much of the rest is garden-variety bacteria. “You need to quickly eliminate the ‘human flora’ before sending a sample to ultra-high-throughput sequencing,” says Todd. “We aren’t exactly looking for a needle in a haystack — we’re looking for multiple needles and each one is different.”

A better analogy is a jumble of 100 different disassembled pocket watches, with one of the original pocket watches representing a pathogen. “You have to sort through the watch parts to identify and discard everything you recognize. You have to simplify the mixture so there are more parts of significance to the pathogen,” he explains. “It’s not enough to just identify those parts that are unique — you also have to reassemble the pathogen watch.”
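
In computational terms, that sorting step is background subtraction: index every fragment you already recognize, then discard the reads that match. Below is a minimal sketch of the idea using a simple k-mer filter in Python; the function names, the 21-base k-mer size, the matching threshold, and the toy sequences are illustrative assumptions, not RapTOR’s actual design.

```python
# A minimal sketch of host-background subtraction: the computational
# analogue of discarding every watch part you recognize. All names,
# parameters, and sequences here are illustrative, not RapTOR's code.

K = 21  # k-mer length; odd sizes like 21 are common in sequence classification

def kmers(seq, k=K):
    """Yield every k-length substring of a read."""
    for i in range(len(seq) - k + 1):
        yield seq[i:i + k]

def build_host_index(host_sequences, k=K):
    """Collect every k-mer seen in known background (e.g., human) sequence."""
    index = set()
    for seq in host_sequences:
        index.update(kmers(seq, k))
    return index

def looks_like_host(read, index, threshold=0.5):
    """Call a read 'background' if most of its k-mers match the host index."""
    total = hits = 0
    for km in kmers(read):
        total += 1
        hits += km in index
    return total > 0 and hits / total >= threshold

def subtract_host(reads, index):
    """Keep only the reads that do not look like known background."""
    return [r for r in reads if not looks_like_host(r, index)]

# Toy demonstration: the host-like read is dropped, the unknown read survives.
host_index = build_host_index(["ACGT" * 20])         # stand-in "human" reference
reads = ["ACGT" * 8, "TTGACCGTAAGCTTAGGCTAAGGCCTA"]  # host-like, then unknown
print(subtract_host(reads, host_index))
```

Real pipelines use far more sophisticated indexing and alignment, and, as the watch analogy makes clear, subtraction is only half the job: the surviving reads still have to be assembled back into a coherent pathogen genome.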

Molecular biologists employ a number of methods to prepare samples for ultra-high-throughput sequencing; the challenge for the RapTOR project is adapting those methods to a portable, automated platform. Todd explains that these methods require days of work by a highly trained scientist on the bench.

“Our overall goal is a 24-hour turnaround. An end user would inject a sample of blood into the system, which then runs the sample through a number of molecular biology manipulations and sends it off to a DNA sequencer,” says Todd. “The sequencer provides enriched information that the user sorts through to identify and characterize the sample.”

The team already has succeeded in adapting one method to a microfluidic platform: normalization, which removes high-abundance genetic material, leaving a small but representative amount of all the genetic material found in a sample. Currently, normalization is performed using an enzymatic digestion process that relies on enzymes from Siberian Kamchatka crabs.

A faster, cheaper, and simpler method

To re-engineer normalization, the researchers looked back 20 years to hydroxyapatite chromatography, a resin-based method that was discarded because it was difficult to reproduce. “Modern resins are commercially available and well-defined, so hydroxyapatite chromatography is relevant again,” says Todd. “We created a capillary-based system to perform hydroxyapatite chromatography. It’s a faster, cheaper, and simpler method that doesn’t destroy the material in the process.”

In the traditional, benchtop normalization process, double-helix DNA is heated to separate the two strands. As the DNA cools, the genes that are expressed in high abundance will find their partner strands more quickly than those expressed in low abundance. A researcher stops the cooling sequence, adds the crab enzyme to remove the double-stranded DNA, and with some additional manipulation, the resulting DNA all appears in low abundance.

Hydroxyapatite chromatography follows the same process, substituting a phosphate buffer for the crab enzyme to remove the double-stranded DNA without destroying any of it. The entire process can be automated.
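
The reason re-annealing normalizes a sample is second-order kinetics: a strand finds its partner at a rate proportional to that partner’s concentration, so abundant species become double-stranded (and are removed) long before rare ones. For ideal second-order hybridization, a species starting at concentration c0 remains single-stranded at ss(t) = c0 / (1 + k·c0·t). The toy Python calculation below illustrates the effect; the species names, concentrations, and rate constant are invented for illustration, not measured data.

```python
# Toy model of normalization by re-annealing kinetics. For ideal
# second-order hybridization, the single-stranded fraction of a species
# with initial concentration c0 decays as ss(t) = c0 / (1 + k * c0 * t),
# so abundant species re-anneal, and get removed, much sooner than rare
# ones. The species and numbers below are invented for illustration.

RATE = 1.0  # shared hybridization rate constant (arbitrary units)

def single_stranded(c0, t, k=RATE):
    """Single-stranded concentration remaining after annealing time t."""
    return c0 / (1.0 + k * c0 * t)

sample = {"host gene": 1000.0, "common flora": 50.0, "pathogen": 1.0}

for t in (0.0, 0.1, 1.0, 10.0):
    # Double-stranded DNA is removed at this point (crab enzyme on the
    # bench, or a phosphate buffer over hydroxyapatite in RapTOR's
    # version); only the single-stranded fraction is carried forward.
    kept = {name: single_stranded(c0, t) for name, c0 in sample.items()}
    print(f"t={t:5.1f}: " + ", ".join(f"{n}={v:8.3f}" for n, v in kept.items()))

# By t=10 the 1000 : 50 : 1 input has collapsed to roughly
# 0.100 : 0.098 : 0.091, i.e., near-equal abundance: "normalized."
```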

The RapTOR normalization method is now being tested against human clinical samples for “fevers of unknown origin” that did not develop beyond mild sickness. “These are outbreaks that get ignored because they are self-limiting, but the samples are a perfect test of our system,” says Todd. “If we can handle a small outbreak, we can handle something larger.”

The RapTOR normalization method has caught the attention of sequencing companies and government agencies that Todd says are very interested in helping bring such a device to market. He expects a prototype to be ready in the fall. 

RapTOR must do more than just normalize samples, so the researchers are turning their attention to other DNA manipulation methods such as ligation (linking two small pieces), digestion (cutting a larger piece into fragments), and size-based separations. Adapting some of these methods will prove simpler than others. Sandia’s microfluidic platform was developed for proteins, which are far more complex than DNA samples.

The LDRD project brings together a broad range of technical disciplines, unusual for a Grand Challenge but well-suited to Sandia’s capabilities. “Everyone has to work together: the microfluidicists, the microbiologists, and the bioinformaticists,” says Todd. “As we process samples, the data analysis and knowledge discovery group will provide feedback on the quality of the information being produced. The upstream process is tunable.”

RapTOR is designed to be a public health tool applicable for day-to-day work. “If you develop a tool like this, you need to regularly apply it to real-world scenarios. You can’t put it in a glass box and wait for the horrible attack,” says Todd.

The tool will have applications in other fields as well, such as environmental detection. Todd explains that RapTOR could be used to take regular atmospheric samples and analyze their genetic makeup to develop a baseline. This application has garnered interest from the Defense Threat Reduction Agency, DoD, and DHS.

RapTOR will never eliminate the problem of unknown unknowns, but it will make the path from unknown to known faster and simpler, says Todd.

RapTOR for algae: Understanding pond collapse

Last month, Todd Lane (8623), Jeri Timlin (8622), and Ben Wu (8125) received $800,000 in funding over two years from the DOE Biomass Program for their proposal “Pond Crash Forensics.” Using pathogen detection and characterization technologies developed under the RapTOR Grand Challenge, they will compare the environmental conditions and metagenomes of algal samples taken from normal ponds to those taken from ponds that have undergone collapse.

Algae are widely viewed as a potential source of renewable fuel, but the technology to mass-produce fuel-grade algae is still in the early stages. A major roadblock, says Todd, is the inability to produce large amounts of algae.

Algae are commonly grown in raceway ponds: large, shallow, artificial ponds that serve as fields for algae crops. “Pathogens and viruses fall into these ponds and can crash a pond overnight,” says Todd. “No one has identified many of the agents that are causing these pond crashes. You can’t develop countermeasures without understanding why something is happening. This is a complex problem with a lot of factors at play.”

He adds that this is a mostly unexplored area because growing algae is closer to farming than biotechnology. “This is a good application for RapTOR because, like clinical blood samples, there is a lot of naturally occurring stuff to sort through before you can find the pathogen or virus,” says Todd. “It’s a really good niche for Sandia, to provide a service that will be of great benefit to the algal biofuel industry that will in turn greatly benefit the nation.” -- Patti Koning



DHS/S&T, Sandia developing new technical capability for emergency exercises

By Mike Janes

 The Department of Homeland Security’s Science and Technology Directorate (DHS/S&T), working with researchers at Sandia, is developing a new software architecture that will help emergency incident planners and responders from around the country more effectively use and integrate advanced simulation models.


Results from running the SUMMIT template (screenshot above) provide an integrated picture of what has occurred in the simulated scenario. Results of interest include plume dispersion and the location of hospitals and urgent care facilities. Associated medical resource needs are shown in tables and plots.



The software package, known as the Standard Unified Modeling, Mapping & Integration Toolkit (SUMMIT), will help a range of exercise professionals at the federal, regional, and local levels tap into existing models to ensure consistency, accuracy, and robustness when exercise scenarios are developed and played out. Work on SUMMIT is the central focus of the DHS-directed Integrated Modeling, Mapping, and Simulation (IMMS) project, now in its second year.


SUMMIT is not a new modeling or simulation tool, but rather is meant to take advantage of the significant investments that DHS and others have made throughout the years, explains Jalal Mapar, the DHS/S&T program manager.

“The departments of Energy, Defense, and Homeland Security — among others — already do a lot of modeling and simulation,” says Jalal. “What is urgently needed, then, is not a whole new set of models, but an easy and user-friendly way to access and link together the most appropriate models for a given emergency drill.”

Though current modeling tools are effective on their own, says Jalal, they all incorporate different assumptions, so synchronizing their outputs currently requires a large amount of time, resources, and human effort.

For instance, an exercise scenario might involve an improvised nuclear device blast in a large midwestern American city. The exercise planners, in order to create the most accurate and effective scenario, need to know details on the device, blast effects, numbers of immediate casualties, information on damaged buildings and infrastructure, radiation exposure to citizens, and other key pieces of information. While current models exist to provide the information in piecemeal fashion, there is no automated method for sharing information among the various models.

“You can’t really calculate infrastructure impact until you know the technical details of the blast,” says IMMS principal investigator Karim Mahrous (8116), pointing out the dilemma in having separate data sets that don’t “talk” to one another. “We need a technical solution for linking the models together instead of having humans do the expensive and time-consuming interpretation themselves.”

SUMMIT, says Karim, does not require exercise planners and participants to be experts in modeling and simulation. Instead, they would merely input the information they need for a particular scenario, and then allow SUMMIT to process the information.

“With minimal user input, the system links to the appropriate models, takes the information the planner has given it, pushes it out to the models in the proper format, does any necessary translations to make sure everything is consistent, then brings back an integrated story that can be used in the exercise,” Karim says. The SUMMIT system will work with as few as two models, or it could involve many. “It will be largely invisible to the user,” says Karim.
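
One way to picture such a linking layer is the adapter-and-orchestrator pattern sketched below: a shared scenario record, stand-in models that read from and write to it, and an orchestration step that runs them in dependency order. The model names, data fields, and scaling formulas are invented for illustration; this is a sketch of the concept, not SUMMIT’s actual architecture.

```python
# Conceptual sketch of linking simulation models the way the article
# describes: a common scenario record, per-model adapters that translate
# to and from each model's view of the data, and an orchestrator that
# chains them. Model names, fields, and formulas are invented here.

from dataclasses import dataclass, field

@dataclass
class Scenario:
    """Shared state passed between models; units are fixed here once."""
    yield_kt: float                    # device yield, kilotons
    city: str
    results: dict = field(default_factory=dict)

class BlastModel:
    """Stand-in for a blast-effects code expecting yield in kilotons."""
    def run(self, scenario):
        # Toy cube-root scaling law, purely for illustration.
        scenario.results["damage_radius_km"] = 2.2 * scenario.yield_kt ** (1 / 3)

class CasualtyModel:
    """Stand-in for a casualty model that consumes the blast output."""
    def run(self, scenario):
        radius = scenario.results["damage_radius_km"]  # depends on BlastModel
        scenario.results["est_casualties"] = int(15000 * radius ** 2)

def orchestrate(scenario, models):
    """Run models in order; each sees the integrated state so far."""
    for model in models:
        model.run(scenario)
    return scenario.results

print(orchestrate(Scenario(yield_kt=10.0, city="example"),
                  [BlastModel(), CasualtyModel()]))
```

The payoff is the one Karim describes: the dependency (the casualty estimate needs the blast output) is encoded once in the linking layer, rather than being reconciled by hand each time an exercise is planned.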

The broader goal, says Jalal, is to make the SUMMIT capability a pervasive part of preparedness and response for emergency managers, responders, and exercise teams in federal, state, and local governments. -- Mike Janes



Reinvigorated Center for Cyber Defenders taking shape at Sandia/California site

By Mike Janes


Sandia’s well-known Center for Cyber Defenders (CCD) initiative, which has enjoyed a long track record of success in Albuquerque and California, has seen its impact as an employment pipeline and developer of qualified cybersecurity professionals decrease in the Golden State in recent years.

But things are about to change.


DIGITAL NATIVES like Jon Avery, Ben Schmoker, and Steve Cramer are among a new crop of Sandia/California student interns taking part in the Center for Cyber Defenders. They and future CCD interns in California are expected to take on the hard task of structuring and developing experimental science around cybersecurity and information technology. (Photo by Dino Vournas)

Bob Hutchinson (8960), manager of the computer sciences and information operations group, has set a robust goal for Sandia/California’s new CCD effort: 50 students in the summer of 2011, with a long-term objective of 100 or more students in the summer and perhaps as many as 50 students year-round. Bob and director Len Napolitano (8900) envision a CCD in California that is grounded in science and engineering, with a solid understanding of cyber vulnerabilities.

Though the CCD started at Sandia/California in the 1990s and enjoyed considerable success and accolades, the flow of interns began to slow when the academic world “caught up” with the Labs’ original vision, says Bob.

“The original CCD concept was born out of necessity,” he says. “It was quite simple: We needed to develop employees who could work in computer security, and no one else was doing this sort of thing at the time. We exposed them to projects and research and concepts in information security so that they could get a sense for whether it was the right career for them.

“It was a tremendous success,” Bob continues. “But then universities started developing coursework in computer security, cryptography, the fundamentals of information assurance, and even digital forensics. So once there was no longer a need for us to inform the students of what it would be like to work in this field — a need that was suddenly being met by the universities — the CCD program in California became less of a factor.”

The CCD program in New Mexico, says Bob, has continued to thrive with a project-based framework, one in which students can help Work for Others (WFO) sponsors with specific information security challenges.

But a strictly project-based format may not continue to work as well into the future, says Bob, since the attractiveness of having student interns versus regular staff executing a project will naturally diminish as “digital natives” become the norm in the workplace and replace their “digital convert” predecessors.

(Generally speaking, a digital native is someone who was born after the emergence of digital technology and, as a result, has a lifetime of familiarity with computers, the Internet, mobile smart phones, and other digital devices.)

The appeal of bringing in new students will eventually fade, Bob says, since the expertise they now offer will become routine.

Instead, Bob says, a more suitable role for Sandia is to transition the CCD from what is today an art and practice of cybersecurity into a discipline grounded in engineering and science principles. This would include fundamental questions about the security of computer systems and the vulnerabilities inherent in them.

“Ultimately, it is people who attack computer systems,” says Bob. “So we need individuals who are trained to understand the cognitive and social science aspects of computer security.” Bob muses that the CCD might even be able to publish a book one day on the theory of vulnerabilities that serves as the basis for producing more robust computer systems.

The revamped CCD in California also wants students who are more proof-and-reason based, Bob says, individuals who want to take on the hard task of structuring and developing experimental science around cybersecurity and information technology.

The driving force behind the CCD continues to be the federal government’s need to have a hiring pool of cybersecurity professionals to draw upon. “We need to help the government acquire a supply of high-quality labor,” Bob asserts. “They need to be threat-informed, and they need to have the skills and tools that can wipe out classes of vulnerabilities, as opposed to today’s model, which is very much a ‘find-a-vulnerability, fix-a-vulnerability’ model,” he says.

In addition to project work and ongoing access to a pool of cybertalent, a key part of the CCD’s value in California will be to deliver positive publicity for sponsors. “A sponsor’s involvement in the CCD demonstrates that they’re investing in people, in the development of a workforce, and that they’re committed to making the US stronger through its educational programs. It shows that they’re forward-thinking,” Bob says.

In developing the program, Sandia is exploring a variety of funding sources and potential partners, including universities. “We may even need to create relationships with schools that are willing to assign course credit for coming here and working,” Bob says.

He acknowledges that he’s less familiar with industry models for talent acquisition yet is still optimistic that valuable relationships can be forged. “Maybe we can provide value by saying, ‘You want to develop this type of a software security product, and through our internship program, we can mentor students to help develop the concepts and the product,’” Bob says. “Or an industry partner might be persuaded to send its new employees to our program to quickly develop an elevated threat awareness and build a peer network.”

For now, Bob says he and his colleagues are developing a compelling value statement for government and industry sponsorship and identifying the many tasks and activities that need to be completed prior to the CCD’s doors being thrown open next summer.

“The biggest problem with cybersecurity today is that it’s been over-admired,” Bob concludes. “Go online and look at cybersecurity, and what you’ll find is a wealth of analysis and papers out there, and every one of them describes the problem a little better than the previous one.

“But where are the solutions going to come from? They’re going to come from people conducting experiments and reasoning, developing a theory of vulnerabilities, then taking that theory and moving it into real applications. Sandia’s CCD is here to provide those solutions.” -- Mike Janes
