SANDIA LAB NEWS

Lab News -- Nov. 19, 2010

Sandia images the sea monster of nuclear fusion: the Rayleigh-Taylor instability

By Neal Singer

Researchers at Sandia’s Z machine this summer applied a new X-ray imaging capability to take pictures of a critical instability at the heart of the huge accelerator. The pictures may help remove a major impediment in the worldwide, multidecade, multibillion-dollar effort to harness nuclear fusion to generate electrical power from seawater.


BUT IS IT ART? Dan Sinars examines one of the aluminum cylinders used in the Z pulsed-power experiments. The monitor on the X-ray machine in the background displays a highly magnified, pre-experiment view of the wavering profile machined into the outer edge of the cylinder. These ripples were used to intentionally seed the growth of the instability. (Photo by Randy Montoya)

“These are the first controlled measurements of the growth of magneto-Rayleigh-Taylor [MRT] instabilities in fast Z-pinches,” says project lead Daniel Sinars (1643).

MRT instabilities are spoilers that arise wherever electromagnetic forces are used to contract (pinch) a plasma, which is essentially a cloud of ions. The pinch method is the basis of the operation of Z, a dark-horse contender in the fusion race.

A pinch contracts plasma so suddenly and tightly that hydrogen isotopes available from seawater, placed in a capsule, should fuse.

That’s the intent. Instead, the instability crimps the cylindrically contracting plasma until it resembles a string of sausages or shreds the plasma into more fantastic, equally useless shapes. This damaged contraction loses the perfect symmetry of forces necessary to fuse material.

Fast pinches at Z, which take place in less than 100 nanoseconds, already have produced some neutrons, proof that fusion is occurring. But the MRT instability is a major reason the machine has not yet produced enough neutrons to serve as a source of peacetime electrical power.

Dan led seven experimental shots to map the disturbance. The experiments were motivated by a concept proposed last year by Steve Slutz (1644). Traditionally, scientists would use an array of spidery wires to create a compressed, X-ray-generating ion cloud. The X-rays were then used to compress fusion fuel. Steve suggested instead that the magnetic pinching forces could be used to directly fuse fuel by compressing a solid aluminum liner around fusion material preheated by a laser.

Because the new concept would not produce X-rays as a heating tool but would instead rely on directly compressing the fuel with magnetic pressure, the MRT instability was the primary threat to its success.

“Once we started looking at solid liners, it was easy to conceive of doing a controlled experiment to study the growth of the instability,” says Dan.

This is because experimenters could etch solid tubes to whatever degree they desired to provoke the instability. Accurate etching is not an option with fragile wire arrays.

The MRT problem occurs because even minute dips in a current-carrying surface — imperfections merely 10 nanometers in amplitude — can grow exponentially to millimeter scales. In the experiments by Dan and others, the tubes were scored with a sinusoidal perturbation to intentionally start this process.
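Those figures give a feel for how violent the growth is. As a back-of-the-envelope sketch (assuming the amplitude grows simply exponentially over the whole pulse, an idealization of the linear-regime behavior rather than a figure from the paper), a 10-nanometer seed reaching 1 millimeter within the roughly 100-nanosecond pinch implies a growth rate

a(t) = a_0 e^{\gamma t} \quad\Rightarrow\quad \gamma \approx \frac{1}{t}\ln\frac{a}{a_0} = \frac{\ln\left(10^{-3}\,\mathrm{m} / 10^{-8}\,\mathrm{m}\right)}{100\ \mathrm{ns}} \approx 1.2\times10^{8}\ \mathrm{s}^{-1},

meaning the ripple amplitude e-folds roughly every 9 nanoseconds.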

“The series of pictures over a time scale of 100 nanoseconds brought the life of the MRT into focus,” says Dan.

Previously, competing computer simulation programs had given conflicting predictions as to the extent of the threat posed by the MRT instability, leaving researchers in the position, says Dan, of “a man with two watches: he never really knows what time it is.”

By showing which predictions match reality, the measurements let researchers rely on the more accurate simulations to tweak the conditions of future Z firings and more effectively combat the instability.

Researchers believe that with thick liners and control of the MRT, Z could achieve an output of 100 kilojoules to match the 100 kilojoules input to the fuel to start the fusion reaction. “That would be scientific breakeven,” says Dan. “No one has achieved that.”
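In energy-gain terms (a standard definition; the article itself gives only the kilojoule figures), scientific breakeven corresponds to a fusion gain of one:

Q \equiv \frac{E_{\mathrm{fusion\ out}}}{E_{\mathrm{input\ to\ fuel}}} = \frac{100\ \mathrm{kJ}}{100\ \mathrm{kJ}} = 1.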

But that day, he says, may be just two to three years away.

The work is reported in a paper in the Oct. 29 issue of Physical Review Letters and was the subject of Dan’s invited talk on Nov. 17 at the APS Plasma Physics meeting in Chicago. The work was paid for by Sandia’s Laboratory Directed Research and Development program and DOE.

-- Neal Singer



Small-scale technologies for large-scale biofuels production

By Patti Koning

You could call November “Microfluidics meets Bioenergy Research” month. Work led by Rajiv Bharadwaj and Aarthi Chandrasekaran (both 8621) and performed at the Joint BioEnergy Institute (JBEI) appears this month in three prominent scientific journals, including on the cover of Analytical Chemistry.


SMALL BUT SPEEDY: Rajiv Bharadwaj holds up the chip that is the basis of a high-throughput method to evaluate the effectiveness of digestion methods. The method, featured on the cover of Analytical Chemistry this month, provides quantitative information about all sugars in a sample, not just the monosaccharides. (Photo by Randy Wong)


“We’re coming back from a 35-year hiatus on bioenergy research. There was a lot of effort in the 1960s and 1970s, but when gas prices tanked, the lack of funding and political will brought the work to a halt,” says Anup Singh (8621). “With the crises in oil-producing nations and the sudden spike in oil prices a few years ago, research has started up again with a major influx of funds from government and industry.” Anup is the manager of the biosystems and bioengineering group at Sandia and director for high-throughput chemical analysis at JBEI.

Because of that long pause, the tools to conduct basic research are sorely out of date. Advances in other areas of biology often don’t easily translate to bioenergy.

“We need tools that allow researchers to screen processes and molecules in a high-throughput manner and do assays faster and cheaper with smaller amounts,” Anup says. “The microfluidics core at JBEI has been examining the state of the art to see how the lack of technology is slowing down researchers in terms of getting to their biological end goals. We now have some great successes, with a common theme of increasing speed and efficiency, to share with the rest of the bioenergy community.”

Rajiv is the lead author of a paper titled “Microfluidic glycosyl hydrolase screening for biomass-to-biofuel conversion,” the cover story for the Nov. 15 issue of Analytical Chemistry. The other authors (all part of JBEI) are Zhiwei Chen (8634), Supratim Datta (8634), Bradley Holmes (8634), Rajat Sapra (8634), Blake Simmons (8630), Paul Adams (Lawrence Berkeley National Laboratory), and Anup.

This paper describes a new method for screening the effectiveness of digestion, the process by which treated biomass is converted into fermentable sugars. To evaluate a particular digestion method, researchers need to know the profile of the various sugars it produces. The characterization of various oligosaccharides produced during biomass digestion is critical for the design of suitable reactors, enzyme cocktail compositions, and biomass pretreatment schemes.

The state-of-the-art screening technique is high-performance liquid chromatography (HPLC), which takes several hours to complete. Rajiv’s research moved this screening process to a chip, drastically reducing the amount of sample needed and slashing the time from hours to just minutes.

“Our goal was to create something you can keep running all day and process hundreds of samples in the time it used to take to do one. This method is much more sophisticated because it provides quantitative information about everything in the sample, not just the monosaccharides,” he says.

This chip-based screening method also has potential for industrial settings, Rajiv adds. “Industrial production will be greatly scaled up from what we are doing in the lab, so the ability to troubleshoot quickly is essential,” he explains. “You can also use this method to prescreen each batch of biomass digestion so you don’t waste fermenter time.”

Another of Rajiv’s projects is described in the paper “High-throughput enzymatic hydrolysis of lignocellulosic biomass via in-situ regeneration,” which appears in Bioresource Technology. The other authors are April Wong (8621), Bernard Knierim (JBEI), Seema Singh (8934), Bradley, Manfred Auer (JBEI), Blake, Paul, and Anup.

This project tackles the logistical problem of accurately metering small amounts of insoluble biomass for enzymatic digestion. The metering of ionic liquid (IL) pretreated biomass is typically performed either by weighing biomass or dispensing a biomass slurry solution. Both of these approaches are cumbersome and prone to variability, especially for high-throughput screening.

Rajiv hit upon a novel solution by observing that when IL-solubilized biomass is placed in a microwell containing water, the biomass precipitates to the bottom. In this volumetric metering method, a researcher washes the biomass with water and adds the enzyme, all within the same microwell. No need for weighing biomass or metering slurry.

This method is also an ideal way to prepare a pure sample of biomass for imaging. “The imaging people see this as an imaging platform,” Rajiv says. “That wasn’t our original intent, but it’s another application of this method.”        

Speed and efficiency also are the focus of the third JBEI paper. “A microscale platform for integrated cell-free expression and activity screening of cellulases” appears in the November issue of the Journal of Proteome Research. Aarthi is the lead author with contributions from Rajiv, Joshua Park (8634), Rajat, Paul, and Anup. 

This work addresses the problem of screening hundreds of thousands of cellulase variants, which is currently done by expressing the enzymes in E. coli or yeast, a process that can take weeks.

“The question is, ‘can we bypass the entire process by doing the first level of screening in something that is higher throughput, much faster, and simpler?’ And expressing cell-free, with no need for living E. coli?” Aarthi asks.

The research team developed a first-pass screening device for quantitative large-scale screening of cellulase variants. They adapted commercial off-the-shelf cell-free expression kits to express a large number of cellulases. Using a microfluidics platform, the scientists integrated an assay for evaluation and connected it to a fluorescent readout, enabling what Aarthi terms “a one-stop shop.”

“It’s a quick, simple solution to a very costly problem,” she says. “The microfluidics platform achieves the entire process of transcription, translation, and activity screening within two or three hours, compared with the days necessary for conventional cell-based cellulase expression, purification, and activity screening.”

By performing expression and screening within the same reaction volume, researchers can express many variants of cellulases and know almost immediately if they are active or not. Scaling down the dimension and volume to a microfluidics platform accelerates the process and reduces the amount of reagent needed, a tremendous cost savings.
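For a rough sense of that savings (the volumes here are assumed for illustration; the paper’s actual dimensions may differ), shrinking a reaction from a typical 100-microliter microplate well to a 1-microliter microfluidic chamber cuts reagent consumption a hundredfold per assay: 100 µL / 1 µL = 100.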

“The methods described in these papers will enable scientists at JBEI and beyond to push the boundaries of bioenergy research,” Anup says. “The faster we can screen and evaluate components of the biomass-to-biofuel process, the closer we get to moving biofuels from a concept to a reality.”

JBEI is one of three DOE Bioenergy Research Centers. This San Francisco Bay Area scientific partnership is led by Lawrence Berkeley National Laboratory and includes Sandia, the University of California (UC) campuses at Berkeley and Davis, the Carnegie Institution for Science, and Lawrence Livermore National Laboratory.

-- Patti Koning



Annual survey tracks evolving attitudes about national security issues

By Shannon Guess


More than three-quarters of respondents in a recent national survey said they believe today’s world is a more dangerous place for the US than it was during the Cold War. The same survey found that, over the past year, Americans’ perception of the effectiveness of security efforts at US airports, seaports, and the nation’s borders has declined. Respondents also said they would prefer to see, within the next 20 years, an energy mix comprising 50 percent renewable energy sources (currently 6 percent), 28 percent fossil fuels (currently 85 percent), and 22 percent nuclear generation (currently 8 percent).

The findings come from the 2010 National Security Public Attitudes Project, conducted by principal investigators Hank Jenkins-Smith, Kerry Herron, and Carol Silva of the Center for Applied Social Research (CASR) at the University of Oklahoma. Conducted since 1993, this unique annual study assesses Americans’ attitudes toward a broad range of national security issues. Sandia cofunds the study with the University of Oklahoma and uses the results to help inform its mission areas.

Results of the latest survey were presented at Sandia in late October.

“Although we, as Sandians, are immersed in elements of national security in real and tangible ways, it is important to keep in mind that the public’s attitudes and understanding of issues play an essential role in setting national policy,” says Lori Parrott, manager of Strategic Studies Dept. 552. “For example, the public plays a key role in shaping national security priorities and outcomes. Understanding the public’s perceptions of their safety and security, as they relate to dimensions of national security, assists us in understanding how best to inform national debate.”

The National Security Public Attitudes Project is a unique study in two significant ways. No other study has been conducted over as long a period of time, and no other study covers as broad a scope of national security topics.

“While similar studies focus on public attitudes toward specific topics, the National Security Public Attitudes Project assesses attitudes toward many dimensions of energy security, nuclear security, and the threat of terrorism,” Lori says. “Its scope allows us to generate a much more complete view of how the public perceives national security.”

With regard to energy security, the study’s 2010 results revealed that respondents’ confidence in the nation’s energy future is on an upward trend, increasing from 5.01 in 2008 to 5.63 (on a Likert-type scale, with 1 meaning “not confident at all” and 10 meaning “completely confident”). Further, respondents reported that the most important prospective energy technology R&D would be in areas of solar, wind, and hydro energy, while they ranked clean coal, nuclear energy, and oil and gas as least important for investment.

Shifting to the second portion of the study, public attitudes on nuclear security mirror the complexity of the issue. While many respondents consider nuclear abolition to be desirable, 80 percent do not think it is feasible.

According to the study’s results, 73 percent of respondents believe that “the US nuclear arsenal deters attacks and ensures our security, and that these benefits far outweigh any risks from US nuclear weapons.” Further, a large majority does not want the US to have fewer nuclear weapons than any other country, and eight out of 10 people reject unilateral US nuclear disarmament.

Respondents said they believe nuclear weapons are relevant for deterring other countries from using nuclear weapons against the US, deterring other countries from providing nuclear weapons or material to terrorist groups, and for helping maintain US international influence, status, and military superiority. However, a nearly even split emerged concerning the effectiveness of the US nuclear arsenal in deterring the use of weapons of mass destruction by terrorists and nonstate actors, with 51 percent perceiving great utility and 49 percent perceiving little utility.

In addition to issues related to nuclear weapons and energy policy, the 2010 National Security Public Attitudes Project questioned respondents about the threat of terrorism. After predictably peaking right after the Sept. 11 attacks, respondents’ average assessment of the overall threat of terrorism in the US has declined approximately 16 percent, though it remains well above pre-Sept. 11 levels (7.55 on a Likert-type scale, with 0 meaning “no threat” and 10 meaning “extreme threat”).

Respondents indicated midscale confidence in the US government to prevent terrorist attacks in the US (5.01 on a Likert-type scale, with 1 meaning “not confident at all” and 10 meaning “completely confident”), as well as midscale confidence in the nation’s ability to accurately assess threats of terrorism both at home and abroad (4.77 in the US and 4.46 abroad).

The 2010 survey also probed respondents’ opinions about measures designed to mitigate the threat of terrorism. The top measures for which respondents reported tolerance were restricting immigration (70 percent), requiring national identification cards (60 percent), and monitoring telephone calls (49 percent). The areas of least reported tolerance were monitoring behaviors (54 percent), taking unauthorized photos (55 percent), and sampling DNA (50 percent). Tolerance for trading personal liberties for increased levels of security increases with age, political conservatism, and perceptions of the terror threat.

The 2010 National Security Public Attitudes Project will soon be available as a SAND report.

-- Shannon Guess
