Globally, there is no shortage of security threats. Many demand priority engagement, and there can never be adequate resources to address them all. In this context, climate is just another aspect of global security, and the Arctic just another region. In light of physical and budgetary constraints, new security needs must be integrated and prioritized alongside existing ones. This discussion approaches the security impacts of climate from that perspective, starting with the broad security picture and establishing how climate may affect it. This method provides a different view from one that starts with climate and projects it, in isolation, as the source of a hypothetical security burden. That said, the Arctic does appear to present high-priority security challenges. Uncertainty in the timing of an ice-free Arctic affects how quickly the region will become a security priority, and uncertainty in the emerging extreme and variable weather conditions will determine the difficulty (cost) of maintaining adequate security (order) there. The resolution of sovereignty boundaries affects the ability to enforce security measures, and the U.S. will most probably need a military presence to back up negotiated sovereignty agreements. Even without additional global warming, technology already allows the Arctic to become a strategic link in the global supply chain, possibly with northern Russia as its main hub. Moreover, the multinational corporations reaping the economic bounty may affect security tensions more than nation-states themselves. Countries will depend ever more heavily on global supply chains, and China in particular needs to protect its trade flows. In matters of security, nation-state and multinational-corporate interests will become heavily intertwined.
This document summarizes the work performed under the LDRD project entitled 'Interface Physics in Microporous Media'. The presence of fluid-fluid interfaces, which can carry non-zero stresses, distinguishes multiphase flows from more readily understood single-phase flows. In this work the physics active at these interfaces has been examined via a combined experimental and computational approach. One of the major difficulties in examining true microporous systems of the type found in filters, membranes, geologic media, etc., is geometric uncertainty. To facilitate the examination of transport at the pore scale without this complication, a significant effort has been made in the fabrication of both two-dimensional and three-dimensional micromodels. Using these micromodels, multiphase flow experiments have been performed for liquid-liquid and liquid-gas systems. Laser scanning confocal microscopy has been utilized to provide high-resolution, three-dimensional reconstructions as well as time-resolved, two-dimensional reconstructions. Computational work has focused on extending lattice Boltzmann (LB) and finite element methods for probing the interface physics at the pore scale. A new LB technique has been developed that provides a more than 100x speedup for steady flows in complex geometries. A new LB model has been developed that allows for arbitrary density ratios; the density ratio has been a significant obstacle in applying LB to air-water flows. A new reduced-order model has been developed and implemented in a finite element code for examining non-equilibrium wetting in microchannel systems. These advances will enhance Sandia's ability to quantitatively probe the rich interfacial physics present in microporous systems.
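To make the lattice Boltzmann machinery concrete, the sketch below implements a minimal single-phase D2Q9 BGK update (collision plus periodic streaming) in Python/NumPy. It illustrates only the core kernel such methods share; the project's accelerated steady-flow technique and arbitrary-density-ratio multiphase model add further machinery (interaction forces, modified equilibria) that is not reproduced here.

    import numpy as np

    # D2Q9 lattice: rest particle, 4 axis directions, 4 diagonals
    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)

    def equilibrium(rho, ux, uy):
        """Maxwellian equilibrium truncated to second order in velocity."""
        cu = 3.0 * (c[:, 0, None, None]*ux + c[:, 1, None, None]*uy)
        usq = 1.5 * (ux**2 + uy**2)
        return rho * w[:, None, None] * (1.0 + cu + 0.5*cu**2 - usq)

    nx, ny, tau = 64, 32, 0.8                  # grid size, BGK relaxation time
    f = equilibrium(np.ones((ny, nx)), np.zeros((ny, nx)), np.zeros((ny, nx)))

    for step in range(1000):
        rho = f.sum(axis=0)                    # density moment
        ux = (f*c[:, 0, None, None]).sum(axis=0)/rho + 1e-6  # crude body force via velocity shift
        uy = (f*c[:, 1, None, None]).sum(axis=0)/rho
        f += -(f - equilibrium(rho, ux, uy))/tau             # BGK collision
        for i in range(9):                     # streaming with periodic wrap
            f[i] = np.roll(np.roll(f[i], c[i, 0], axis=1), c[i, 1], axis=0)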
This short-term, late-start LDRD examined the effects of nutritional deprivation on the energy-harvesting complex in microalgae. While the original experimental plan involved a much more detailed study of the effects of temperature and nutrition on the antenna systems of a variety of TAG-producing algae, and their concomitant effects on oil production, time and fiscal constraints limited the scope of the study. This work was a joint effort between research teams at Sandia National Laboratories, New Mexico and California. Preliminary results indicate that there is a photosystem response to silica starvation in diatoms that could impact the mechanisms of lipid accumulation.
The performance of the Advanced Synthetically Enhanced Detector Resolution Algorithm (ASEDRA) was evaluated by performing a blind test of 29 sets of gamma-ray spectra provided by DNDO. ASEDRA is a post-processing algorithm developed at the Florida Institute of Nuclear Detection and Security at the University of Florida (UF/FINDS) that extracts characteristic peaks in gamma-ray spectra. The QuickID algorithm, also developed at UF/FINDS, was then used to identify nuclides based on the characteristic peaks that ASEDRA infers from the spectra. The ASEDRA/QuickID analysis results were evaluated with respect to the performance of the DHSIsotopeID algorithm, a mature analysis tool that is part of the Gamma Detector Response and Analysis Software (GADRAS). The data used for the blind test were intended to be challenging: the radiation sources included thick shields around the radioactive materials as well as cargo containing naturally occurring radioactive materials, which masked emission from special nuclear materials and industrial isotopes. Evaluation of the analysis results against the ground-truth information (provided after the analyses were finalized) showed that neither ASEDRA/QuickID nor GADRAS could identify all of the radiation sources correctly. The purpose of this effort was primarily to evaluate ASEDRA, with GADRAS used as the standard against which ASEDRA was compared. Although GADRAS was somewhat more accurate on average, ASEDRA outperformed GADRAS on some of the unknowns; the fact that GADRAS also failed to identify many of the radiation sources attests to the difficulty of the blind-test data. The evaluation identified strengths and weaknesses of both analysis approaches, and it underscored the importance of good calibration data, because the performance of both methods was impeded by the inability to define the energy calibration accurately.

Acronyms:
ACHIP - adaptive chi-processed
ASEDRA - Advanced Synthetically Enhanced Detector Resolution Algorithm
DNDO - Domestic Nuclear Detection Office
DRFs - detector response functions
FINDS - Florida Institute of Nuclear Detection and Security
FWHM - full width at half maximum
GADRAS - Gamma Detector Response and Analysis Software
GUI - graphical user interface
HEU - highly enriched uranium
HPGe - high-purity germanium
ID - identification
NaI - sodium iodide
NNSA - National Nuclear Security Administration
NORM - naturally occurring radioactive materials
ppm - parts per million
SNL - Sandia National Laboratories
UF - University of Florida
WGPu - weapons-grade plutonium
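ASEDRA's synthetic resolution enhancement is not reproduced here, but the downstream step of extracting characteristic peaks can be illustrated on a synthetic NaI-like spectrum. The sketch below uses scipy.signal.find_peaks as a stand-in peak finder; the continuum shape, peak parameters, and thresholds are all invented for illustration.

    import numpy as np
    from scipy.signal import find_peaks

    # Synthetic NaI-like spectrum: decaying continuum plus two Gaussian photopeaks
    energy = np.linspace(0, 3000, 1024)                    # keV
    spectrum = 5e3 * np.exp(-energy / 500.0)               # crude Compton continuum
    for mu, sigma, amp in [(662, 40, 2e3), (1460, 60, 8e2)]:
        spectrum += amp * np.exp(-0.5*((energy - mu)/sigma)**2)
    spectrum = np.random.poisson(spectrum).astype(float)   # counting statistics

    # Peak extraction: prominence rejects the continuum, width rejects noise spikes
    idx, props = find_peaks(spectrum, prominence=100, width=5)
    print("extracted peak centroids (keV):", np.round(energy[idx]))

As the abstract notes, an accurate energy calibration (mapping channel to keV) is a prerequisite for both this toy example and the real algorithms, since nuclide identification keys on peak energies.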
Nano-structured palladium is examined as a tritium storage material with the potential to release beta-decay-generated helium at its generation rate, thereby mitigating the aging effects produced by growing He bubbles. Helium retention in the proposed structures is modeled by adapting the Sandia Bubble Evolution model to nano-dimensional material. The model shows that even with ligament dimensions of 6-12 nm, elevated temperatures will be required for low He retention. Two nanomaterial synthesis pathways were explored: de-alloying and surfactant templating. For de-alloying, PdAg alloys with piranha etchants appeared likely to generate the desired morphology with some additional development effort. Nano-structured 50 nm Pd particles with 2-3 nm pores were successfully produced by surfactant templating using PdCl salts and an oligo(ethylene oxide) hexadecyl ether surfactant. Tests were performed on this material to investigate processes for removing residual pore fluids and to examine the thermal stability of the pores. A tritium manifold was fabricated to measure the early He release behavior of this material and of Pd black, and is installed in the Tritium Science Station glove box at LLNL. Pressure-composition isotherms and particle sizes of a commercial Pd black were measured.
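For scale, the helium burden such a material must accommodate follows directly from tritium decay (half-life 12.32 years): the He-3-to-Pd atomic ratio after t years is x(1 - exp(-lambda*t)) for an initial tritium loading x. The sketch below evaluates this for an assumed loading of x = 0.6, a typical Pd tritide stoichiometry; the actual loading depends on pressure and temperature.

    import numpy as np

    T_HALF = 12.32                        # tritium half-life, years
    LAM = np.log(2) / T_HALF              # decay constant, 1/yr

    def he_per_pd(t_years, loading=0.6):
        """He-3 atoms generated per Pd atom after t_years (assumed T/Pd loading)."""
        return loading * (1.0 - np.exp(-LAM * t_years))

    for t in (1, 5, 10):
        print(f"{t:2d} yr: He/Pd = {he_per_pd(t):.3f}")   # ~0.03, 0.15, 0.26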
Manufactured parts are designed with acceptance tolerances, i.e., allowable deviations from ideal design conditions, because of unavoidable errors in the manufacturing process. The manufactured part must be measured and evaluated against the nominal design to determine whether it meets design specifications. The scope of this research project is dimensional acceptance of machined parts, specifically parts machined using numerically controlled (NC, or CNC for computer numerically controlled) machines. In the design/build/accept cycle, the designer specifies both a nominal value and an acceptable tolerance, and typical design/build/accept business practice requires verification that the part meets those values prior to acceptance. Manufacturing cost must therefore include not only raw materials and added labor, but also the cost of ensuring conformance to specifications, which is a substantial portion of the cost of manufacturing. In this project, the cost of measurements was approximately 50% of the cost of the machined part; in production the proportion would be smaller, but still substantial. The results of this research project point to a science-based approach to reducing the cost of ensuring conformance to specifications. Our approach is to determine, a priori, how well a CNC machine can manufacture a particular geometry from stock. Based on this knowledge of the manufacturing process, we can then distinguish features that need further measurement from features that can be accepted 'as is' from the CNC machine. By calibrating the machine tool and establishing a machining accuracy ratio, we can validate the ability of the CNC machine to fabricate to a particular level of tolerance, eliminating the cost of checking conformance for relatively large tolerances.
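As an illustration of the disposition rule this enables, the hypothetical sketch below compares a feature's tolerance band to the machine's calibrated process spread and accepts the feature without measurement when the ratio is comfortably large. The 4:1 threshold echoes common gauging practice; it and the function names are assumptions for illustration, not the project's validated criterion.

    def machining_accuracy_ratio(tolerance_band, machine_spread):
        """Design tolerance band divided by the machine's demonstrated spread
        (e.g., the 95% positional error established by calibration)."""
        return tolerance_band / machine_spread

    def disposition(tolerance_band, machine_spread, threshold=4.0):
        """Accept as-is when the machine is comfortably more accurate than the
        tolerance demands; otherwise flag the feature for inspection."""
        ratio = machining_accuracy_ratio(tolerance_band, machine_spread)
        return "accept as-is" if ratio >= threshold else "measure"

    # Example: a +/-0.05 mm tolerance (0.10 mm band) on a machine with a
    # calibrated 0.02 mm spread gives a ratio of 5.0 -> accept without measuring.
    print(disposition(0.10, 0.02))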
Advanced optically-activated solid-state electrical switch development at Sandia has demonstrated multi-kA/kV switching and a path for scaling to even higher current and power. Realizing this potential requires new optical sources and switches based on two key Sandia photonic device technologies: vertical-cavity surface-emitting lasers (VCSELs) and photoconductive semiconductor switch (PCSS) devices. PCSS devices offer high-voltage (multi-kV) operation, optical isolation, triggering by laser pulses that cannot occur accidentally in nature, low cost, high speed, small size, and radiation hardness, making them candidates for an assortment of applications that require multi-kA current switching. The key to increasing the switching capacity of PCSS devices to 5 kV/5 kA and beyond is to distribute the current among multiple parallel line filaments triggered by an array of high-brightness line-shaped illuminators. Commercial mechanically-stacked edge-emitting lasers have been used to trigger multiple filaments, but they are difficult to scale and to manufacture with the required uniformity. As a promising alternative, a single wafer of VCSELs can be lithographically patterned into the desired layout of parallel line-shaped emitters, in which adjacent lasers share identical semiconductor material and thereby achieve a degree of intrinsic optical uniformity. Under this LDRD project, we fabricated arrays of uncoupled circular-aperture VCSELs that approximate line-shaped illumination, with optical fill factors ranging from 2% to 30%, and used them to demonstrate single and dual parallel line-filament triggering of PCSS devices. In the process we developed a better understanding of the illumination requirements for stable triggering of multiple-filament PCSS devices: reliable triggering of multiple filaments requires matching the turn-on times of adjacent VCSEL line-shaped arrays to within approximately 1 ns, and reliable triggering at low voltages requires more optical power than our first generation of VCSEL arrays provided. A second generation of higher-power VCSEL arrays was designed and fabricated at the end of this project, and testing with PCSS devices is underway (as of September 2008).
Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high-consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering- and physics-oriented approaches to V&V toward a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve this accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that must be addressed as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC verification and validation, and argues for extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations of high-consequence decision-making.
Realistic cell models could greatly accelerate our ability to engineer biochemical pathways and the production of valuable organic products, which would be of great use in the development of biofuels, pharmaceuticals, and crops for the next green revolution. However, this level of engineering will require far more knowledge about the mechanisms of life than is currently available. In particular, we need to understand the interactome (which proteins interact) as it is situated in the three-dimensional geometry of the cell (i.e., a situated interactome), as well as the regulation and dynamics of these interactions. Optical proteomics methods have become available that allow the monitoring, and even the disruption and control, of interacting proteins in living cells. Here, a range of these methods is reviewed with respect to their role in elucidating the interactome and the relevant spatial localizations. Development of these technologies and their integration into the core competencies of research organizations can position whole institutions and teams of researchers to lead in both the fundamental science and the engineering applications of cellular biology. That leadership could be particularly important for problems of national urgency centered on security, biofuels, and healthcare.
Predictive simulation of systems composed of numerous interconnected, tightly coupled components promises to help solve many problems of scientific and national interest. However, predictive simulation of such systems is extremely challenging due to the coupling of a diverse set of physical and biological length and time scales. This report investigates uncertainty quantification methods for such systems that attempt to exploit their structure to gain computational efficiency. The traditional layering of uncertainty quantification around nonlinear solution processes is inverted to allow heterogeneous uncertainty quantification methods to be applied to each component in a coupled system. Moreover, this approach allows stochastic dimension reduction techniques to be applied at each coupling interface. The mathematical feasibility of these ideas is investigated in this report, and mathematical formulations for the resulting stochastically coupled nonlinear systems are developed.
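As a generic sketch of what such a formulation can look like (illustrative, not the report's exact development), consider two components with residuals f_1 and f_2 coupled through interface operators g_1 and g_2:

    f_1\bigl(u_1, g_2(u_2); \xi\bigr) = 0, \qquad f_2\bigl(u_2, g_1(u_1); \xi\bigr) = 0,

where u_1 and u_2 are the component unknowns and \xi collects the random inputs. Expanding each interface quantity in a truncated polynomial chaos basis,

    g_i(u_i)(\xi) \approx \sum_{k=0}^{P_i} \hat{g}_{i,k}\, \Psi_k(\xi),

each component can then be solved with its own uncertainty quantification method while exchanging only the coefficients \hat{g}_{i,k} with its neighbor; choosing the interface order P_i smaller than the order used inside a component is one form of stochastic dimension reduction at the coupling interface.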
This report documents the results of a Phenomena Identification and Ranking Table (PIRT) exercise performed at Sandia National Laboratories (SNL), as well as the experimental and modeling program that has been designed based on the PIRT results. A PIRT exercise is a structured and facilitated expert-elicitation process; in this case, the expert panel comprised nine recognized fire science and aerosol experts. The objective of a PIRT exercise is to identify the phenomena associated with the intended application and then rank the current state of knowledge relative to each identified phenomenon. In this exercise the intended application was sodium fire modeling related to sodium-cooled advanced reactors. The panel was presented with two specific fire scenarios, each based on a hypothetical sodium leak in an Advanced Breeder Test Reactor (ABTR) design. For both scenarios the figure of merit was the ability to predict the thermal and aerosol insult to nearby equipment (e.g., heat exchangers and other electrical equipment). When identifying phenomena of interest, and in particular when ranking phenomena importance and the adequacy of existing modeling tools and data, the panel was asked to weigh these factors subjectively in the context of the specified figure of merit. For each scenario, the panel identified all related phenomena of potential interest to an assessment of the scenario using fire modeling tools to evaluate the figure of merit. Each phenomenon was then ranked for its importance in predicting the figure of merit, and further ranked for the existing state of knowledge: the ability of existing modeling tools to predict the phenomenon, the underlying base of data associated with it, and the potential for developing new data to support improvements to the existing modeling tools. The two hypothetical sodium leak scenarios evaluated for the ABTR design were a leak in the hot side of the intermediate heat transport system (IHTS) resulting in a sodium pool fire, and a leak in the cold side of the IHTS resulting in a sodium spray fire.
We developed prototype chemistry for nucleic acid hybridization on our bead-based diagnostics platform and established an automatable bead-handling protocol capable of 50 part-per-billion (ppb) sensitivity. We are working toward a platform capable of parallel, rapid (10-minute), raw-sample testing for orthogonal (in this case nucleic acid and immunoassay) identification of biological (and other) threats in a single sensor microsystem. In this LDRD we developed the chemistry required for nucleic acid hybridization. Our goal is to place a non-cell-associated RNA virus (bovine viral diarrhea, BVD) on the beads for raw-sample testing. This is a key prerequisite to showing orthogonality, i.e., that nucleic acid measurements can be performed in parallel with immunoassay measurements; orthogonal detection dramatically reduces false positives. We chose BVD because our collaborators at UC Davis can supply samples from persistently infected animals, and because proof-of-concept field testing can be performed with modification of the current technology platform at the UC Davis research station. Since BVD is a disease of cattle, this research dovetails with earlier immunoassay work on botulinum toxin simulant testing in raw milk samples. Demonstration of BVD RNA detection expands the repertoire of biological macromolecules that can be adapted to our bead-based detection. The resources of this late-start LDRD were adequate to partially demonstrate conjugation of the beads to the nucleic acids; the project was never expected to support a full live-virus test, but rather to motivate that additional investment. In addition, we reduced the limit of detection (LOD) for the botulinum toxin simulant from the earlier 1 ppm (1,000 ppb) to 50 ppb, a 20-fold improvement. A low LOD combined with orthogonal detection provides both low false negatives and low false positives. The logical follow-on steps to this LDRD research are live-virus identification and concurrent nucleic acid and immunoassay detection.
2,4,6-Triazidoborazine is an explosive material that contains no carbon or oxygen. There is very little discussion of this material in the open literature, and given the nature of this class of compounds, it is possible that a sophisticated adversary could produce and deploy it. This work was undertaken to understand the material's chemical and explosive properties. This report documents the experimental procedures and results of this LDRD.
This document describes the testing and facility requirements to support the Yucca Mountain Project long-term corrosion testing needs. The purpose of this document is to describe a corrosion testing program that will (a) reduce model uncertainty and variability, (b) reduce the reliance upon overly conservative assumptions, and (c) improve model defensibility. Test matrices were developed for 17 topical areas (tasks); each matrix corresponds to a specific test activity that is a subset of the total work performed in a task. A future document will identify which of these activities are considered performance confirmation activities. Detailed matrices are provided for FY08, FY09, and FY10, and rough-order estimates are provided for FY11-17. Criteria for the selection of appropriate test facilities were developed at a meeting of Lead Lab and DOE personnel on October 16-17, 2007. These criteria were applied to the testing activities, and recommendations were made for the facility types appropriate to carry out each activity. The facility requirements for each activity were assessed, and activities that cannot be performed with currently available facilities were identified. Based on this assessment, a total of approximately 10,000 square feet of facility space is recommended to meet all future testing needs, assuming that all testing is consolidated at a single location. This report is a revision of SAND2007-7027 that addresses DOE comments and adds a series of tests to address NWTRB recommendations.
The West Pearl Queen is a depleted oil reservoir that has produced approximately 250,000 bbl of oil since 1984. Production had slowed prior to CO₂ injection, but no previous secondary or tertiary recovery methods had been applied. The initial project involved reservoir characterization and field response to injection of CO₂; the field experiment consisted of injection, soak, and venting. Over fifty days (December 20, 2002, to February 11, 2003), 2,090 tons of CO₂ were injected into the Shattuck Sandstone Member of the Queen Formation at the West Pearl Queen site. This technical report highlights the test results of the numerous research participants and technical areas from 2006-2008. This work included determination of the lateral extents of the permeability units using outcrop observations, core results, and well logs. Pre- and post-injection 3D seismic data were acquired. To aid in interpreting the seismic data, we performed numerical simulations of the effects of CO₂ replacement of brine, where the reservoir model was based upon correlation lengths established by the permeability studies. These numerical simulations are not intended to replicate field data, but to provide insight into the effects of CO₂.
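A standard way to model how replacing brine with CO₂ alters the seismic response is Gassmann fluid substitution; the relation below is the textbook form and is offered as an illustrative sketch, not necessarily the exact procedure used in the project's simulations:

    K_{\text{sat}} = K_{\text{dry}} + \frac{\bigl(1 - K_{\text{dry}}/K_{\text{min}}\bigr)^{2}}{\phi/K_{\text{fl}} + (1-\phi)/K_{\text{min}} - K_{\text{dry}}/K_{\text{min}}^{2}},

where K_sat, K_dry, K_min, and K_fl are the bulk moduli of the saturated rock, dry frame, mineral grains, and pore fluid, and \phi is the porosity. Because CO₂ has a much lower bulk modulus than brine, substituting it into the pore space lowers K_sat and hence the P-wave velocity, which is the kind of signature sought in the pre- versus post-injection 3D seismic data.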
We present computations of a methane-air edge flame stabilized against an incoming mixing-layer flow, using detailed methane-air chemistry. We analyze the computed edge flame with a focus on the NO structure, examining the spatial distribution of NO and its production/consumption rate. We investigate the breakdown of the NO source term among the thermal, prompt, N₂O, and NO₂ pathways, and examine the contributions of the four pathways at different locations as the edge flame structure changes with downstream distance, tending toward a classical diffusion flame structure. We also examine the dominant reaction-flux contributions in each pathway and compare the results to those in premixed, non-premixed, and opposed-jet triple flames.
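As a rough companion to these computations, the sketch below uses Cantera with GRI-Mech 3.0 (which contains the thermal, prompt, N₂O, and NO₂ pathways) to estimate the net NO production rate in a burning stoichiometric methane-air mixture and to rank reaction-level contributions. This is a crude attribution in a zero-dimensional reactor, not the paper's edge-flame solver or its formal four-pathway decomposition.

    import cantera as ct

    gas = ct.Solution("gri30.yaml")
    gas.TPX = 1800.0, ct.one_atm, "CH4:1, O2:2, N2:7.52"   # stoichiometric CH4-air

    reactor = ct.IdealGasConstPressureReactor(gas)
    sim = ct.ReactorNet([reactor])
    sim.advance(0.05)            # let ignition and early NO formation occur

    i_no = gas.species_index("NO")
    print(f"T = {gas.T:.0f} K, net NO rate = {gas.net_production_rates[i_no]:.3e} kmol/m^3/s")

    # Rank reactions by their instantaneous contribution to NO
    # (Cantera >= 3.0 property syntax; older versions expose these as methods)
    nu = gas.product_stoich_coeffs - gas.reactant_stoich_coeffs
    contrib = nu[i_no, :] * gas.net_rates_of_progress
    for k in sorted(range(len(contrib)), key=lambda k: abs(contrib[k]), reverse=True)[:5]:
        print(f"{gas.reaction(k).equation:40s} {contrib[k]:+.3e}")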