Bayesian methods in CS&E models
Abstract not provided.
Abstract not provided.
This study demonstrates that containment of municipal and hazardous waste in arid and semiarid environments can be accomplished effectively without traditional synthetic materials and complex, multi-layer systems. It shows that closure covers combining layers of natural soil, native plant species, and climatic conditions to form a sustainable, functioning ecosystem will meet the technical equivalency criteria prescribed by the U.S. Environmental Protection Agency. In this study, percolation through a natural analogue and an engineered cover is simulated using the one-dimensional numerical code UNSAT-H, a Richards' equation-based model that simulates soil water infiltration, unsaturated flow, redistribution, evaporation, plant transpiration, and deep percolation. The study incorporates conservative, site-specific soil hydraulic and vegetation parameters. Historical meteorological data are used to simulate percolation through the natural analogue and an engineered cover, with and without vegetation. The results indicate that a 3-foot (ft) cover in arid and semiarid environments is the minimum design thickness necessary to meet the U.S. Environmental Protection Agency-prescribed technical equivalency criteria of 31.5 millimeters/year and 1 × 10⁻⁷ centimeters/second for net annual percolation and average flux, respectively. Increasing cover thickness to 4 or 5 ft results in limited additional improvement in cover performance.
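For reference, UNSAT-H belongs to the family of codes that solve the one-dimensional, head-based Richards' equation with a sink term for plant uptake. A standard statement of that equation (a general textbook form, not copied from the report) is

\[ \frac{\partial \theta(h)}{\partial t} = \frac{\partial}{\partial z}\left[ K(h)\left( \frac{\partial h}{\partial z} + 1 \right) \right] - S(z,t), \]

where θ is volumetric water content, h is matric head, K(h) is the unsaturated hydraulic conductivity, z is elevation (positive upward), and S(z,t) is the root-water-uptake sink; deep percolation corresponds to the downward flux evaluated at the base of the cover.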
Abstract not provided.
Abstract not provided.
Abstract not provided.
Effective, high-performance networked file systems and storage are needed to relieve I/O bottlenecks between large compute platforms. Frequently, parallel techniques such as PFTP are employed to overcome the adverse effect of TCP's congestion avoidance algorithm and achieve reasonable aggregate throughput. These techniques can suffer from end-system bottlenecks due to the protocol processing overhead and memory copies involved in moving large amounts of data during I/O. Moreover, transferring data using PFTP requires manual operation and lacks the transparency needed for interactive visualization and computational steering of large-scale simulations from distributed locations. This paper evaluates the emerging Internet SCSI (iSCSI) protocol [2] as the file/data transport so that remote clients can transparently access data through a distributed global file system available to local clients. We started our work by characterizing the performance behavior of iSCSI in Local Area Networks (LANs). We then studied the effect of propagation delay on throughput using remote iSCSI storage and explored optimization techniques to mitigate the adverse effects of long delay in high-bandwidth Wide Area Networks (WANs). Lastly, we evaluated iSCSI in a Storage Area Network (SAN) for a Global Parallel Filesystem. We conducted our benchmarks based on a typical usage model of large-scale scientific applications at Sandia. Using the experience and knowledge gained from this study, we demonstrated the benefit of high-performance parallel I/O to scientific applications at the IEEE 2004 Supercomputing Conference.
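The throughput penalty from propagation delay that motivates these optimizations can be illustrated with a back-of-the-envelope bandwidth-delay-product calculation (the window size and round-trip times below are hypothetical, not measurements from this study):

# Single-stream TCP can keep at most one window of data in flight per round trip,
# so achievable throughput is bounded by window / RTT. Values are illustrative only.
def tcp_throughput_limit_bps(window_bytes: float, rtt_s: float) -> float:
    """Upper bound on single-stream throughput, in bits per second."""
    return 8.0 * window_bytes / rtt_s

window = 64 * 1024                # a 64 KiB TCP window
for rtt_ms in (0.2, 10.0, 60.0):  # LAN, regional WAN, cross-country WAN
    limit = tcp_throughput_limit_bps(window, rtt_ms / 1000.0)
    print(f"RTT {rtt_ms:5.1f} ms -> at most {limit / 1e6:7.1f} Mbit/s per stream")

Larger windows, multiple parallel streams (as in PFTP), or protocol offload raise this bound, which is why delay mitigation matters for iSCSI over high-bandwidth WANs.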
Proposed for publication in the International Journal of Packaging, Transport, Storage and Security of Radioactive Materials.
Techniques for mitigating the adsorption of ¹³⁷Cs and ⁶⁰Co on metal surfaces (e.g., RAM packages) exposed to contaminated water (e.g., spent-fuel pools) have been developed and experimentally verified. The techniques are also effective in removing some of the ⁶⁰Co and ¹³⁷Cs that may have been adsorbed on the surfaces after removal from the contaminated water. The ¹³⁷Cs mitigation technique is based upon ion-exchange processes. In contrast, ⁶⁰Co contamination primarily resides in minute particles of crud that become lodged on cask surfaces. Crud is an insoluble Fe-Ni-Cr oxide that forms colloidal-sized particles as reactor cooling systems corrode. Because of the similarity between Ni²⁺ and Co²⁺, crud is able to scavenge and retain traces of cobalt as it forms. A number of organic compounds have great specificity for combining with nickel and cobalt. Ongoing research is investigating the effectiveness of the chemical complexing agent EDTA with regard to its ability to dissolve the host phase (crud), thereby liberating the entrained ⁶⁰Co into solution, where it can be rinsed away.
Proposed for publication in Physical Review Letters.
Using a magnetic pressure drive, an absolute measurement of stress and density along the principal compression isentrope is obtained for solid aluminum to 240 GPa. Reduction of the free-surface velocity data relies on a backward integration technique, with approximate accounting for unknown systematic errors in experimental timing. Maximum experimental uncertainties are ±4.7% in stress and ±1.4% in density, small enough to distinguish between different equation-of-state (EOS) models. The result agrees well with a tabular EOS that uses an empirical universal zero-temperature isotherm.
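For context, ramp-wave (isentropic compression) analyses of this kind recover stress and density from a measured wave speed versus particle velocity relation. In the idealized simple-wave limit (shown here only for orientation; it is not necessarily the exact backward-integration scheme used in this work), the Lagrangian wave speed C_L(u) gives

\[ \sigma(u) = \rho_0 \int_0^{u} C_L(u')\, du', \qquad \rho(u) = \rho_0 \left[ 1 - \int_0^{u} \frac{du'}{C_L(u')} \right]^{-1}, \]

where ρ₀ is the initial density and u is the particle velocity along the compression path.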
Abstract not provided.
Spectral imaging, in which a complete spectrum is collected from each of a series of spatial locations (1D lines, 2D images, or 3D volumes), is now available on a wide range of analytical tools, from electron and x-ray to ion beam instruments. With this capability to collect extremely large spectral images comes the need for automated data analysis tools that can rapidly and without bias reduce a large number of raw spectra to a compact, chemically relevant, and easily interpreted representation. It is clear that manual interrogation of individual spectra is impractical even for very small spectral images (< 5000 spectra). More typical spectral images can contain tens of thousands to millions of spectra, which, given the constraint of acquisition time, may contain between 5 and 300 counts per 1000-channel spectrum. Conventional manual approaches to spectral image analysis, such as summing spectra from regions or constructing x-ray maps, are prone to bias and possibly error. One way to comprehensively analyze spectral image data, which has been automated, is to use an unsupervised, self-modeling multivariate statistical analysis method such as multivariate curve resolution (MCR). This approach has proven capable of solving a wide range of analytical problems based upon the counting of x-rays (SEM/STEM-EDX, XRF, PIXE), electrons (EELS, XPS), and ions (TOF-SIMS). As an example of the MCR approach, a STEM x-ray spectral image from a ZrB2-SiC composite was acquired and analyzed. The data were generated in a FEI Tecnai F30-ST TEM/STEM operated at 300 kV, equipped with an EDAX SUTW x-ray detector. The spectral image was acquired with the TIA software on the STEM at 128 by 128 pixels (12 nm/pixel) with a 100 ms dwell per pixel (total acquisition time was 30 minutes) and a probe of approximately the same size as each pixel. Each spectrum in the image had, on average, 500 counts. The calculation took 5 seconds on a PC workstation with dual 2.4 GHz Pentium IV Xeon processors and 2 GB of RAM and resulted in four chemically relevant components, which are shown in Figure 1. The analysis region was at a triple junction of three ZrB2 grains that contained zirconium oxide, aluminum oxide, and a glass phase. The power of unbiased statistical methods, such as MCR as applied here, is that no a priori knowledge of the material's chemistry is required. The algorithms, in this case, effectively reduced over 16,000 2000-channel spectra (64 MB) to four images and four spectral shapes (72 kB), which here represent chemical phases. This three-order-of-magnitude compression is achieved rapidly with no loss of chemical information. There is also the potential to correlate multiple analytical techniques, for example EELS and EDS in the STEM, adding the light-element sensitivity and bonding information of EELS to the more comprehensive spectral coverage of EDS.
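As an illustration of the kind of factorization such methods perform, the sketch below unmixes a synthetic low-count spectral image into component images and spectra, using scikit-learn's non-negative matrix factorization as a stand-in for a full MCR-ALS implementation (the data, component count, channel count, and noise level are hypothetical):

# Minimal sketch: factor a (pixels x channels) count matrix into non-negative
# abundance images and spectral shapes, in the spirit of MCR. Synthetic data only.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_pixels, n_channels, n_components = 64 * 64, 1000, 4

true_spectra = rng.gamma(2.0, 1.0, size=(n_components, n_channels))  # component shapes
true_maps = rng.dirichlet(np.ones(n_components), size=n_pixels)      # abundance per pixel
data = rng.poisson(true_maps @ true_spectra)   # Poisson counting noise, few counts/channel

model = NMF(n_components=n_components, init="nndsvda", max_iter=400)
component_images = model.fit_transform(data)   # (n_pixels, n_components)
component_spectra = model.components_          # (n_components, n_channels)
print(component_images.shape, component_spectra.shape)

The output is exactly the compact representation described above: a handful of component images plus their associated spectral shapes.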
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Reducing agricultural water use in arid regions while maintaining or improving the economic productivity of the agriculture sector is a major challenge. Controlled environment agriculture (CEA, or greenhouse agriculture) affords advantages in direct resource use (less land and water required) and productivity (i.e., much higher product yield and quality per unit of resources used) relative to conventional open-field practices. These advantages come at the price of higher operating complexity and costs per acre. The challenge is to implement and apply CEA such that the productivity and resource-use advantages sufficiently outweigh the higher operating costs to provide overall benefit and viability. This project undertook an investigation of CEA for livestock forage production as a water-saving alternative to open-field forage production in arid regions. Forage production is a large consumer of fresh water in many arid regions of the world, including the southwestern U.S. and northern Mexico. With increasing competition among uses (agriculture, municipalities, industry, recreation, ecosystems, etc.) for limited fresh water supplies, agricultural practice alternatives that can potentially maintain or enhance productivity while reducing water use warrant consideration. The project established a pilot forage production greenhouse facility in southern New Mexico based on a relatively modest and passive (no active heating or cooling) system design pioneered in Chihuahua, Mexico. Experimental operations were initiated in August 2004 and carried over into early FY05 to collect data and make initial assessments of operational and technical system performance, assess forage nutrition content and suitability for livestock, identify areas needing improvement, and make an initial assessment of overall feasibility. The effort was supported through the joint leveraging of late-start FY04 LDRD funds and bundled CY2004 project funding from the New Mexico Small Business Technical Assistance program at Sandia. Despite the lack of optimization of the project system, initial results show the dramatic water-saving potential of hydroponic forage production compared with traditional irrigated open-field practice. This project produced forage using only about 4.5% of the water required for equivalent open-field production. Improved operation could bring water use to 2% or less. The hydroponic forage production system and process used in this project are labor intensive and not optimized for minimum water usage. Freshly harvested hydroponic forage has a high moisture content that dilutes its nutritional value by requiring that livestock consume more of it to get the same nutritional content as conventional forage. In most other respects the nutritional content compares well, on a dry-weight-equivalent basis, with other conventional forage. More work is needed to further explore and quantify the opportunities, limitations, and viability of this technique for broader use. Collection of greenhouse environmental data in this project was uniquely facilitated through the implementation and use of a self-organizing, wirelessly networked, multi-modal sensor array with remote cell phone data link capability. Applications of wirelessly networked sensing with improved modeling/simulation and other Sandia technologies (e.g., advanced sensing and control, embedded reasoning, modeling and simulation, materials, robotics, etc.) can potentially contribute to significant improvement across a broad range of CEA applications.
Abstract not provided.
Proposed for publication in Chemistry Letters.
Abstract not provided.
Video and image data are knowledge-rich sources of information, but their utility for current and future systems is limited without autonomous methods for understanding and characterizing their content. Semantic-based video understanding may benefit systems dedicated to the detection of insiders, alarm patterns, unauthorized activities in material monitoring applications, etc. A direct benefit of this technology is not only intelligent alarm analysis but also the ability to browse and perform query-based searches for useful and interesting information after video data have been acquired and stored. These searches can provide tremendous benefit in intelligence agency, government, military, and DOE site investigations. This report provides an initial investigation into the algorithms and methods needed to characterize and understand video content. Such algorithms include background modeling, detecting dynamic image regions, grouping dynamic pixels into coherent objects, and robust tracking strategies. With solid approaches to these problems, analysis can then seek to recognize distinctive objects and their motions, leading to semantic-based video searches.
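As a concrete illustration of the first of these building blocks, the sketch below implements a simple running-average background model with per-pixel thresholding to flag dynamic image regions (the frame data, learning rate, and threshold are hypothetical and not taken from this report):

# Minimal sketch: exponential-moving-average background model plus per-pixel
# change detection. Synthetic grayscale frames; parameters are illustrative.
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Blend the new frame into the background model (exponential moving average)."""
    return (1.0 - alpha) * background + alpha * frame

def detect_dynamic_pixels(background, frame, threshold=25.0):
    """Mark pixels that deviate strongly from the current background model."""
    return np.abs(frame - background) > threshold

rng = np.random.default_rng(1)
background = rng.uniform(0.0, 255.0, size=(120, 160))   # synthetic static scene
frame = background.copy()
frame[40:60, 70:90] += 80.0                             # hypothetical moving object
mask = detect_dynamic_pixels(background, frame)
background = update_background(background, frame)
print(int(mask.sum()), "dynamic pixels flagged")

Grouping the flagged pixels into connected components and tracking them across frames would correspond to the object-grouping and tracking stages mentioned above.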
Abstract not provided.
This SAND report provides the technical progress through October 2004 of the Sandia-led project, "Carbon Sequestration in Synechococcus Sp.: From Molecular Machines to Hierarchical Modeling," funded by the DOE Office of Science Genomes to Life Program. Understanding, predicting, and perhaps manipulating carbon fixation in the oceans has long been a major focus of biological oceanography and has more recently been of interest to a broader audience of scientists and policy makers. It is clear that the oceanic sinks and sources of CO2 are important terms in the global environmental response to anthropogenic atmospheric inputs of CO2 and that oceanic microorganisms play a key role in this response. However, the relationship between this global phenomenon and the biochemical mechanisms of carbon fixation in these microorganisms is poorly understood. In this project, we will investigate the carbon sequestration behavior of Synechococcus sp., an abundant marine cyanobacterium known to be important to environmental responses to carbon dioxide levels, through experimental and computational methods. This project is a combined experimental and computational effort with emphasis on developing and applying new computational tools and methods. Our experimental effort will provide the biology and data to drive the computational efforts and includes significant investment in developing new experimental methods for uncovering protein partners, characterizing protein complexes, and identifying new binding domains. We will also develop and apply new data measurement and statistical methods for analyzing microarray experiments. Computational tools will be essential to our efforts to discover and characterize the function of the molecular machines of Synechococcus. To this end, molecular simulation methods will be coupled with knowledge discovery from diverse biological data sets for high-throughput discovery and characterization of protein-protein complexes. In addition, we will develop a set of novel capabilities for inference of regulatory pathways in microbial genomes across multiple sources of information through the integration of computational and experimental technologies. These capabilities will be applied to Synechococcus regulatory pathways to characterize their interaction map and identify the component proteins in these pathways. We will also investigate methods for combining experimental and computational results with visualization and natural language tools to accelerate discovery of regulatory pathways. The ultimate goal of this effort is to develop and apply the new experimental and computational methods needed to generate a new level of understanding of how the Synechococcus genome affects carbon fixation at the global scale. Anticipated experimental and computational methods will provide ever-increasing insight into the individual elements and steps in the carbon fixation process; however, relating an organism's genome to its cellular response in the presence of varying environments will require systems biology approaches. Thus a primary goal for this effort is to integrate the genomic data generated from experiments and lower-level simulations with data from the existing body of literature into a whole-cell model. We plan to accomplish this by developing and applying a set of tools for capturing the carbon fixation behavior of Synechococcus at different levels of resolution.
Finally, the explosion of data being produced by high-throughput experiments requires data analysis and models that are more computationally complex and more heterogeneous and that must be coupled to ever-increasing amounts of experimentally obtained data in varying formats. These challenges are unprecedented in high-performance scientific computing and necessitate the development of a companion computational infrastructure to support this effort. More information about this project, including a copy of the original proposal, can be found at www.genomes-to-life.org.

Acknowledgment: We gratefully acknowledge the contributions of the GTL Project Team: Grant S. Heffelfinger (1)*, Anthony Martino (2), Andrey Gorin (3), Ying Xu (10, 3), Mark D. Rintoul (1), Al Geist (3), Matthew Ennis (1), Hashimi Al-Hashimi (8), Nikita Arnold (3), Andrei Borziak (3), Bianca Brahamsha (6), Andrea Belgrano (12), Praveen Chandramohan (3), Xin Chen (9), Pan Chongle (3), Paul Crozier (1), PguongAn Dam (10), George S. Davidson (1), Robert Day (3), Jean Loup Faulon (2), Damian Gessler (12), Arlene Gonzalez (2), David Haaland (1), William Hart (1), Victor Havin (3), Tao Jiang (9), Howland Jones (1), David Jung (3), Ramya Krishnamurthy (3), Yooli Light (2), Shawn Martin (1), Rajesh Munavalli (3), Vijaya Natarajan (3), Victor Olman (10), Frank Olken (4), Brian Palenik (6), Byung Park (3), Steven Plimpton (1), Diana Roe (2), Nagiza Samatova (3), Arie Shoshani (4), Michael Sinclair (1), Alex Slepoy (1), Shawn Stevens (8), Chris Stork (1), Charlie Strauss (5), Zhengchang Su (10), Edward Thomas (1), Jerilyn A. Timlin (1), Xiufeng Wan (11), HongWei Wu (10), Dong Xu (11), Gong-Xin Yu (3), Grover Yip (8), Zhaoduo Zhang (2), Erik Zuiderweg (8). *Author to whom correspondence should be addressed (gsheffe@sandia.gov). Affiliations: 1. Sandia National Laboratories, Albuquerque, NM; 2. Sandia National Laboratories, Livermore, CA; 3. Oak Ridge National Laboratory, Oak Ridge, TN; 4. Lawrence Berkeley National Laboratory, Berkeley, CA; 5. Los Alamos National Laboratory, Los Alamos, NM; 6. University of California, San Diego; 7. University of Illinois, Urbana-Champaign; 8. University of Michigan, Ann Arbor; 9. University of California, Riverside; 10. University of Georgia, Athens; 11. University of Missouri, Columbia; 12. National Center for Genome Resources, Santa Fe, NM. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Abstract not provided.
Local topological modification is widely used to improve mesh quality after automatic generation of tetrahedral and quadrilateral meshes. These same techniques are also used to support adaptive refinement of these meshes. In contrast, few methods are known for locally modifying the topology of hexahedral meshes. Most efforts to do this have been based on fixed transition templates or global refinement. One exception is a dual-based 'pillowing' method, which, while local, is still quite restricted in its application and is typically applied in a template-based fashion. In this presentation, I will describe the generalization of a dual-based approach to the local topological modification of hex meshes and its application to cleaning up hexahedral meshes. A set of three operations for locally modifying hex mesh topology has been shown to reproduce the so-called 'flipping' operations described by Bern et al. as well as other commonly used refinement templates. I will describe the implementation of these operators and their application to real meshes. Challenging aspects of this work have included visualization of a hex mesh and its dual (especially for poor-quality meshes); the incremental modification of both the primal (i.e., the mesh) and the dual simultaneously; and the interactive steering of these operations with the goal of improving hex meshes that would otherwise have unacceptable quality. These aspects will be discussed in the context of improving hex meshes generated by curve contraction-based whisker weaving. Application of these techniques to improving other hexahedral mesh types, for example those resulting from tetrahedral subdivision, will also be discussed.
Abstract not provided.
Abstract not provided.
Abstract not provided.