Publications

Results 9176–9200 of 9,998

Substructured multibody molecular dynamics

Crozier, Paul; Grest, Gary S.; Ismail, Ahmed E.; Lehoucq, Rich; Plimpton, Steven J.; Stevens, Mark J.

We have enhanced our parallel molecular dynamics (MD) simulation software LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator, lammps.sandia.gov) with many new features for accelerated simulation, including articulated rigid-body dynamics via coupling to the Rensselaer Polytechnic Institute code POEMS (Parallelizable Open-source Efficient Multibody Software). We use these new features of the LAMMPS software package to investigate rhodopsin photoisomerization, as well as the surface tension and capillary waves of water models at the vapor-liquid interface. Finally, we motivate the recipes of MD for practitioners and researchers in numerical analysis and computational mechanics.
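As a concrete illustration of the POEMS coupling, a LAMMPS input script can declare articulated rigid bodies with the `fix poems` command. The fragment below is a hypothetical sketch (file names and fix IDs invented); consult the LAMMPS documentation for the exact keyword options.

```
# Hypothetical LAMMPS input fragment: treat each molecule as a set of
# coupled rigid bodies integrated through the POEMS library.
units           real
atom_style      full
read_data       system.data        # invented data file name

fix             articulated all poems molecule   # one body chain per molecule

timestep        1.0
run             10000
```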

Analysis of real-time reservoir monitoring: reservoirs, strategies, & modeling

Cooper, Scott P.; Elbring, Gregory J.; Jakaboski, Blake E.; Lorenz, John C.; Mani, Seethambal; Normann, Randy A.; Rightley, Michael J.; Van Bloemen Waanders, Bart; Weiss, Chester J.

The project objective was to detail better ways to assess and exploit intelligent oil and gas field information through improved modeling, sensor technology, and process control to increase ultimate recovery of domestic hydrocarbons. To meet this objective we investigated the use of permanent downhole sensor systems (Smart Wells) whose data are fed in real time into computational reservoir models that are integrated with optimized production control systems. The project utilized a three-pronged approach: (1) a value-of-information analysis to address the economic advantages, (2) reservoir simulation modeling and control optimization to prove the capability, and (3) evaluation of new-generation sensor packaging to survive the borehole environment for long periods of time. The Value of Information (VOI) decision-tree method was developed and used to assess the economic advantage of using the proposed technology; the VOI demonstrated the value of the increased subsurface resolution provided by additional sensor data. Our findings show that VOI studies are a practical means of ascertaining the value associated with a technology, in this case the application of sensors to production. The procedure acknowledges the uncertainty in predictions but nevertheless assigns monetary value to them; its best aspect is that it builds consensus within interdisciplinary teams. The reservoir simulation and modeling aspect of the project was developed to show the capability of exploiting sensor information both for reservoir characterization and to optimize control of the production system. Our findings indicate that history matching improves as more information is added to the objective function, clearly indicating that sensor information can help reduce the uncertainty associated with reservoir characterization. Additional findings and approaches are described in detail within the report.
The next-generation sensors aspect of the project evaluated sensor and packaging survivability issues. Our findings indicate that packaging represents the most significant technical challenge associated with applying sensors in the downhole environment for long periods of time (5+ years). These issues are described in detail within the report. The impact of successful reservoir monitoring programs, and the coincident improved reservoir management, is measured by the production of additional oil and gas from existing reservoirs, revitalization of nearly depleted reservoirs, possible re-establishment of already abandoned reservoirs, and improved economics in all cases. Smart Well monitoring provides the means to understand how a reservoir process is developing and to provide active reservoir management; at the same time, it provides data for developing high-fidelity simulation models. This work has been a joint effort between Sandia National Laboratories and UT-Austin's Bureau of Economic Geology, Department of Petroleum and Geosystems Engineering, and Institute of Computational and Engineering Mathematics.
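The value-of-information idea can be made concrete with a toy decision tree. The sketch below (all probabilities and payoffs invented, not taken from the report) computes VOI as the expected value achievable with perfect sensor information minus the best expected value of a decision made without it:

```python
# Toy VOI decision tree: invented prior and payoffs, for illustration only.

def expected_value(decision, probs, payoffs):
    """Expected payoff of committing to one decision under the prior."""
    return sum(p * payoffs[(decision, s)] for s, p in probs.items())

probs = {"high_perm": 0.4, "low_perm": 0.6}          # prior on reservoir state
payoffs = {("drill", "high_perm"): 10.0, ("drill", "low_perm"): -4.0,
           ("wait",  "high_perm"):  0.0, ("wait",  "low_perm"):  0.0}

# Without sensors: pick the single best decision under the prior.
ev_without = max(expected_value(d, probs, payoffs) for d in ("drill", "wait"))

# With perfect sensor information: pick the best action in each state.
ev_with = sum(p * max(payoffs[(d, s)] for d in ("drill", "wait"))
              for s, p in probs.items())

voi = ev_with - ev_without   # upper bound on what the sensor data are worth
```

With these invented numbers the sensor data are worth at most 2.4 payoff units; a real study would replace perfect information with the partial resolution the sensors actually provide.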

Beyond the local density approximation: improving density functional theory for high energy density physics applications

Modine, Normand A.; Wright, Alan F.; Muller, Richard P.; Sears, Mark P.; Wills, Ann E.; Desjarlais, Michael P.

A finite temperature version of 'exact-exchange' density functional theory (EXX) has been implemented in Sandia's Socorro code. The method uses the optimized effective potential (OEP) formalism and an efficient gradient-based iterative minimization of the energy. The derivation of the gradient is based on the density matrix, simplifying the extension to finite temperatures. A stand-alone all-electron exact-exchange capability has been developed for testing exact exchange and compatible correlation functionals on small systems. Calculations of eigenvalues for the helium atom, beryllium atom, and the hydrogen molecule are reported, showing excellent agreement with highly converged quantum Monte Carlo calculations. Several approaches to the generation of pseudopotentials for use in EXX calculations have been examined and are discussed. The difficult problem of finding a correlation functional compatible with EXX has been studied and some initial findings are reported.
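For orientation, the exact-exchange functional is the Fock exchange expression evaluated with Kohn-Sham orbitals; at finite temperature the fractional occupations $f_i$ enter as weights (a standard textbook form, not a formula quoted from this report):

```latex
E_x^{\mathrm{EXX}} = -\frac{1}{2}\sum_{i,j} f_i f_j
  \iint \frac{\phi_i^*(\mathbf{r})\,\phi_j(\mathbf{r})\,
              \phi_j^*(\mathbf{r}')\,\phi_i(\mathbf{r}')}
             {\lvert \mathbf{r}-\mathbf{r}' \rvert}
  \, d\mathbf{r} \, d\mathbf{r}'
```

The OEP formalism then seeks the local multiplicative potential whose orbitals minimize this total energy, in contrast to the nonlocal Fock operator of Hartree-Fock theory.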

Penetrator reliability investigation and design exploration: from conventional design processes to innovative uncertainty-capturing algorithms

Swiler, Laura P.; Hough, Patricia D.; Gray, Genetha A.; Chiesa, Michael L.; Heaphy, Robert T.; Thomas, Stephen W.; Trucano, Timothy G.

This project focused on research and algorithmic development in optimization under uncertainty (OUU) problems driven by earth penetrator (EP) designs. While taking into account uncertainty, we addressed three challenges in current simulation-based engineering design and analysis processes. The first challenge required leveraging small local samples, already constructed by optimization algorithms, to build effective surrogate models. We used Gaussian Process (GP) models to construct these surrogates. We developed two OUU algorithms using 'local' GPs (OUU-LGP) and one OUU algorithm using 'global' GPs (OUU-GGP) that appear competitive with or better than current methods. The second challenge was to develop a methodical design process based on multi-resolution, multi-fidelity models. We developed a Multi-Fidelity Bayesian Auto-regressive process (MF-BAP). The third challenge involved the development of tools that are computationally feasible and accessible. We created MATLAB® and initial DAKOTA implementations of our algorithms.
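To make the surrogate idea concrete, the sketch below fits a tiny Gaussian Process to a handful of local samples, in the spirit of the OUU-LGP approach; the kernel choice, data, and function names are invented for illustration and are not the project's actual implementation.

```python
import numpy as np

def rbf(x1, x2, ell=1.0):
    """Squared-exponential kernel matrix between two sets of 1-D points."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP at the test points."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k_star = rbf(x_test, x_train)
    alpha = np.linalg.solve(K, y_train)
    mean = k_star @ alpha                       # predictive mean
    var = rbf(x_test, x_test).diagonal() - np.einsum(
        "ij,ji->i", k_star, np.linalg.solve(K, k_star.T))
    return mean, var                            # predictive mean and variance

# Four local samples of an expensive response (here a stand-in function).
x = np.array([0.0, 0.5, 1.0, 1.5])
y = np.sin(x)
mu, var = gp_predict(x, y, np.array([0.75]))
```

The predictive variance is the quantity an OUU algorithm can exploit to judge where the surrogate is trustworthy and where new samples are needed.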

DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 reference manual

Brown, Shannon L.; Griffin, Joshua D.; Hough, Patricia D.; Kolda, Tamara G.; Martinez-Canales, Monica L.; Williams, Pamela J.; Adams, Brian M.; Dunlavy, Daniel M.; Gay, David M.; Swiler, Laura P.; Giunta, Anthony A.; Hart, William E.; Watson, Jean-Paul; Eddy, John P.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.
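A minimal input file gives the flavor of the commands specification the manual documents. The sketch below uses keywords in the style of the version 4.0 era, but it is illustrative only (the analysis driver name is invented); the reference manual itself is the authority on exact syntax and options.

```
# Illustrative DAKOTA input sketch; see the reference manual for exact keywords.
strategy,
	single_method

method,
	conmin_frcg
	  max_iterations = 100
	  convergence_tolerance = 1.0e-4

variables,
	continuous_design = 2
	  cdv_initial_point   1.0  1.0
	  cdv_descriptor      'x1' 'x2'

interface,
	fork
	  analysis_driver = 'my_simulator'   # invented driver script name

responses,
	num_objective_functions = 1
	numerical_gradients
	no_hessians
```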

DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 developers manual

Brown, Shannon L.; Griffin, Joshua D.; Hough, Patricia D.; Kolda, Tamara G.; Martinez-Canales, Monica L.; Williams, Pamela J.; Adams, Brian M.; Dunlavy, Daniel M.; Gay, David M.; Swiler, Laura P.; Giunta, Anthony A.; Hart, William E.; Watson, Jean-Paul; Eddy, John P.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

On the design of reversible QDCA systems

Frost-Murphy, Sarah E.; Debenedictis, Erik

This work is the first to describe how to design a reversible QDCA system. The design space is substantial, and a designer must answer many questions before beginning. This document begins to explicate the tradeoffs and assumptions that need to be made and offers a range of approaches as starting points and examples. This design guide is an effective tool for aiding designers in creating the best-quality QDCA implementation for a system.
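One premise behind reversible QDCA is worth pinning down: a reversible gate is a bijection on its input space, so it loses no information. The toy sketch below (not from the report) checks this for the Fredkin controlled-swap gate, a common reversible-logic primitive:

```python
# Illustrative only: reversible logic requires each gate to be a bijection.

def fredkin(c, a, b):
    """Controlled swap: exchanges a and b when control c is 1. Self-inverse."""
    return (c, b, a) if c else (c, a, b)

# Enumerate all 8 input triples; a bijection yields 8 distinct outputs.
inputs = [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]
outputs = {fredkin(*x) for x in inputs}
assert len(outputs) == len(inputs)  # no two inputs collapse to one output
```

Because the gate is self-inverse, applying it twice recovers the input, which is the property a reversible circuit composes gate by gate.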

Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity

Adams, Brian M.; Wittwer, Jonathan W.; Bichon, Barron J.; Carnes, Brian R.; Copps, Kevin D.; Eldred, Michael; Hopkins, Matthew M.; Neckels, David; Notz, Patrick K.; Subia, Samuel R.

This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.
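The error-estimation-and-adaptivity loop has a simple skeleton: estimate a per-cell error indicator, refine where it exceeds a tolerance, and repeat. The 1-D sketch below is a minimal illustration with an invented indicator and tolerance, not the milestone's actual finite-element machinery:

```python
# Minimal adaptive-refinement sketch: bisect 1-D cells whose estimated
# error exceeds a tolerance, then re-estimate and repeat.

def refine(mesh, indicator, tol):
    """Split each interval (a, b) whose error indicator exceeds tol."""
    new_mesh = []
    for (a, b) in mesh:
        if indicator(a, b) > tol:
            m = 0.5 * (a + b)
            new_mesh.extend([(a, m), (m, b)])  # bisect the flagged cell
        else:
            new_mesh.append((a, b))            # keep the cell as-is
    return new_mesh

# Invented indicator: error scales with the square of the cell width.
mesh = [(0.0, 1.0)]
ind = lambda a, b: (b - a) ** 2
for _ in range(3):                             # iterate until tolerance is met
    mesh = refine(mesh, ind, tol=0.1)
```

After the loop, every cell satisfies the tolerance; in a real study the indicator comes from an a posteriori error estimate of the discretized PDE.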

DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 user's manual

Swiler, Laura P.; Giunta, Anthony A.; Hart, William E.; Watson, Jean-Paul; Eddy, John P.; Griffin, Joshua D.; Hough, Patricia D.; Kolda, Tamara G.; Martinez-Canales, Monica L.; Williams, Pamela J.; Eldred, Michael; Brown, Shannon L.; Adams, Brian M.; Dunlavy, Daniel M.; Gay, David M.

The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
