Publications

Results 8501–8525 of 9,998

Search results

Current trends in parallel computation and the implications for modeling and optimization

Computer Aided Chemical Engineering

Siirola, John D.

Finite element solution of optimal control problems arising in semiconductor modeling

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Bochev, Pavel B.; Ridzal, Denis R.

Optimal design, parameter estimation, and inverse problems arising in the modeling of semiconductor devices lead to optimization problems constrained by systems of PDEs. We study the impact of different state equation discretizations on optimization problems whose objective functionals involve flux terms. Galerkin methods, in which the flux is a derived quantity, are compared with mixed Galerkin discretizations where the flux is approximated directly. Our results show that the latter approach leads to more robust and accurate solutions of the optimization problem, especially for highly heterogeneous materials with large jumps in material properties. © 2008 Springer.
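
As a rough illustration of the problem class described above (the notation is illustrative, not the authors' exact formulation), a flux-matching objective constrained by a scalar elliptic state equation, and its mixed restatement in which the flux is an independent unknown, can be written as:

```latex
% Schematic flux-matching problem (illustrative notation, not the paper's exact setup).
% d is assumed target flux data on a boundary segment Gamma; a(x) is the material coefficient.
\[
  \min_{u,\,a}\; J = \frac{1}{2}\int_{\Gamma} \bigl(a\nabla u \cdot n - d\bigr)^{2}\, ds
  \quad\text{subject to}\quad -\nabla\cdot\bigl(a\nabla u\bigr) = f \ \text{in } \Omega .
\]
% Mixed form: the flux sigma is approximated directly rather than derived from u.
\[
  \sigma = a\nabla u, \qquad -\nabla\cdot\sigma = f, \qquad
  J = \frac{1}{2}\int_{\Gamma} \bigl(\sigma\cdot n - d\bigr)^{2}\, ds .
\]
```

In a standard Galerkin discretization the boundary flux a∇u·n must be obtained by differentiating the discrete u, whereas a mixed discretization carries σ as a primary unknown; that distinction is what the robustness comparison in the abstract hinges on.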

New applications of the verdict library for standardized mesh verification pre, post, and end-to-end processing

Proceedings of the 16th International Meshing Roundtable, IMR 2007

Pébay, Philippe P.; Thompson, David; Shepherd, Jason F.; Knupp, Patrick K.; Lisle, Curtis; Magnotta, Vincent A.; Grosland, Nicole M.

Verdict is a collection of subroutines for evaluating the geometric qualities of triangles, quadrilaterals, tetrahedra, and hexahedra using a variety of functions. A quality is a real number assigned to one of these shapes depending on its particular vertex coordinates. These functions are used to evaluate the input to finite element, finite volume, boundary element, and other types of solvers that approximate the solution to partial differential equations defined over regions of space. This article describes the most recent version of Verdict and provides a summary of the main properties of the quality functions offered by the library. Finally, it demonstrates the versatility and applicability of Verdict by illustrating its use in several scientific applications that pertain to pre, post, and end-to-end processing.
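
As a concrete illustration of what such a quality function looks like (this sketch is not Verdict's API; the function name and normalization are illustrative), the snippet below evaluates one classical triangle quality, the normalized radius ratio, which equals 1 for an equilateral triangle and tends to 0 as the triangle degenerates:

```python
# Illustrative triangle quality function (not the Verdict API): normalized radius ratio.
import math

def triangle_radius_ratio(p0, p1, p2):
    a = math.dist(p1, p2)          # side lengths
    b = math.dist(p0, p2)
    c = math.dist(p0, p1)
    s = 0.5 * (a + b + c)          # semi-perimeter
    area_sq = s * (s - a) * (s - b) * (s - c)
    if area_sq <= 0.0:
        return 0.0                 # degenerate (collinear) triangle
    area = math.sqrt(area_sq)
    r = area / s                   # inradius
    R = a * b * c / (4.0 * area)   # circumradius
    return 2.0 * r / R             # normalized so the ideal (equilateral) value is 1

print(triangle_radius_ratio((0, 0), (1, 0), (0.5, math.sqrt(3) / 2)))  # ~1.0
```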

pCAMAL: An embarrassingly parallel hexahedral mesh generator

Proceedings of the 16th International Meshing Roundtable, IMR 2007

Pébay, Philippe P.; Stephenson, Michael B.; Fortier, Leslie A.; Owen, Steven J.; Melander, Darryl J.

This paper describes a distributed-memory, embarrassingly parallel hexahedral mesh generator, pCAMAL (parallel CUBIT Adaptive Mesh Algorithm Library). pCAMAL utilizes the sweeping method following a serial step of geometry decomposition conducted in the CUBIT geometry preparation and mesh generation tool. The utility of pCAMAL in generating large meshes is illustrated, and linear speed-up under load-balanced conditions is demonstrated.
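
A minimal sketch of the embarrassingly parallel pattern described above, assuming a hypothetical sweep kernel (this is not pCAMAL's actual interface): after the serial CUBIT decomposition, each sweepable sub-volume is meshed independently, so no inter-process communication is needed.

```python
# Sketch of the embarrassingly parallel sweep stage (hypothetical names; not pCAMAL's API).
from multiprocessing import Pool

def sweep_one_volume(volume):
    # Placeholder for the serial sweeping kernel: extrude the source surface mesh
    # of one decomposed sub-volume into a stack of hexahedral layers.
    return {"volume": volume, "hexes": []}

def mesh_in_parallel(sweepable_volumes, num_procs):
    # Each sub-volume is meshed independently with no communication, so speed-up
    # is expected to be linear as long as the sub-volumes are load balanced.
    with Pool(processes=num_procs) as pool:
        return pool.map(sweep_one_volume, sweepable_volumes)
```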

Limited-memory techniques for sensor placement in water distribution networks

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Hart, William E.; Berry, Jonathan W.; Boman, Erik G.; Phillips, Cynthia A.; Riesen, Lee A.; Watson, Jean-Paul W.

The practical utility of optimization technologies is often impacted by factors that reflect how these tools are used in practice, including whether various real-world constraints can be adequately modeled, the sophistication of the analysts applying the optimizer, and related environmental factors (e.g., whether a company is willing to trust predictions from computational models). Other features are less appreciated but equally important in dictating the successful use of optimization. These include the scale of problem instances, which in practice drives the development of approximate solution techniques, and constraints imposed by the target computing platforms. End-users often lack state-of-the-art computers, and thus runtime and memory limitations are often significant limiting factors in algorithm design. When coupled with large problem scale, the result is a significant technological challenge. We describe our experience developing and deploying both exact and heuristic algorithms for placing sensors in water distribution networks to mitigate damage due to intentional or accidental introduction of contaminants. The target computing platforms for this application have motivated limited-memory techniques that can optimize large-scale sensor placement problems. © 2008 Springer Berlin Heidelberg.
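
For context, the sensor placement model in this line of work (see also the Lagrangian relaxation and TEVA-SPOT entries below) is typically posed as a p-median problem. A generic statement, with illustrative notation (event weights α_a and impact coefficients d_ai are not the authors' symbols), is:

```latex
% Generic p-median statement of sensor placement (illustrative notation).
% A = contamination events, I = candidate sensor locations, d_{ai} = impact of event a
% when first detected by a sensor at location i, alpha_a = event weight, p = sensor budget.
\[
  \min_{x,\,s}\ \sum_{a\in A}\sum_{i\in I} \alpha_a\, d_{ai}\, x_{ai}
  \quad\text{s.t.}\quad
  \sum_{i\in I} x_{ai} = 1 \ \ \forall a\in A,\qquad
  x_{ai} \le s_i,\qquad
  \sum_{i\in I} s_i \le p,\qquad
  x_{ai},\, s_i \in \{0,1\}.
\]
```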

Implementing peridynamics within a molecular dynamics code

Computer Physics Communications

Parks, Michael L.; Lehoucq, Richard B.; Plimpton, Steven J.; Silling, Stewart A.

Peridynamics (PD) is a continuum theory that employs a nonlocal model to describe material properties. In this context, nonlocal means that continuum points separated by a finite distance may exert force upon each other. A meshless method results when PD is discretized with material behavior approximated as a collection of interacting particles. This paper describes how PD can be implemented within a molecular dynamics (MD) framework, and provides details of an efficient implementation. This adds a computational mechanics capability to an MD code, enabling simulations at mesoscopic or even macroscopic length and time scales. © 2008 Elsevier B.V.
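
The particle discretization referred to above can be written compactly; this is the standard meshless form of the peridynamic equation of motion (notation here is generic), in which particle i interacts with every particle j inside a horizon δ, structurally the same as a short-range MD force loop with neighbor lists:

```latex
% Discretized peridynamic equation of motion (standard meshless form, generic notation).
% f = pairwise force function, V_j = volume associated with particle j,
% b_i = body force density, delta = horizon.
\[
  \rho\,\ddot{u}_i \;=\; \sum_{j\,:\,\|x_j - x_i\|\le\delta}
    f\!\bigl(u_j - u_i,\; x_j - x_i\bigr)\, V_j \;+\; b_i .
\]
```

Because the sum ranges over neighbors within a fixed cutoff, the same neighbor-list and force-loop machinery an MD code already provides can evaluate it, which is what makes an MD framework a natural host.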

Remarks on mesh quality

46th AIAA Aerospace Sciences Meeting and Exhibit

Knupp, Patrick K.

Various aspects of mesh quality are surveyed to clarify the disconnect between the traditional uses of mesh quality metrics within industry and the fact that quality ultimately depends on the solution to the physical problem. Truncation error analysis for finite difference methods reveals no clear connection to most traditional mesh quality metrics. Finite element bounds on the interpolation error can be shown, in some cases, to be related to known quality metrics such as the condition number. On the other hand, the use of quality metrics that do not take solution characteristics into account can be valid in certain circumstances, primarily as a means of automatically detecting defective meshes. The use of such metrics when applied to simulations for which quality is highly dependent on the physical solution is clearly inappropriate. Various flaws and problems with existing quality metrics are mentioned, along with a discussion on the use of threshold values. In closing, the author advocates the investigation of explicitly-referenced quality metrics as a potential means of bridging the gap between a priori quality metrics and solution-dependent metrics.
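
As one concrete example of the algebraic metrics under discussion, the condition-number shape metric evaluates the Jacobian A of the map from an ideally shaped reference element (a standard definition, stated here in the Frobenius norm purely for illustration):

```latex
% Condition-number shape metric (standard definition, Frobenius norm).
% A = Jacobian of the map from an ideal reference element, d = spatial dimension.
\[
  \kappa(A) \;=\; \|A\|_F\, \|A^{-1}\|_F, \qquad
  q \;=\; \frac{d}{\kappa(A)} \;\in\; (0, 1],
\]
```

with q = 1 for an ideally shaped element and q → 0 as the element degenerates; it is an a priori metric of exactly the kind the abstract argues is valid mainly for detecting defective meshes.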

Low-memory Lagrangian relaxation methods for sensor placement in municipal water networks

World Environmental and Water Resources Congress 2008: Ahupua'a - Proceedings of the World Environmental and Water Resources Congress 2008

Berry, Jonathan W.; Boman, Erik G.; Phillips, Cynthia A.; Riesen, Lee A.

Placing sensors in municipal water networks to protect against a set of contamination events is a classic p-median problem for most objectives when we assume that sensors are perfect. Many researchers have proposed exact and approximate solution methods for this p-median formulation. For full-scale networks with large contamination event suites, one must generally rely on heuristic methods to generate solutions. These heuristics provide feasible solutions, but give no quality guarantee relative to the optimal placement. In this paper we apply a Lagrangian relaxation method in order to compute lower bounds on the expected impact of suites of contamination events. In all of our experiments with single objectives, these lower bounds establish that the GRASP local search method generates solutions that are provably optimal to within a fraction of a percentage point. Our Lagrangian heuristic also provides good solutions itself and requires only a fraction of the memory of GRASP. We conclude by describing two variations of the Lagrangian heuristic: an aggregated version that trades off solution quality for further memory savings, and a multi-objective version which balances objectives with additional goals. © 2008 ASCE.
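
A minimal sketch of how such a Lagrangian lower bound can be computed (an illustrative subgradient loop using the generic p-median notation sketched earlier in this list, not the authors' implementation): relaxing the assignment constraints with multipliers λ_a makes the inner minimization separable over candidate locations, so each bound evaluation touches only the impact matrix and one vector of multipliers.

```python
# Illustrative subgradient method for a Lagrangian lower bound on the p-median
# sensor placement problem (not the paper's implementation).
import numpy as np

def lagrangian_lower_bound(d, p, iters=200, step0=1.0):
    """d[a, i] = impact of event a if first detected at candidate location i
    (event weights assumed folded into d); p = number of sensors allowed."""
    lam = d.min(axis=1)                        # multipliers for sum_i x_ai = 1
    best = -np.inf
    for k in range(iters):
        reduced = d - lam[:, None]             # reduced costs d_ai - lambda_a
        gains = np.minimum(reduced, 0.0).sum(axis=0)   # value of opening location i
        opened = np.argsort(gains)[:p]         # open the p most beneficial locations
        bound = lam.sum() + gains[opened].sum()
        best = max(best, bound)                # every L(lambda) is a valid lower bound
        assigned = (reduced[:, opened] < 0.0).sum(axis=1)
        g = 1.0 - assigned                     # subgradient of L at lambda
        if not g.any():
            break                              # relaxed solution already satisfies the assignments
        lam = lam + (step0 / (1.0 + k)) * g    # diminishing step size
    return best
```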

Preparing for the aftermath: Using emotional agents in game-based training for disaster response

2008 IEEE Symposium on Computational Intelligence and Games, CIG 2008

Djordjevich Reyna, Donna D.; Xavier, Patrick G.; Bernard, Michael L.; Whetzel, Jonathan H.; Glickman, Matthew R.; Verzi, Stephen J.

Ground Truth, a training game developed by Sandia National Laboratories in partnership with the University of Southern California GamePipe Lab, puts a player in the role of an Incident Commander working with teammate agents to respond to urban threats. These agents simulate certain emotions that a responder may feel during this high-stress situation. We construct psychologically plausible models compliant with the Sandia Human Embodiment and Representation Cognitive Architecture (SHERCA) that are run on the Sandia Cognitive Runtime Engine with Active Memory (SCREAM) software. SCREAM's computational representations for modeling human decision-making combine aspects of artificial neural networks and fuzzy logic networks. This paper gives an overview of Ground Truth and discusses the adaptation of SHERCA and SCREAM into the game. We include a semiformal description of SCREAM. © 2008 IEEE.

The TEVA-SPOT toolkit for drinking water contaminant warning system design

World Environmental and Water Resources Congress 2008: Ahupua'a - Proceedings of the World Environmental and Water Resources Congress 2008

Hart, William E.; Berry, Jonathan W.; Boman, Erik G.; Murray, Regan; Phillips, Cynthia A.; Riesen, Lee A.; Watson, Jean-Paul W.

We present the TEVA-SPOT Toolkit, a sensor placement optimization tool developed within the USEPA TEVA program. The TEVA-SPOT Toolkit provides a sensor placement framework that facilitates research in sensor placement optimization and enables the practical application of sensor placement solvers to real-world CWS design applications. This paper provides an overview of its key features, and then illustrates how this tool can be flexibly applied to solve a variety of different types of sensor placement problems. © 2008 ASCE.

Tolerating the community detection resolution limit with edge weighting

Proposed for publication in the Proceedings of the National Academy of Sciences.

Hendrickson, Bruce A.; Laviolette, Randall A.; Phillips, Cynthia A.; Berry, Jonathan W.

Communities of vertices within a giant network such as the World-Wide Web are likely to be vastly smaller than the network itself. However, Fortunato and Barthélemy have proved that modularity maximization algorithms for community detection may fail to resolve communities with fewer than √(L/2) edges, where L is the number of edges in the entire network. This resolution limit leads modularity maximization algorithms to have notoriously poor accuracy on many real networks. Fortunato and Barthélemy's argument can be extended to networks with weighted edges as well, and we derive this corollary argument. We conclude that weighted modularity algorithms may fail to resolve communities with fewer than √(Wε/2) total edge weight, where W is the total edge weight in the network and ε is the maximum weight of an inter-community edge. If ε is small, then small communities can be resolved. Given a weighted or unweighted network, we describe how to derive new edge weights in order to achieve a low ε, we modify the 'CNM' community detection algorithm to maximize weighted modularity, and show that the resulting algorithm has greatly improved accuracy. In experiments with an emerging community standard benchmark, we find that our simple CNM variant is competitive with the most accurate community detection methods yet proposed.
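
For reference, the quantity being maximized is the standard weighted modularity (this definition is standard, not specific to this paper), and the two resolution limits quoted above sit side by side as:

```latex
% Standard weighted modularity: W = total edge weight, W_c = weight inside community c,
% S_c = total strength (weighted degree) of the vertices in c, l_c = edges inside c.
\[
  Q \;=\; \sum_{c}\left[\frac{W_c}{W} - \left(\frac{S_c}{2W}\right)^{2}\right],
  \qquad
  \ell_c < \sqrt{L/2} \ \ (\text{unweighted limit}),
  \qquad
  W_c < \sqrt{W\varepsilon/2} \ \ (\text{weighted corollary}).
\]
```

Driving the maximum inter-community edge weight ε down with the reweighting described above therefore shrinks the weighted threshold even when W is large.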

Improved parallel data partitioning by nested dissection with applications to information retrieval

Proposed for publication in Parallel Computing.

Boman, Erik G.; Chevalier, Cedric C.

The computational work in many information retrieval and analysis algorithms is based on sparse linear algebra. Sparse matrix-vector multiplication is a common kernel in many of these computations. Thus, an important related combinatorial problem in parallel computing is how to distribute the matrix and the vectors among processors so as to minimize the communication cost. We focus on minimizing the total communication volume while keeping the computation balanced across processes. In [1], the first two authors presented a new 2D partitioning method, the nested dissection partitioning algorithm. In this paper, we improve on that algorithm and show that it is a good option for data partitioning in information retrieval. We also show that partitioning time can be substantially reduced by using the SCOTCH software, and that quality improves in some cases as well.
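
To make the objective concrete, the sketch below (illustrative only, not the nested dissection algorithm itself) counts the total communication volume of a simple 1D row partition for y = Ax, i.e., for each matrix column, how many remote parts need that vector entry; 2D methods such as the one in the paper assign individual nonzeros to parts in order to drive this count lower.

```python
# Illustrative communication-volume count for sparse matrix-vector multiplication
# under a 1D row partition (not the paper's 2D nested dissection method).
import numpy as np
from scipy.sparse import coo_matrix

def comm_volume_1d(A, row_part):
    """Assumes a square matrix with x and y distributed conformally with the rows:
    row i, x[i], and y[i] all live on part row_part[i].  Each column j must be sent
    to every remote part that owns a row with a nonzero in column j."""
    A = coo_matrix(A)
    volume = 0
    for j in np.unique(A.col):
        needers = np.unique(row_part[A.row[A.col == j]])   # parts that need x[j]
        volume += int(np.count_nonzero(needers != row_part[j]))
    return volume

# Tiny usage example with an assumed 4x4 matrix split across 2 parts.
A = np.array([[1, 1, 0, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 0, 1, 1]])
print(comm_volume_1d(A, np.array([0, 0, 1, 1])))  # -> 2
```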

The Xygra gun simulation tool

Garasi, Christopher J.; Robinson, Allen C.; Russo, Thomas V.; Lamppa, Derek C.

Inductive electromagnetic launchers, or coilguns, use discrete solenoidal coils to accelerate a coaxial conductive armature. To date, Sandia has been using an internally developed code, SLINGSHOT, as a point-mass lumped circuit element simulation tool for modeling coilgun behavior for design and verification purposes. This code has shortcomings in terms of accurately modeling gun performance under stressful electromagnetic propulsion environments. To correct for these limitations, it was decided to attempt to closely couple two Sandia simulation codes, Xyce and ALEGRA, to develop a more rigorous simulation capability for demanding launch applications. This report summarizes the modifications made to each respective code and the path forward to completing the interface between them.

Distance-avoiding sequences for extremely low-bandwidth authentication

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Collins, Michael J.; Mitchell, Scott A.

We develop a scheme for providing strong cryptographic authentication on a stream of messages which consumes very little bandwidth (as little as one bit per message) and is robust in the presence of dropped messages. Such a scheme should be useful for extremely low-power, low-bandwidth wireless sensor networks and "smart dust" applications. The tradeoffs among security, memory, bandwidth, and tolerance for missing messages give rise to several new optimization problems. We report on experimental results and derive bounds on the performance of the scheme. © 2008 Springer-Verlag Berlin Heidelberg.

Inexact Newton dogleg methods

SIAM Journal on Numerical Analysis

Pawlowski, Roger P.; Simonis, Joseph P.; Walker, Homer F.; Shadid, John N.

The dogleg method is a classical trust-region technique for globalizing Newton's method. While it is widely used in optimization, including large-scale optimization via truncated-Newton approaches, its implementation in general inexact Newton methods for systems of nonlinear equations can be problematic. In this paper, we first outline a very general dogleg method suitable for the general inexact Newton context and provide a global convergence analysis for it. We then discuss certain issues that may arise with the standard dogleg implementational strategy and propose modified strategies that address them. Newton-Krylov methods have provided important motivation for this work, and we conclude with a report on numerical experiments involving a Newton-GMRES dogleg method applied to benchmark CFD problems. © 2008 Society for Industrial and Applied Mathematics.
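
For orientation, the classical dogleg step selection that the paper generalizes can be sketched as follows (an illustrative textbook version, not the paper's modified strategy for the inexact Newton setting, where the Newton direction is only approximate and the issues discussed in the paper arise):

```python
# Classical dogleg step selection (illustrative; not the paper's generalized method).
import numpy as np

def dogleg_step(g, B, s_newton, delta):
    """g: gradient of the merit function, B: model Hessian, s_newton: (approximate)
    Newton step, delta: trust-region radius."""
    if np.linalg.norm(s_newton) <= delta:
        return s_newton                              # the full Newton step fits
    s_cauchy = -(g @ g) / (g @ B @ g) * g            # minimizer of the model along -g
    if np.linalg.norm(s_cauchy) >= delta:
        return -(delta / np.linalg.norm(g)) * g      # truncated steepest-descent step
    d = s_newton - s_cauchy                          # segment toward the Newton point
    a = d @ d
    b = 2.0 * (s_cauchy @ d)
    c = s_cauchy @ s_cauchy - delta ** 2
    tau = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return s_cauchy + tau * d                        # step that hits the trust-region boundary
```

Inside the trust region the (inexact) Newton step is taken; otherwise the step follows the steepest-descent segment to the Cauchy point and then the segment toward the Newton point until it meets the trust-region boundary.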

Solving elliptic finite element systems in near-linear time with support preconditioners

SIAM Journal on Numerical Analysis

Boman, Erik G.; Hendrickson, Bruce A.; Vavasis, Stephen

We consider linear systems arising from the use of the finite element method for solving scalar linear elliptic problems. Our main result is that these linear systems, which are symmetric and positive semidefinite, are well approximated by symmetric diagonally dominant matrices. Our framework for defining matrix approximation is support theory. Significant graph theoretic work has already been developed in the support framework for preconditioners in the diagonally dominant case, and, in particular, it is known that such systems can be solved with iterative methods in nearly linear time. Thus, our approximation result implies that these graph theoretic techniques can also solve a class of finite element problems in nearly linear time. We show that the support number bounds, which control the number of iterations in the preconditioned iterative solver, depend on mesh quality measures but not on the problem size or shape of the domain. © 2008 Society for Industrial and Applied Mathematics.
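
The key quantity behind the iteration bounds mentioned above is the support number; in standard support-theory notation (not specific to this paper's statements), for symmetric positive semidefinite A and B:

```latex
% Support number and the resulting bound on the preconditioned spectrum
% (standard support theory for symmetric positive semidefinite A, B).
\[
  \sigma(A, B) \;=\; \min\{\,\tau \in \mathbb{R} : \tau B - A \succeq 0\,\},
  \qquad
  \frac{\lambda_{\max}(A, B)}{\lambda_{\min}(A, B)} \;\le\; \sigma(A, B)\,\sigma(B, A),
\]
```

where λ(A, B) are the finite generalized eigenvalues of Ax = λBx. This ratio controls the iteration count of the preconditioned solver, and the paper shows the two support numbers depend on mesh quality measures but not on problem size or domain shape.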

The Arctic as a test case for an assessment of climate impacts on national security

Boslough, Mark B.; Taylor, Mark A.; Zak, Bernard D.; Backus, George A.

The Arctic region is rapidly changing in a way that will affect the rest of the world. Parts of Alaska, western Canada, and Siberia are currently warming at twice the global rate. This warming trend is accelerating permafrost deterioration, coastal erosion, snow and ice loss, and other changes that are a direct consequence of climate change. Climatologists have long understood that changes in the Arctic would be faster and more intense than elsewhere on the planet, but the degree and speed of the changes were underestimated compared to recent observations. Policy makers have not yet had time to examine the latest evidence or appreciate the nature of the consequences. Thus, the abruptness and severity of an unfolding Arctic climate crisis have not been incorporated into long-range planning.

The purpose of this report is to briefly review the physical basis for global climate change and Arctic amplification, summarize the ongoing observations, discuss the potential consequences, explain the need for an objective risk assessment, develop scenarios for future change, review existing modeling capabilities and the need for better regional models, and finally to make recommendations for Sandia's future role in preparing our leaders to deal with impacts of Arctic climate change on national security. Accurate and credible regional-scale climate models are still several years in the future, and those models are essential for estimating climate impacts around the globe. This study demonstrates how a scenario-based method may be used to give insights into climate impacts on a regional scale and possible mitigation. Because of our experience in the Arctic and widespread recognition of the Arctic's importance in the Earth climate system, we chose the Arctic as a test case for an assessment of climate impacts on national security.

Sandia can make a swift and significant contribution by applying modeling and simulation tools with internal collaborations as well as with outside organizations. Because changes in the Arctic environment are happening so rapidly, a successful program will be one that can adapt very quickly to new information as it becomes available, and can provide decision makers with projections on the 1-5 year time scale over which the most disruptive, high-consequence changes are likely to occur. The greatest short-term impact would be to initiate exploratory simulations to discover new emergent and robust phenomena associated with one or more of the following changing systems: Arctic hydrological cycle, sea ice extent, ocean and atmospheric circulation, permafrost deterioration, carbon mobilization, Greenland ice sheet stability, and coastal erosion.

Sandia can also contribute to new technology solutions for improved observations in the Arctic, which is currently a data-sparse region. Sensitivity analyses have the potential to identify thresholds that would enable the collaborative development of 'early warning' sensor systems to seek predicted phenomena that might be precursory to major, high-consequence changes. Much of this work will require improved regional climate models and advanced computing capabilities. Socio-economic modeling tools can help define human and national security consequences. Formal uncertainty quantification must be an integral part of any results that emerge from this work.
