Publications

Final report for "High performance computing for advanced national electric power grid modeling and integration of solar generation resources", LDRD Project No. 149016

Schoenwald, David A.; Richardson, Bryan T.; Riehm, Andrew C.; Wolfenbarger, Paul W.; Adams, Brian M.; Reno, Matthew J.; Hansen, Clifford H.; Oldfield, Ron A.; Stamp, Jason E.; Stein, Joshua S.; Hoekstra, Robert J.; Nelson, Jeffrey S.; Munoz-Ramos, Karina M.; McLendon, William C.; Russo, Thomas V.; Phillips, Laurence R.

Design and operation of the electric power grid (EPG) rely heavily on computational models. High-fidelity, full-order models are used to study transient phenomena on only a small part of the network. Reduced-order dynamic and power-flow models are used when analyses involving thousands of nodes are required, because of the computational demands of simulating large numbers of nodes. The complexity of the future EPG will increase dramatically due to large-scale deployment of variable renewable generation, active loads and distributed generation resources, adaptive protection and control systems, and price-responsive demand. High-fidelity modeling of this future grid will require significant advances in coupled, multi-scale tools and their use on high-performance computing (HPC) platforms. This LDRD report demonstrates SNL's capability to apply HPC resources to three tasks: (1) high-fidelity, large-scale modeling of power system dynamics; (2) statistical assessment of grid security via Monte Carlo simulations of cyber attacks; and (3) development of models to predict the variability of solar resources at locations where little or no ground-based measurement is available.

Formulation, analysis and numerical study of an optimization-based conservative interpolation (remap) of scalar fields for arbitrary Lagrangian-Eulerian methods

Journal of Computational Physics

Bochev, Pavel; Ridzal, Denis R.; Scovazzi, Guglielmo S.; Shashkov, Mikhail

We develop and study the high-order conservative and monotone optimization-based remap (OBR) of a scalar conserved quantity (mass) between two close meshes with the same connectivity. The key idea is to phrase remap as a global inequality-constrained optimization problem for mass fluxes between neighboring cells. The objective is to minimize the discrepancy between these fluxes and the given high-order target mass fluxes, subject to constraints that enforce physically motivated bounds on the associated primitive variable (density). In so doing, we separate accuracy considerations, handled by the objective functional, from the enforcement of physical bounds, handled by the constraints. The resulting OBR formulation is applicable to general, unstructured, heterogeneous grids. Under some weak requirements on grid proximity, but not on the cell types, we prove that the OBR algorithm is linearity preserving in one, two and three dimensions. The paper also examines connections between the OBR and the recently proposed flux-corrected remap (FCR), Liska et al. [1]. We show that the FCR solution coincides with the solution of a modified version of OBR (M-OBR), which has the same objective but a simpler set of box constraints derived by using a "worst-case" scenario. Because M-OBR (FCR) has a smaller feasible set, preservation of linearity may be lost and accuracy may suffer for some grid configurations. Our numerical studies confirm this, and show that OBR delivers significant increases in robustness and accuracy. Preliminary efficiency studies of OBR reveal that it is only a factor of 2.1 slower than FCR, but admits 1.5 times larger time steps. © 2011 Elsevier Inc.
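
To make the OBR construction concrete, the sketch below poses a toy one-dimensional remap as an inequality-constrained least-squares problem: face fluxes are kept as close as possible to prescribed high-order target fluxes while the remapped densities are held within neighbor-based bounds. The periodic grid, the synthetic target fluxes, and the bound construction are illustrative assumptions, not the paper's formulation or code.

```python
# Toy 1-D optimization-based remap (OBR), assuming a periodic grid and
# synthetic high-order target fluxes; illustrative only, not the paper's code.
import numpy as np
from scipy.optimize import minimize

n = 16
V = np.full(n, 1.0)                                   # new-cell volumes (assumed uniform)
rho_old = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(n) / n)
m_old = rho_old * V                                   # conserved mass per cell

# Hypothetical high-order "target" fluxes across faces i+1/2 (periodic).
F_target = 0.1 * np.cos(2 * np.pi * (np.arange(n) + 0.5) / n)

def new_mass(F):
    # m_i^new = m_i^old + F_{i-1/2} - F_{i+1/2}, with periodic wraparound,
    # so total mass is conserved for any flux vector F.
    return m_old + np.roll(F, 1) - F

# Physically motivated density bounds taken from old-cell neighborhoods.
rho_min = np.minimum.reduce([np.roll(rho_old, s) for s in (-1, 0, 1)])
rho_max = np.maximum.reduce([np.roll(rho_old, s) for s in (-1, 0, 1)])

objective = lambda F: np.sum((F - F_target) ** 2)     # stay close to target fluxes
constraints = [
    {"type": "ineq", "fun": lambda F: new_mass(F) / V - rho_min},   # rho >= rho_min
    {"type": "ineq", "fun": lambda F: rho_max - new_mass(F) / V},   # rho <= rho_max
]

result = minimize(objective, F_target, method="SLSQP", constraints=constraints)
F_obr = result.x
print("total mass conserved:", np.isclose(new_mass(F_obr).sum(), m_old.sum()))
```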

IceT users' guide and reference

Moreland, Kenneth D.

The Image Composition Engine for Tiles (IceT) is a high-performance sort-last parallel rendering library. In addition to providing accelerated rendering for a standard display, IceT provides the unique ability to generate images for tiled displays. The overall resolution of the display may be several times larger than any viewport that may be rendered by a single machine. This document is an overview of the user interface to IceT.
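
The sketch below illustrates the sort-last compositing idea behind IceT in a toy setting: each process renders its own portion of the geometry into a full-size color/depth image, and per pixel the fragment with the smallest depth wins. It does not use the IceT C API; the image sizes and data are placeholders.

```python
# Toy sketch of sort-last depth compositing: each process renders a full-size
# color/depth image of its own geometry, and the nearer fragment wins per pixel.
# Illustrative only; this does not use the IceT C API.
import numpy as np

def depth_composite(color_a, depth_a, color_b, depth_b):
    """Merge two partial renderings; the smaller (nearer) depth wins per pixel."""
    nearer_a = (depth_a <= depth_b)[..., None]        # broadcast over RGB channels
    color = np.where(nearer_a, color_a, color_b)
    depth = np.minimum(depth_a, depth_b)
    return color, depth

# Two hypothetical partial images of the same 4x4 viewport.
rng = np.random.default_rng(0)
h, w = 4, 4
color1, depth1 = rng.random((h, w, 3)), rng.random((h, w))
color2, depth2 = rng.random((h, w, 3)), rng.random((h, w))

final_color, final_depth = depth_composite(color1, depth1, color2, depth2)
assert np.array_equal(final_depth, np.minimum(depth1, depth2))
```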

Coupling strategies for high-speed aeroheating problems

Bova, S.W.

A common purpose for performing an aerodynamic analysis is to calculate the resulting loads on a solid body immersed in the flow. Pressure or heat loads are often of interest for characterizing the structural integrity or thermal survivability of the structure. This document describes two algorithms for tightly coupling the mass, momentum and energy conservation equations for a compressible fluid with the energy conservation equation for heat transfer through a solid. We categorize both approaches as monolithically coupled: the conservation equations for the fluid and the solid are assembled into a single residual vector, and Newton's method is then used to solve the resulting nonlinear system of equations. These approaches stand in contrast to other popular coupling schemes, such as staggered methods in which each discipline is solved individually and loads are passed between the disciplines as boundary conditions. This document demonstrates the viability of the monolithic approach for aeroheating problems.
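
As a toy illustration of the monolithic coupling described above, the sketch below stacks a made-up "fluid" residual and "solid" residual into a single vector and solves the coupled system with Newton's method using a finite-difference Jacobian. The equations and the coupling coefficient are invented for illustration and are not the report's formulation.

```python
# Toy monolithic coupling: a made-up "fluid" residual and "solid" residual are
# stacked into one vector R(u) and solved together with Newton's method.
# The equations and the 0.3 coupling coefficient are invented for illustration.
import numpy as np

def residual(u):
    T_fluid, T_solid = u
    r_fluid = T_fluid - 1.0 + 0.3 * (T_fluid - T_solid)   # fluid balance loses wall heat flux
    r_solid = 2.0 * T_solid - 0.3 * (T_fluid - T_solid)   # solid balance receives that flux
    return np.array([r_fluid, r_solid])

def jacobian(u, eps=1.0e-7):
    # Finite-difference Jacobian of the stacked residual vector.
    n = len(u)
    J = np.empty((n, n))
    r0 = residual(u)
    for j in range(n):
        du = np.zeros(n)
        du[j] = eps
        J[:, j] = (residual(u + du) - r0) / eps
    return J

u = np.zeros(2)
for _ in range(20):                                        # Newton iteration on the full system
    r = residual(u)
    if np.linalg.norm(r) < 1.0e-12:
        break
    u = u - np.linalg.solve(jacobian(u), r)
print("coupled solution:", u, "residual norm:", np.linalg.norm(residual(u)))
```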

Adversary phase change detection using S.O.M. and text data

Speed, Ann S.; Warrender, Christina E.

In this work, we developed a self-organizing map (SOM) technique that uses web-based text analysis to forecast when a group is undergoing a phase change. By 'phase change', we mean that an organization has fundamentally shifted its attitudes or behaviors, much as the characteristics of a substance change when ice melts into water: a formerly peaceful group may suddenly adopt violence, or a violent organization may unexpectedly agree to a ceasefire. SOM techniques were used to analyze text obtained from organization postings on the world-wide web. Results suggest that it may be possible to forecast phase changes and to determine whether a sample of writing can be attributed to a group of interest.
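
The sketch below is a minimal NumPy self-organizing map of the general kind referenced above, assuming documents have already been reduced to fixed-length feature vectors (synthetic random vectors stand in for real text features); it is not the project's code.

```python
# Minimal NumPy self-organizing map (SOM). Documents are assumed to have been
# reduced to fixed-length feature vectors; synthetic random vectors stand in
# for real text features here. Illustrative only, not the project's code.
import numpy as np

rng = np.random.default_rng(1)
grid_h, grid_w, dim = 5, 5, 10
weights = rng.random((grid_h, grid_w, dim))          # one prototype per map cell
docs = rng.random((200, dim))                        # stand-in document vectors

coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                              indexing="ij"), axis=-1)

def winner(x):
    # Best-matching unit: the map cell whose prototype is closest to the input.
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)

for t, x in enumerate(rng.permutation(docs)):
    lr = 0.5 * np.exp(-t / 100)                      # decaying learning rate
    sigma = 2.0 * np.exp(-t / 100)                   # decaying neighborhood radius
    bmu = np.array(winner(x))
    dist2 = np.sum((coords - bmu) ** 2, axis=-1)     # grid distance to the winner
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None] # neighborhood function
    weights += lr * h * (x - weights)                # pull neighborhood toward x

print("winning cell for the first document:", winner(docs[0]))
```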

Truncated multiGaussian fields and effective conductance of binary media

Mckenna, Sean A.; Ray, Jaideep R.; van Bloemen Waanders, Bart G.

Truncated Gaussian fields provide a flexible model for defining binary media with dispersed (as opposed to layered) inclusions. General properties of excursion sets on these truncated fields are coupled with a distance-based upscaling algorithm and approximations from point-process theory to develop an approach for estimating effective conductivity in two dimensions. The estimate of effective conductivity is derived directly from knowledge of the kernel size used to create the multiGaussian field, defined as its full width at half maximum (FWHM), the truncation threshold, and the conductance values of the two modes. Therefore, instantiation of the multiGaussian field is not necessary for estimating the effective conductance. The critical component of the effective-medium approximation developed here is the mean distance between high-conductivity inclusions. This mean distance is characterized as a function of the FWHM, the truncation threshold, and the ratio of the two modal conductivities. Sensitivity of the resulting effective conductivity to this mean distance is examined for two levels of contrast in the modal conductances and for different FWHM sizes. Results demonstrate that the FWHM is a robust measure of the mean travel distance in the background medium, and the resulting effective conductivities are accurate when compared with results obtained from effective-media theory, distance-based upscaling, and numerical simulation.
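
A minimal sketch of how a truncated multiGaussian binary medium of the kind described above can be generated: white noise is smoothed with a Gaussian kernel of a chosen FWHM and then thresholded to yield dispersed high-conductivity inclusions. The grid size, FWHM, threshold, and modal conductivities below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a truncated multiGaussian binary medium: smooth white noise
# with a Gaussian kernel of a chosen FWHM, then threshold to obtain dispersed
# high-conductivity inclusions. All parameter values here are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)
n = 256
fwhm = 12.0                                    # kernel full width at half maximum
sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # FWHM -> Gaussian std. dev.

field = gaussian_filter(rng.standard_normal((n, n)), sigma)
field /= field.std()                           # rescale to unit variance

threshold = 0.8                                # truncation threshold
inclusions = field > threshold                 # True = high-conductivity inclusion

k_low, k_high = 1.0, 100.0                     # the two modal conductivities
conductivity = np.where(inclusions, k_high, k_low)
print("inclusion volume fraction:", inclusions.mean())
```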

Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1

Edwards, Harold C.; Arguello, Jose G.; Bartlett, Roscoe B.; Bouchard, Julie F.; Freeze, Geoffrey A.; Knupp, Patrick K.; Schultz, Peter A.; Urbina, Angel U.; Wang, Yifeng

The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

Real-time individualized training vectors for experiential learning

Fabian, Nathan D.; Glickman, Matthew R.

Military training utilizing serious games or virtual worlds potentially generates data that can be mined to better understand how trainees learn in experiential exercises. Few data-mining approaches for deployed military training games exist. Opportunities exist to collect and analyze these data, as well as to construct a full-history learner model. Outcomes discussed in the present document include results from a quasi-experimental research study on military game-based experiential learning, the deployment of an online game for training evidence collection, and results from a proof-of-concept pilot study on the development of individualized training vectors. This Laboratory Directed Research and Development (LDRD) project leveraged products from related projects, such as Titan (Network Grand Challenge), the Real-Time Feedback and Evaluation System (America's Army Adaptive Thinking and Leadership, DARWARS Ambush! NK), and Dynamic Bayesian Networks, to investigate whether machine-learning capabilities could compute real-time, in-game similarity vectors of learner performance to support adaptation of content delivery and quantitative measurement of experiential learning.
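
As a rough illustration of the "similarity vector" idea mentioned above, the sketch below summarizes a trainee's in-game performance as a feature vector and scores it against stored exemplar vectors with cosine similarity. The feature names and exemplar values are hypothetical placeholders, not project data.

```python
# Hypothetical "similarity vector" sketch: a trainee's in-game performance is
# summarized as a feature vector and scored against stored exemplar vectors.
# Feature names and exemplar values are placeholders, not project data.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical per-episode features:
# [time on objective, comms events, route deviation, casualties avoided]
exemplars = {
    "expert": np.array([0.9, 0.8, 0.1, 0.95]),
    "novice": np.array([0.4, 0.2, 0.7, 0.30]),
}
trainee = np.array([0.7, 0.6, 0.3, 0.80])

# One similarity score per exemplar; such scores could drive content adaptation.
similarity = {name: cosine_similarity(trainee, v) for name, v in exemplars.items()}
print(similarity)
```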
