We present a new method for mapping applications' MPI tasks to the cores of a parallel computer such that the applications' communication time is reduced. We address the case of sparse node allocation, in which the nodes assigned to a job are not necessarily located in a contiguous block or in close proximity to each other in the network, although our methods generalize to contiguous allocations as well. The goal is to assign tasks to cores so that interdependent tasks are performed by "nearby" cores, thus lowering the distance messages must travel, the amount of congestion in the network, and the overall cost of communication. Our new method applies a geometric partitioning algorithm to both the tasks and the processors, and assigns task parts to the corresponding processor parts. We also present a number of algorithmic optimizations that exploit specific features of the network or application. We show that, for the structured finite-difference mini-application MiniGhost, our mapping methods reduced communication time by up to 75% relative to MiniGhost's default mapping on 128K cores of a Cray XK7 with sparse allocation. For the atmospheric modeling code E3SM/HOMME, our methods reduced communication time by up to 31% on 32K cores of an IBM BlueGene/Q with contiguous allocation.
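To illustrate the core idea, the minimal Python sketch below applies the same geometric partitioner (here, a simple recursive coordinate bisection) to both the task coordinates and the network coordinates of the allocated cores, then pairs the resulting parts. The function names, the bisection rule, and the toy data are assumptions chosen for illustration; this is not the paper's implementation.

```python
# Illustrative sketch: map tasks to cores by recursively bisecting both point
# sets along the axis of longest extent, then pairing leaves in order. The
# rule and names here are invented stand-ins for the paper's geometric method.
import numpy as np

def rcb_order(points, idx=None):
    """Return point indices ordered by recursive coordinate bisection."""
    if idx is None:
        idx = np.arange(len(points))
    if len(idx) <= 1:
        return list(idx)
    pts = points[idx]
    axis = np.argmax(pts.max(axis=0) - pts.min(axis=0))  # longest extent
    order = idx[np.argsort(pts[:, axis])]
    half = len(order) // 2
    return rcb_order(points, order[:half]) + rcb_order(points, order[half:])

def map_tasks_to_cores(task_coords, core_coords):
    """Pair the i-th task part with the i-th core part (equal counts assumed)."""
    t = rcb_order(np.asarray(task_coords, dtype=float))
    c = rcb_order(np.asarray(core_coords, dtype=float))
    return {task: core for task, core in zip(t, c)}

# Toy usage: 8 tasks on a line mapped onto 8 cores scattered in a 2-D "network".
tasks = [[i, 0.0] for i in range(8)]
cores = np.random.default_rng(0).uniform(0, 10, size=(8, 2))
print(map_tasks_to_cores(tasks, cores))
```

Because both point sets are cut by the same geometric rule, tasks that are close in the application's coordinate space land on cores that are close in the network, which is the property the mapping exploits.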
Single-photon detectors have achieved impressive performance and have led to a number of new scientific discoveries and technological applications. Existing models of photodetectors are semiclassical in that the field-matter interaction is treated perturbatively and separated in time from the physical processes in the absorbing matter. An open question is whether a fully quantum detector, in which the optical field, the optical absorption, and the amplification are considered as one quantum system, could have improved performance. Here we develop a theoretical model of such photodetectors and employ simulations to reveal the critical role played by quantum coherence and amplification backaction in dictating the performance. We show that coherence and backaction lead to trade-offs between detector metrics and also determine optimal system designs through control of the quantum-classical interface. Importantly, we establish the design parameters that result in an ideal photodetector with 100% efficiency, no dark counts, and minimal jitter, thus paving the way for next-generation detectors.
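A fully quantum treatment of this kind is naturally phrased as an open-system master equation. The Lindblad form below is the standard generic representation, reproduced only to fix notation; the detector-specific Hamiltonian $H$ and jump operators $L_k$ (which would encode absorption and amplification backaction) are left abstract, as the paper's concrete model is not given here.

```latex
% Generic Lindblad master equation for the joint field-absorber-amplifier
% state rho; H and the jump operators L_k are model-specific placeholders.
\[
  \frac{d\rho}{dt}
  = -\frac{i}{\hbar}\,[H,\rho]
  + \sum_k \left( L_k \rho L_k^{\dagger}
  - \tfrac{1}{2}\left\{ L_k^{\dagger} L_k,\, \rho \right\} \right)
\]
```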
Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. The model components representing the different sectors of an integrated model can have less standard formulations that differ from one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We outline a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent to uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course, this will entail considerable collaboration between the domain specialists, who often take first ownership of a problem, and experts in computational methods.
The shock Hugoniot for full-density and porous CeO₂ was investigated in the liquid regime using ab initio molecular dynamics (AIMD) simulations with Erpenbeck's approach based on the Rankine-Hugoniot jump conditions. The phase space was sampled by carrying out NVT simulations for isotherms between 6000 and 100 000 K and densities ranging from ρ = 2.5 to 20 g/cm³. The impact of on-site Coulomb interaction corrections (+U) on the equation of state (EOS) obtained from AIMD simulations was assessed by direct comparison with results from standard density functional theory simulations. Classical molecular dynamics (CMD) simulations were also performed to model atomic-scale shock compression of larger porous CeO₂ models. Results from AIMD and CMD compression simulations compare favorably with Z-machine shock data to 525 GPa and gas-gun data to 109 GPa for porous CeO₂ samples. Using results from AIMD simulations, an accurate liquid-regime Mie-Grüneisen EOS was built for CeO₂. In addition, a revised multiphase SESAME-type EOS was constrained using AIMD results and experimental data generated in this work. This study demonstrates the necessity of acquiring data in the porous regime to increase the reliability of existing analytical EOS models.
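For reference, the Rankine-Hugoniot energy jump condition that underlies Erpenbeck's approach, and the generic Mie-Grüneisen form, are written out below; the specific reference curves $P_{\mathrm{ref}}$ and $E_{\mathrm{ref}}$ fitted in this work are not reproduced here.

```latex
% Hugoniot energy jump condition (V = 1/rho is the specific volume) and the
% generic Mie-Grueneisen equation of state built about a reference curve.
\[
  E - E_0 = \tfrac{1}{2}\,(P + P_0)\,(V_0 - V), \qquad V = 1/\rho,
\]
\[
  P(\rho, E) = P_{\mathrm{ref}}(\rho)
  + \Gamma(\rho)\,\rho\,\bigl[ E - E_{\mathrm{ref}}(\rho) \bigr].
\]
```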
Sierra/SolidMechanics (Sierra/SM) is a Lagrangian, three-dimensional code for finite element analysis of solids and structures. It provides capabilities for explicit dynamic, implicit quasistatic, and implicit dynamic analyses. The explicit dynamics capabilities allow for the efficient and robust solution of models with extensive contact subjected to large, suddenly applied loads. For implicit problems, Sierra/SM uses a multi-level iterative solver, which enables it to effectively solve problems with large deformations, nonlinear material behavior, and contact. Sierra/SM has a versatile library of continuum and structural elements and a large library of material models. The code is written for parallel computing environments, enabling scalable solutions of extremely large problems for both implicit and explicit analyses. It is built on the SIERRA Framework, which facilitates coupling with other SIERRA mechanics codes. This document describes the functionality and input syntax for Sierra/SM.
Modeling physical phenomena through computational simulation increasingly relies on generating a collection of related runs, known as an ensemble. This article explores the challenges we face in developing analysis and visualization systems for large, complex ensemble data sets, which we seek to understand without having to view the results of every simulation run. Implementing approaches and ideas developed in response to this goal, we demonstrate the analysis of a 15,000-run material-fracturing study using Slycat, our ensemble analysis system.
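One representative ensemble-analysis step, canonical correlation analysis (CCA) relating per-run input parameters to output metrics, can be sketched as follows. The data and variable names are invented, and the sketch uses scikit-learn as a stand-in rather than Slycat's own pipeline.

```python
# Illustrative sketch: relate ensemble input parameters to output metrics
# with CCA; runs that stand apart in the shared canonical space merit a
# closer look, without viewing every simulation individually.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
n_runs = 200                       # stand-in for a much larger ensemble
X = rng.normal(size=(n_runs, 4))   # per-run input parameters
Y = X @ rng.normal(size=(4, 3)) + 0.1 * rng.normal(size=(n_runs, 3))  # outputs

cca = CCA(n_components=2)
Xc, Yc = cca.fit_transform(X, Y)   # projections onto shared canonical axes
for i in range(2):
    # Correlation along each canonical component shows how strongly the
    # inputs explain the outputs across the ensemble.
    r = np.corrcoef(Xc[:, i], Yc[:, i])[0, 1]
    print(f"canonical component {i}: correlation {r:.3f}")
```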
The importance of uncertainty has been recognized in various modeling, simulation, and analysis applications, where inherent assumptions and simplifications affect the accuracy of model predictions for physical phenomena. As model predictions are now heavily relied upon for simulation-based system design, which encompasses new materials, vehicles, mechanical and civil structures, and even new drugs, erroneous predictions could have catastrophic consequences. Therefore, uncertainty and the associated risks due to model errors should be quantified to support robust systems engineering.