Proceedings of the ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming, PPOPP
Leung, Vitus J.; Bunde, David P.; Ebbers, Johnathan; Feer, Stefan P.; Price, Nickolas W.; Rhodes, Zachary D.; Swank, Matthew
We examine task mapping algorithms for systems that allocate jobs non-contiguously. Several studies have shown that task placement affects job running time. We focus on jobs with a stencil communication pattern and use experiments on a Cray XE to evaluate novel task mapping algorithms as well as some adapted to this setting. We do this with the miniGhost miniapp, which mimics the behavior of CTH, a shock physics application. Our strategies improve average and single-run times by as much as 28% and 36%, respectively, over a baseline strategy.
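To make the idea of locality-preserving task mapping concrete, the following is a hedged, hypothetical sketch of one simple heuristic, not one of the paper's algorithms: the tasks of a 2D stencil job and the non-contiguously allocated processor coordinates are each ordered along the same snake curve and then paired up, so neighboring tasks tend to land on nearby processors. All function names and coordinates are illustrative.

```python
def snake_order(coords):
    """Sort (x, y) points row by row, alternating direction on odd rows."""
    return sorted(coords, key=lambda p: (p[1], p[0] if p[1] % 2 == 0 else -p[0]))

def map_tasks(task_coords, proc_coords):
    """Pair tasks with processors by matching their snake-curve positions."""
    return dict(zip(snake_order(task_coords), snake_order(proc_coords)))

# A 4x4 stencil job mapped onto a scattered (non-contiguous) set of
# 16 processors drawn from a larger 8x4 machine.
tasks = [(x, y) for y in range(4) for x in range(4)]
procs = [(x, y) for y in range(4) for x in range(8) if (x + y) % 2 == 0]
mapping = map_tasks(tasks, procs)
```

Curve-based orderings like this are one standard baseline in the mapping literature; the paper's evaluated strategies are more sophisticated.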
One objective of the Climate Science for a Sustainable Energy Future (CSSEF) program is to develop the capability to thoroughly test and understand the uncertainties in the overall climate model and its components as they are being developed. The focus on uncertainties involves sensitivity analysis: the capability to determine which input parameters have a major influence on the output responses of interest. This report presents some initial sensitivity analysis results performed by Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories (SNL), and Pacific Northwest National Laboratory (PNNL). In the 2011-2012 timeframe, these laboratories collaborated to perform sensitivity analyses of a set of CAM5 2° runs, where the response metrics of interest were precipitation metrics. The three labs performed their sensitivity analysis (SA) studies separately and then compared results. Overall, the results were quite consistent with each other even though the methods used differed. This exercise provided a robustness check of the global sensitivity analysis metrics and identified some strongly influential parameters.
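As a hedged illustration of what a global sensitivity metric measures (not the specific methods any of the three labs used), the sketch below estimates first-order variance-based indices S_i = Var(E[Y|X_i]) / Var(Y) by brute-force double-loop Monte Carlo on a toy model with one strongly influential input and one weak one. The model and coefficients are invented for illustration.

```python
import random

# Toy stand-in for a climate-model response: x1 dominates, x2 is weak.
def model(x1, x2):
    return 5.0 * x1 + 0.5 * x2

random.seed(2)
N, M = 400, 400  # outer samples of X_i, inner samples of the other input

def first_order_index(which):
    """Estimate S_i = Var(E[Y|X_i]) / Var(Y) for independent U(0,1) inputs."""
    cond_means, all_y = [], []
    for _ in range(N):
        xi = random.random()          # fix the input under study
        ys = []
        for _ in range(M):
            xo = random.random()      # vary the other input
            ys.append(model(xi, xo) if which == 1 else model(xo, xi))
        cond_means.append(sum(ys) / M)
        all_y.extend(ys)
    mean = sum(all_y) / len(all_y)
    var_total = sum((y - mean) ** 2 for y in all_y) / len(all_y)
    var_cond = sum((m - mean) ** 2 for m in cond_means) / len(cond_means)
    return var_cond / var_total

s1, s2 = first_order_index(1), first_order_index(2)
```

For this linear model the exact values are S1 = 25/25.25 ≈ 0.99 and S2 ≈ 0.01, so the estimator correctly flags x1 as the strongly influential parameter. Production studies use far more efficient estimators than this double loop.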
This report summarizes the results of a NEAMS project focused on the use of uncertainty and sensitivity analysis methods within the NEK-5000 and Dakota software frameworks for assessing failure probabilities as part of probabilistic risk assessment. NEK-5000 is a software tool under development at Argonne National Laboratory that performs computational fluid dynamics calculations for applications such as the thermal hydraulics of nuclear reactor cores. Dakota is a software tool developed at Sandia National Laboratories containing optimization, sensitivity analysis, and uncertainty quantification algorithms. The goal of this work is to demonstrate the use of uncertainty quantification methods in Dakota with NEK-5000.
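To illustrate the kind of failure-probability estimate such a workflow produces, here is a minimal sampling-based sketch. This is not Dakota's API; it is a generic Monte Carlo estimator applied to an invented capacity-versus-demand margin, where "failure" means the margin drops below zero. All distributions and numbers are illustrative.

```python
import random

def estimate_failure_probability(performance, sampler, n_samples=100_000, threshold=0.0):
    """Crude Monte Carlo estimate of P(performance(x) < threshold)."""
    failures = sum(1 for _ in range(n_samples) if performance(sampler()) < threshold)
    return failures / n_samples

random.seed(0)
# Toy uncertain inputs: (capacity, demand) with Gaussian uncertainty.
sampler = lambda: (random.gauss(10.0, 1.0), random.gauss(7.0, 1.5))
margin = lambda x: x[0] - x[1]  # failure when demand exceeds capacity

p_fail = estimate_failure_probability(margin, sampler, n_samples=200_000)
```

For this toy problem the margin is Gaussian with mean 3 and standard deviation sqrt(3.25), so the exact failure probability is about 0.048; in practice each `performance` evaluation would be a full NEK-5000 run, which is why more sample-efficient UQ methods matter.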
A theoretical framework for the numerical solution of partial differential equation (PDE) constrained optimization problems is presented in this report. This theoretical framework embodies the fundamental infrastructure required to efficiently implement and solve this class of problems. Detailed derivations of the optimality conditions required to accurately solve several parameter identification and optimal control problems are also provided. These derivations allow the reader to see how the theoretical abstraction presented in this report translates to applications.
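As a generic illustration of the structure behind such optimality conditions (not the report's specific derivations), PDE-constrained problems are commonly written with state u, control or parameter z, and constraint c, and analyzed through the Lagrangian:

```latex
\min_{u,z}\; J(u,z) \quad \text{s.t.} \quad c(u,z) = 0,
\qquad
\mathcal{L}(u,z,\lambda) \;=\; J(u,z) + \langle \lambda,\, c(u,z) \rangle .
\qquad
\text{Stationarity of } \mathcal{L} \text{ yields:}
\qquad
\begin{aligned}
\nabla_{\lambda}\mathcal{L} &= c(u,z) = 0 && \text{(state equation)}\\
\nabla_{u}\mathcal{L} &= \nabla_u J(u,z) + c_u(u,z)^{*}\lambda = 0 && \text{(adjoint equation)}\\
\nabla_{z}\mathcal{L} &= \nabla_z J(u,z) + c_z(u,z)^{*}\lambda = 0 && \text{(gradient equation)}
\end{aligned}
```

Parameter identification and optimal control problems differ mainly in the choice of J and z; the three coupled equations above are the common skeleton.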
In recent years, density functional theory molecular dynamics (DFT-MD) has been shown to be a useful computational tool for exploring the properties of warm dense matter (WDM). These calculations achieve excellent agreement with shock compression experiments, which probe the thermodynamic parameters of the Hugoniot state. New X-ray Thomson scattering (XRTS) diagnostics promise to deliver independent measurements of electronic density and temperature, as well as structural information in shocked systems. However, they require the development of new levels of theory for computing the associated observables within a DFT framework. The experimentally observable X-ray scattering cross section is related to the electronic density-density response function, which is obtainable using time-dependent DFT (TDDFT), a formally exact extension of conventional DFT that describes electron dynamics and excited states. In order to develop a capability for modeling XRTS data and, more generally, to establish a predictive capability for first-principles simulations of matter in extreme conditions, real-time TDDFT with Ehrenfest dynamics has been implemented in an existing projector augmented-wave (PAW) code for DFT-MD calculations. The purpose of this report is to record implementation details and benchmarks as the project advances from software development to delivering novel scientific results. Results range from tests that establish the accuracy, efficiency, and scalability of our implementation to calculations that are verified against accepted results in the literature. Aside from the primary XRTS goal, we identify other, more general areas where this new capability will be useful, including stopping power calculations and electron-ion equilibration.
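The link between the measured scattering signal and the response function mentioned above is the fluctuation-dissipation theorem. In one common convention (prefactors vary across the literature), the electronic dynamic structure factor S, which sets the inelastic X-ray scattering cross section, follows from the imaginary part of the density-density response function chi:

```latex
S(\mathbf{k},\omega) \;=\; -\,\frac{\hbar}{\pi n}\,
\frac{\operatorname{Im}\chi(\mathbf{k},\omega)}{1 - e^{-\beta\hbar\omega}},
```

where n is the electron density and beta the inverse temperature. Computing Im chi is precisely what a real-time TDDFT propagation provides, which is why this implementation enables XRTS modeling.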
We present results from the Bayesian calibration of hydrological parameters of the Community Land Model (CLM), which is often used in climate simulations and Earth system models. A statistical inverse problem is formulated for three hydrological parameters, conditional on observations of latent heat surface fluxes over 48 months. Our calibration method uses polynomial and Gaussian process surrogates of the CLM, and solves the parameter estimation problem using a Markov chain Monte Carlo sampler. Posterior probability densities for the parameters are developed for two sites with different soil and vegetation covers. Our method also allows us to examine the structural error in CLM under two error models. We find that surrogate models can be created for CLM in most cases. The posterior distributions are more predictive than the default parameter values in CLM. Climatologically averaging the observations does not modify the parameters' distributions significantly. The structural error model reveals a correlation time-scale that can be used to identify the physical processes contributing to the error. While the calibrated CLM has higher predictive skill, the calibration is under-dispersive.
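The surrogate-plus-MCMC workflow described in this abstract can be sketched end to end on a toy problem. The code below is a hedged illustration, not the study's actual setup: a cheap degree-1 polynomial stands in for the CLM surrogate, 48 synthetic "observations" play the role of the monthly latent heat fluxes, and a random-walk Metropolis sampler draws from the posterior of a single parameter. All names, values, and distributions are invented for illustration.

```python
import math
import random

# Hypothetical stand-in for a CLM surrogate: a simple polynomial in one
# hydrological parameter theta (the study used polynomial and Gaussian
# process surrogates of latent heat fluxes).
def surrogate(theta):
    return 1.0 + 2.0 * theta

# Synthetic "observations": 48 noisy evaluations at a known true parameter.
random.seed(1)
theta_true, sigma = 1.5, 0.1
observations = [surrogate(theta_true) + random.gauss(0.0, sigma) for _ in range(48)]

def log_posterior(theta):
    # Uniform prior on [0, 3]; Gaussian likelihood around the surrogate.
    if not 0.0 <= theta <= 3.0:
        return -math.inf
    return -sum((y - surrogate(theta)) ** 2 for y in observations) / (2 * sigma ** 2)

# Random-walk Metropolis sampler.
theta, lp = 1.0, log_posterior(1.0)
samples = []
for _ in range(20_000):
    prop = theta + random.gauss(0.0, 0.02)
    lp_prop = log_posterior(prop)
    if math.log(random.random()) < lp_prop - lp:   # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)

# Discard burn-in, then summarize the posterior.
posterior_mean = sum(samples[5000:]) / len(samples[5000:])
```

Because the surrogate is cheap, the sampler can afford tens of thousands of evaluations; replacing `surrogate` with the full CLM would make this loop intractable, which is the motivation for building surrogates in the first place.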