Publications

Results 3476–3500 of 9,998


Results and correlations from analyses of the ENSA ENUN 32P cask transport tests

American Society of Mechanical Engineers, Pressure Vessels and Piping Division (Publication) PVP

Kalinina, Elena A.; Gordon, Natalie; Ammerman, Douglas; Uncapher, William L.; Saltzstein, Sylvia J.; Wright, Catherine

An ENUN 32P cask supplied by Equipos Nucleares S.A. (ENSA) was transported 9,600 miles by road, sea, and rail in 2017 in order to collect shock and vibration data on the cask system and on surrogate spent fuel assemblies within the cask. The task of examining the 101,857 data files collected – 6.002 terabytes of binary and ASCII data – has begun, and some results of preliminary analyses are presented in this paper. A total of seventy-seven accelerometers and strain gauges were attached by Sandia National Laboratories (SNL) to three surrogate spent fuel assemblies, the cask basket, the cask body, the transport cradle, and the transport platforms. The assemblies were provided by SNL, Empresa Nacional de Residuos Radiactivos, S.A. (ENRESA), and a collaboration of Korean institutions. The cask system was first subjected to cask handling operations at the ENSA facility. The cask was then transported by heavy-haul truck in northern Spain and shipped from Spain to Belgium, and subsequently to Baltimore, on two roll-on/roll-off ships. From Baltimore, the cask was transported by rail on a 12-axle railcar to the Association of American Railroads' Transportation Technology Center, Inc. (TTCI) near Pueblo, Colorado, where a series of special rail tests were performed. Data were collected continuously during this entire sequence of multi-modal transportation events (no data were collected during transfers between modes). Of particular interest – indeed the original motivation for these tests – are the strains measured on the zirconium-alloy tubes in the assemblies. The strains for each transport mode are compared to the yield strength of irradiated Zircaloy to illustrate the margin against rod failure during normal conditions of transport.
The accelerometer data provide essential comparisons of the accelerations on the different components of the cask system, exhibiting both amplification and attenuation of the accelerations from the transport platforms through the cradle and cask to the interior of the cask. These data are essential for modeling cask systems. This paper concentrates on analyses of the testing of the cask on a 12-axle railcar at TTCI.
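The margin comparison described above amounts to a ratio of the material's yield point to the peak measured strain for each transport mode. A minimal sketch of that bookkeeping follows; the yield strain and per-mode peak strains below are placeholders, not values from the test campaign:

```python
# Illustrative margin-against-rod-failure calculation (the comparison
# described in the abstract). All numbers here are hypothetical
# placeholders, NOT measurements from the ENSA/ENUN tests.
YIELD_STRAIN_UE = 5000.0  # hypothetical irradiated-Zircaloy yield strain, microstrain

peak_strain_ue = {  # hypothetical peak rod strains by transport mode, microstrain
    "heavy-haul truck": 60.0,
    "ship": 25.0,
    "rail (12-axle railcar)": 80.0,
}

# Margin = yield strain / peak measured strain; values well above 1
# indicate strains far below the failure threshold.
margins = {mode: YIELD_STRAIN_UE / s for mode, s in peak_strain_ue.items()}
```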


Solution Approaches to Stochastic Programming Problems under Endogenous and/or Exogenous Uncertainties

Computer Aided Chemical Engineering

Cremaschi, Selen; Siirola, John D.

Optimization problems under uncertainty involve making decisions without full knowledge of the impact those decisions will have, and before all the facts relevant to them are known. These problems are common, for example, in process synthesis and design, planning and scheduling, supply chain management, and the generation and distribution of electric power. The sources of uncertainty in optimization problems fall into two broad categories: endogenous and exogenous. Exogenous uncertain parameters are realized at a known stage (e.g., time period or decision point) in the problem irrespective of the values of the decision variables. For example, demand is generally considered independent of any capacity expansion decisions in the process industries and is hence regarded as an exogenous uncertain parameter. In contrast, endogenous uncertain parameters are impacted by decisions, either in their resolution or in their distribution. The realized values of a Type-I endogenous uncertain parameter are affected by the decisions. An example of this type of uncertainty is the facility protection problem, where the likelihood of a facility failing to deliver goods or services after a disruptive event depends on the level of resources allocated as protection to that facility. On the other hand, only the realization times of Type-II endogenous uncertain parameters are affected by decisions. For example, in a clinical trial planning problem, whether a clinical trial is successful is realized only after the trial has been completed, but the outcome is not affected by when the trial is started. There are numerous approaches to modelling and solving optimization problems with exogenous and/or endogenous uncertainty, including (adjustable) robust optimization, (approximate) dynamic programming, model predictive control, and stochastic programming.
Stochastic programming is a particularly attractive approach, as there is a straightforward translation from the deterministic model to its stochastic equivalent. The challenge with stochastic programming arises from the rapid, sometimes exponential, growth in program size as the uncertainty space is sampled or the number of recourse stages is increased. In this talk, we will give an overview of our research activities developing practical stochastic programming approaches to problems with exogenous and/or endogenous uncertainty. We will highlight several examples from power systems planning and operations; process modelling, synthesis, and design optimization; artificial lift infrastructure planning for shale gas production; and clinical trial planning. We will begin by discussing the straightforward case of exogenous uncertainty. In this situation, the stochastic program can be expressed completely by a deterministic model, a scenario tree, and the scenario-specific parameterizations of the deterministic model. Beginning with the deterministic model, modelers create an instance of it for each scenario using the scenario-specific data. The scenario models are then coupled through the addition of nonanticipativity constraints, which equate the stage decision variables across all scenarios that pass through the same stage node in the scenario tree. Modelling tools like PySP (Watson, 2012) greatly simplify the process of composing large stochastic programs, beginning either with an abstract representation of the deterministic model written in Pyomo (Hart et al., 2017) and scenario data, or with a function that returns the deterministic Pyomo model for a specific scenario. PySP can automatically create the extensive form (deterministic equivalent) model from a general representation of the scenario tree. The challenge with large-scale stochastic programs with exogenous uncertainty lies in managing the growth of the problem size.
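The nonanticipativity coupling described above can be sketched in a few lines: scenarios are the leaves of the scenario tree, and the stage-t decision variables must agree across every group of scenarios that share the same history through stage t. Below is a minimal, framework-free Python sketch (the uniform two-outcome tree is illustrative, not an example from the talk):

```python
from itertools import product

def scenario_tree(outcomes, n_stages):
    """All scenarios of a uniform tree: one outcome drawn per stage."""
    return list(product(outcomes, repeat=n_stages))

def nonanticipativity_groups(scenarios, stage):
    """Group scenarios that share the same history through `stage`.
    Stage-`stage` decision variables must be equated within each group,
    since those scenarios pass through the same node of the tree."""
    groups = {}
    for s in scenarios:
        groups.setdefault(s[:stage], []).append(s)
    return list(groups.values())

scens = scenario_tree(("lo", "hi"), 3)   # 2^3 = 8 scenarios
g0 = nonanticipativity_groups(scens, 0)  # first stage: all 8 share one node
g1 = nonanticipativity_groups(scens, 1)  # second stage: 2 nodes, 4 scenarios each
```

This also makes the growth problem concrete: with b outcomes per stage and T stages there are b^T scenarios, each carrying its own copy of the deterministic model.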
Fortunately, there are several well-known approaches to decomposing the problem, both stage-wise (e.g., Benders' decomposition) and scenario-based (e.g., Lagrangian relaxation or Progressive Hedging), enabling the direct solution of stochastic programs with hundreds or thousands of scenarios. We will then discuss developments in modelling and solving stochastic programs with endogenous uncertainty. These problems are significantly more challenging both to pose and to solve, owing to the exponential growth, relative to the number of stages in the problem, in the number of scenarios required to cover the decision-dependent uncertainties. In this situation, standardized frameworks for expressing stochastic programs do not exist, requiring the modeler to explicitly generate the scenario representations and nonanticipativity constraints. Further, the size of the resulting scenario space (frequently exceeding millions of scenarios) precludes the direct solution of the resulting program. For this case, numerous decomposition algorithms and heuristics have been developed (e.g., Lagrangean decomposition-based algorithms (Tarhan et al., 2013) or knapsack-based decomposition algorithms (Christian and Cremaschi, 2015)).
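As an illustration of the scenario-based decomposition idea, here is a toy Progressive Hedging loop on a deliberately simple problem: each scenario subproblem minimizes (x - d_s)^2, so the nonanticipative solution is the probability-weighted mean of the d_s. The data and penalty value are illustrative, and real implementations solve each augmented subproblem with an optimization solver rather than in closed form:

```python
def progressive_hedging(d, p, rho=1.0, iters=100):
    """Toy Progressive Hedging. Scenario s's subproblem is
    min_x (x - d[s])^2; nonanticipativity requires one shared x.
    These quadratics have closed-form argmins, so no solver is needed."""
    # Iteration 0: solve the subproblems independently (x_s = d_s),
    # then initialize the weights from the initial disagreement.
    x = list(d)
    xbar = sum(ps * xs for ps, xs in zip(p, x))
    w = [rho * (xs - xbar) for xs in x]
    for _ in range(iters):
        # Augmented subproblem: min (x - d_s)^2 + w_s*x + (rho/2)(x - xbar)^2
        # => first-order condition gives x = (2*d_s - w_s + rho*xbar)/(2 + rho)
        x = [(2 * ds - ws + rho * xbar) / (2 + rho)
             for ds, ws in zip(d, w)]
        xbar = sum(ps * xs for ps, xs in zip(p, x))          # consensus value
        w = [ws + rho * (xs - xbar) for ws, xs in zip(w, x)]  # weight update
    return xbar, x

# Uniform scenarios with outcomes 1, 2, 6: the hedged solution is the mean, 3.
xbar, x = progressive_hedging(d=[1.0, 2.0, 6.0], p=[1 / 3] * 3)
```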


Time and Frequency Domain Methods for Basis Selection in Random Linear Dynamical Systems

International Journal for Uncertainty Quantification

Jakeman, John D.; Pulch, Roland

Polynomial chaos methods have been used extensively to analyze systems in uncertainty quantification. Several approaches exist to determine a low-dimensional (or sparse) approximation of a quantity of interest in a model, in which only a few orthogonal basis polynomials are required. In this work, we consider linear dynamical systems consisting of ordinary differential equations with random variables. The aim of this paper is to further explore methods for producing low-dimensional approximations of the quantity of interest. We investigate two numerical techniques for computing a low-dimensional representation, both of which fit the approximation to a set of samples in the time domain. In the first, a frequency domain analysis of a stochastic Galerkin system guides the selection of the basis polynomials, leading to a linear least-squares problem. In the second, a sparse minimization selects the basis polynomials using information from the time domain only; an orthogonal matching pursuit produces an approximate solution of the minimization problem. Finally, we compare the two approaches on a test example from a mechanical application.
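As a rough illustration of the greedy step in orthogonal matching pursuit, the sketch below runs OMP on a small orthonormal dictionary, where the least-squares update reduces to inner products. This is a simplification: in the paper's setting the selection operates on polynomial chaos bases, and a general (non-orthonormal) dictionary requires a least-squares solve over all selected atoms at each iteration. The dictionary and signal here are illustrative:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def omp(dictionary, y, n_atoms):
    """Orthogonal matching pursuit for an ORTHONORMAL dictionary
    (list of atoms). Each pass picks the atom most correlated with
    the residual; orthonormality makes the projection coefficient
    exactly the inner product, so no least-squares solve is needed."""
    residual = list(y)
    coeffs = {}
    for _ in range(n_atoms):
        k = max(range(len(dictionary)),
                key=lambda j: abs(dot(residual, dictionary[j])))
        c = dot(residual, dictionary[k])
        coeffs[k] = coeffs.get(k, 0.0) + c
        residual = [r - c * a for r, a in zip(residual, dictionary[k])]
    return coeffs, residual

# Orthonormal atoms (rows of a scaled 4x4 Hadamard matrix).
H = [[0.5, 0.5, 0.5, 0.5],
     [0.5, -0.5, 0.5, -0.5],
     [0.5, 0.5, -0.5, -0.5],
     [0.5, -0.5, -0.5, 0.5]]
# Sparse signal: 3 * atom 0 + 1 * atom 2.
y = [3 * a + 1 * b for a, b in zip(H[0], H[2])]
coeffs, residual = omp(H, y, 2)  # recovers both atoms exactly
```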


SPARC: Demonstrate burst-buffer-based checkpoint/restart on ATS-1

Oldfield, Ron; Ulmer, Craig; Widener, Patrick; Ward, Harry L.

Recent high-performance computing (HPC) platforms such as the Trinity Advanced Technology System (ATS-1) feature burst buffer resources that can have a dramatic impact on an application's I/O performance. While these non-volatile memory (NVM) resources provide a new tier in the storage hierarchy, developers must find the right way to incorporate the technology into their applications in order to reap the benefits. Like other laboratories, Sandia is actively investigating ways in which these resources can be incorporated into our existing libraries and workflows without burdening our application developers with excessive, platform-specific details. This FY18Q1 milestone summarizes our progress in adapting the Sandia Parallel Aerodynamics and Reentry Code (SPARC) in Sandia's ATDM program to leverage Trinity's burst buffers for checkpoint/restart operations. We investigated four approaches with varying tradeoffs: (1) updating the job script to use stage-in/stage-out burst buffer directives, (2) modifying SPARC to use LANL's hierarchical I/O (HIO) library to store and retrieve checkpoints, (3) updating Sandia's IOSS library to incorporate the burst buffer in all mesh I/O operations, and (4) modifying SPARC to use our Kelpie distributed memory library to store and retrieve checkpoints. Team members were successful in generating initial implementations of all four approaches, but were unable to obtain performance numbers in time for this report (the initial problem sizes were not large enough to stress I/O, and the SPARC refactor will require changes to our code). When we presented our work to the SPARC team, they expressed the most interest in the second and third approaches. The HIO work was favored because it is lightweight, unobtrusive, and should be portable to ATS-2. The IOSS work is seen as a long-term solution and is favored because all I/O work (including checkpoints) can be deferred to a single library.
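For reference, approach (1) on a Cray system such as Trinity is typically expressed with DataWarp #DW directives in the batch script. The sketch below follows the general DataWarp directive syntax, but the node count, capacity, paths, and the SPARC --checkpoint-dir option are all illustrative assumptions, not details taken from the milestone:

```shell
#!/bin/bash
#SBATCH -N 64
#SBATCH -t 04:00:00
# Allocate a scratch burst-buffer space for the life of the job
# (capacity and access mode are illustrative).
#DW jobdw type=scratch access_mode=striped capacity=10TiB
# Stage an existing restart directory into the burst buffer before
# the job starts, and stage results back to Lustre when it ends
# (paths are hypothetical).
#DW stage_in  type=directory source=/lustre/user/sparc/restart destination=$DW_JOB_STRIPED/restart
#DW stage_out type=directory source=$DW_JOB_STRIPED/restart destination=/lustre/user/sparc/restart

# Point the application's checkpoint directory at the burst buffer;
# the --checkpoint-dir flag is a hypothetical stand-in for however
# SPARC selects its checkpoint path.
srun -n 2048 ./sparc --checkpoint-dir "$DW_JOB_STRIPED/restart"
```

The appeal of this approach is that the application itself is unchanged: only the job script knows about the burst buffer.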
