There are several machines in this country that produce short bursts of neutrons for various applications. A few examples are the Z machine, operated by Sandia National Laboratories in Albuquerque, NM; the OMEGA Laser Facility at the University of Rochester in Rochester, NY; and the National Ignition Facility (NIF) operated by the Department of Energy at Lawrence Livermore National Laboratory in Livermore, California. They all incorporate neutron time-of-flight (nTOF) detectors that measure neutron yield, and the shapes of the waveforms from these detectors contain germane information about the plasma conditions that produce the neutrons. However, the signals can also be “clouded” by a certain fraction of neutrons that scatter off structural components and also arrive at the detectors, making analysis of the plasma conditions more difficult. These detectors operate in current mode; i.e., they have no discrimination, and all the photomultiplier anode charges are integrated rather than counted individually as in single-event counting. Until now, there has been no method for modeling an nTOF detector operating in current mode. MCNP-PoliMi was developed in 2002 to simulate neutron and gamma-ray detection in a plastic scintillator, and it produces a collision output data table describing each neutron and photon interaction occurring within the scintillator; however, the postprocessing code that accompanies MCNP-PoliMi assumes a detector operating in single-event counting mode, not current mode. Thus the idea for this work was born: could a new postprocessing code be written to simulate an nTOF detector operating in current mode? And if so, could this process be used to address such issues as the impact of neutron scattering on the primary signal? Could it even identify sources of scattering (i.e., structural materials) that could be removed or modified to produce “cleaner” neutron signals?

This process was first developed and then applied to the axial neutron time-of-flight detectors at the Z Facility mentioned above. First, MCNP-PoliMi was used to model relevant portions of the facility between the source and the detector locations; variance reduction was used to obtain useful statistics. The resulting collision output table produced by MCNP-PoliMi was then analyzed by a MATLAB postprocessing code, which converted the energy deposited by neutron and photon interactions in the plastic scintillator (i.e., the nTOF detector) into light output, in units of MeVee (MeV electron equivalent) versus time. The time response of the detector was then folded into the signal via another MATLAB code. The simulated response was compared with experimental data and shown to be in good agreement. To address the issue of neutron scattering, an “Ideal Case” was modeled: a plastic scintillator placed at the same distance from the source for each detector location, but with no structural components in the problem. This was done to produce as “pure” a neutron signal as possible. The simulated waveform from this “Ideal Case” was then compared with the simulated data from the “Full Scale” geometry (i.e., the detector at the same location, but with all the structural materials now included). The “Ideal Case” waveform was subtracted from the “Full Scale” waveform, and the difference was taken to be the contribution due to scattering.
The time response was deconvolved from the empirical data, and the contribution due to scattering was then subtracted. A transformation was then made from dN/dt to dN/dE to obtain neutron spectra at two different detector locations.
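As a hedged illustration of that final step (the report's exact normalization is not reproduced here), the nonrelativistic time-of-flight relation for a neutron of mass $m_n$ traveling a known flight distance $d$ and arriving at time $t$ supplies the Jacobian for converting a time-domain signal to an energy spectrum:

$$ E = \frac{1}{2} m_n \left(\frac{d}{t}\right)^2, \qquad \left|\frac{dE}{dt}\right| = \frac{m_n d^2}{t^3} = \frac{2E}{t}, \qquad \frac{dN}{dE} = \frac{dN}{dt}\left|\frac{dt}{dE}\right| = \frac{dN}{dt}\,\frac{t}{2E}. $$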
The article briefly summarizes the siting history of salt nuclear waste repositories as it relates to the research that has been conducted in support of this overall mission. Project Salt Vault was a solid-waste disposal demonstration in bedded salt performed by Oak Ridge National Laboratory (ORNL) in Lyons, Kan. The US Atomic Energy Commission (AEC) intended to convert the project into a pilot plant for the storage of high-level waste. Despite these intentions, nearby solution mining and questionably plugged oil and gas boreholes resulted in the abandonment of the Lyons site. With help from the USGS, in 1972 ORNL began looking in the Permian Basin for a different disposal site in Texas or New Mexico. The WIPP project was discontinued in 1974 in favor of concentrating efforts on a Retrievable Surface Storage Facility. After the demise of that project in 1975, work resumed on WIPP and its scope was temporarily expanded to include defense HLW. A location a few miles northeast of the current WIPP site was chosen for further study.
This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative, so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative, i.e., that it only minimally over-estimate the desired percentile range of the actual PDF. The presence of these two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical tolerance intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
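As a minimal sketch of the tolerance-interval idea referenced above (assuming approximately normal data and Howe's approximation for the two-sided factor; illustrative only, not the report's specific implementation):

```python
# Two-sided normal tolerance interval from a sparse sample (Howe's approximation).
# Illustrative only: assumes the data are approximately normal.
import numpy as np
from scipy import stats

def tolerance_interval(x, coverage=0.95, confidence=0.95):
    """Interval intended to contain `coverage` of the population with `confidence`."""
    x = np.asarray(x, dtype=float)
    n = x.size
    nu = n - 1
    z = stats.norm.ppf(0.5 * (1.0 + coverage))      # normal quantile for the coverage level
    chi2 = stats.chi2.ppf(1.0 - confidence, nu)     # lower chi-square quantile
    k = z * np.sqrt(nu * (1.0 + 1.0 / n) / chi2)    # Howe's two-sided tolerance factor
    m, s = x.mean(), x.std(ddof=1)
    return m - k * s, m + k * s

# Example with a sparse (n = 5) sample; the interval is deliberately wide (conservative).
print(tolerance_interval([9.8, 10.1, 10.4, 9.9, 10.2]))
```

With only five samples the factor k is roughly 5, so the interval is much wider than the sample scatter, which reflects the conservatism-versus-tightness trade-off discussed above.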
The storage caverns of the US Strategic Petroleum Reserve (SPR) exhibit creep behavior, resulting in a reduction of storage capacity over time. Maintaining oil storage capacity requires periodic controlled leaching, termed remedial leach. The 30 MMB sale in summer 2011 provided the space needed to facilitate leaching operations. The objective of this report is to present the results and analyses of remedial leach activity at the SPR following the 2011 sale until mid-January 2013. This report focuses on caverns BH101, BH104, WH105, and WH106. Three of the four hanging strings were damaged, resulting in deviations from normal leach patterns; however, the deviations did not affect the immediate geomechanical stability of the caverns. Significant leaching occurred in the toes of the caverns, likely decreasing the number of drawdowns available before P/D ratio criteria are met. SANSMIC showed good agreement with sonar data and reasonably predicted the location and size of the enhanced leaching region resulting from string breakage.
Laser-driven proton radiography provides electromagnetic field mapping with high spatiotemporal resolution, and has been applied to many laser-driven High Energy Density Physics (HEDP) experiments. Our report addresses key questions about the feasibility of ion radiography at the Z-Accelerator (“Z”), concerning laser configuration, hardware, and radiation background. Charged particle tracking revealed that radiography at Z requires GeV-scale protons, which is out of reach for existing and near-future laser systems. However, it might be possible to perform proton deflectometry to detect magnetic flux compression in the fringe-field region of a magnetized liner inertial fusion experiment. Experiments with the Z-Petawatt laser to enhance proton yield and energy showed an unexpected scaling with target thickness. Full-scale, 3D radiation-hydrodynamics simulations, coupled to fully explicit and kinetic 2D particle-in-cell simulations running for over 10 ps, explain the scaling by a complex interplay of laser prepulse, preplasma, and the ps-scale temporal rising edge of the laser.
This report considers and prioritizes the primary potential technical cost-reduction pathways for offshore wave activated body attenuators designed for ocean resources. This report focuses on technical research and development cost-reduction pathways related to the device technology rather than environmental monitoring or permitting opportunities. Three sources of information were used to understand current cost drivers and develop a prioritized list of potential cost-reduction pathways: a literature review of technical work related to attenuators, a reference device compiled from literature sources, and a webinar with each of three industry device developers. Data from these information sources were aggregated and prioritized with respect to the potential impact on the lifetime levelized cost of energy, the potential for progress, the potential for success, and the confidence in success. Results indicate the five most promising cost-reduction pathways include advanced controls, an optimized structural design, improved power conversion, planned maintenance scheduling, and an optimized device profile.
To benchmark the current U.S. wind turbine fleet reliability performance and identify the major contributors to component-level failures and other downtime events, the Department of Energy funded the development of the Continuous Reliability Enhancement for Wind (CREW) database by Sandia National Laboratories. This report is the third annual Wind Plant Reliability Benchmark, which publicly reports CREW findings for the wind industry. The CREW database uses both high-resolution Supervisory Control and Data Acquisition (SCADA) data from operating plants and Strategic Power Systems ORAPWind (Operational Reliability Analysis Program for Wind) data, which consist of downtime and reserve event records and daily summaries of various time categories for each turbine. Together, these data are used as inputs into CREW's reliability modeling. The results presented here include: the primary CREW Benchmark statistics (operational availability, utilization, capacity factor, mean time between events, and mean downtime); time accounting from an availability perspective; time accounting in terms of the combination of wind speed and generation levels; power curve analysis; and top system and component contributors to unavailability.
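As a hedged sketch of how benchmark statistics of this kind are commonly computed from downtime event records (textbook-style definitions that may differ in detail from the CREW methodology; all numbers below are hypothetical):

```python
# Illustrative wind-turbine benchmark metrics from hypothetical event records.
# Definitions are generic and may differ from the CREW database's exact conventions.
period_hours = 24 * 365                         # reporting period (one year, assumed)
rated_kw = 1500.0                               # hypothetical turbine rating
energy_kwh = 3.9e6                              # hypothetical energy produced in the period
downtime_events_hours = [12.0, 5.5, 30.0, 2.5]  # hypothetical downtime event durations

downtime = sum(downtime_events_hours)
uptime = period_hours - downtime

operational_availability = uptime / period_hours
capacity_factor = energy_kwh / (rated_kw * period_hours)
mean_time_between_events = uptime / len(downtime_events_hours)
mean_downtime = downtime / len(downtime_events_hours)

print(f"availability = {operational_availability:.3f}, capacity factor = {capacity_factor:.3f}")
print(f"MTBE = {mean_time_between_events:.1f} h, mean downtime = {mean_downtime:.1f} h")
```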
This report develops and documents the nonlinear kinematic relations needed to implement piezoelectric constitutive models in ALEGRA-EMMA [5], where calculations involving large displacements and rotations are routine. Kinematic relationships are established using Gauss's law and Faraday's law; this presentation on kinematics goes beyond piezoelectric materials and is applicable to all dielectric materials. The report then turns to practical details of implementing piezoelectric models in an application code where material principal axes are rarely aligned with user-defined problem coordinate axes. This portion of the report is somewhat pedagogical but is necessary to establish documentation for the piezoelectric implementation in ALEGRA-EMMA. This involves transforming elastic, piezoelectric, and permittivity moduli from material principal axes to problem coordinate axes. The report concludes with an overview of the piezoelectric implementation in ALEGRA-EMMA and small verification examples.
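As a hedged illustration of that transformation step (index notation only, with generic symbols that are not necessarily the report's notation; the Voigt/matrix bookkeeping is omitted), for a rotation $R_{ia}$ from material principal axes to problem coordinate axes the moduli transform as

$$ \varepsilon'_{ij} = R_{ia} R_{jb}\,\varepsilon_{ab}, \qquad e'_{ijk} = R_{ia} R_{jb} R_{kc}\, e_{abc}, \qquad C'_{ijkl} = R_{ia} R_{jb} R_{kc} R_{ld}\, C_{abcd}, $$

with summation over repeated indices, where $\varepsilon$ denotes the permittivity, $e$ the piezoelectric coupling, and $C$ the elastic stiffness.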
Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at different conditions and configurations than those for an intended application, with each experiment testing only part of the physics relevant for the application. The purpose of the present work is to develop methodology to roll-up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.
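A minimal sketch of the roll-up idea (hypothetical data and a simple least-squares meta-model, standing in for whatever form the report actually uses): simulation runs over sampled model inputs relate experiment-level quantities to the application quantity, and the fitted relation is then evaluated at the measured experiment values.

```python
# Illustrative roll-up of hierarchy validation measurements to an application QoI.
# Hypothetical data and a simple linear meta-model; for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Simulated campaign: each row is one sampled model-input realization; columns are
# the simulation-predicted responses of three hierarchy experiments (hypothetical).
theta = rng.normal(size=(200, 3))
sim_experiments = theta @ np.array([[1.0, 0.2, 0.0],
                                    [0.0, 1.0, 0.3],
                                    [0.1, 0.0, 1.0]])
# Simulation-predicted application response for the same input samples (hypothetical).
sim_application = sim_experiments @ np.array([0.5, 0.3, 0.2])

# Meta-model: linear map (with intercept) from experiment responses to application response.
X = np.column_stack([np.ones(len(sim_experiments)), sim_experiments])
coef, *_ = np.linalg.lstsq(X, sim_application, rcond=None)

# Roll-up: evaluate the meta-model at the measured experiment values (hypothetical).
measured = np.array([0.8, -0.1, 0.4])
predicted_application = coef[0] + measured @ coef[1:]
print(predicted_application)
```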
This is the final SAND report for the Early-Career LDRD (# 158477) “Sublinear Algorithms for Massive Data Sets”. We provide a list of the various publications and achievements in the project.
This report summarizes computational efforts to model interfacial fracture using cohesive zone models in the SIERRA/SolidMechanics (SIERRA/SM) finite element code. Cohesive surface elements were used to model crack initiation and propagation along predefined paths. Mesh convergence was observed with SIERRA/SM for numerous geometries. As the funding for this project came from the Advanced Simulation and Computing Verification and Validation (ASC V&V) focus area, considerable effort was spent performing verification and validation. Code verification was performed to compare code predictions to analytical solutions for simple three-element simulations as well as a higher-fidelity simulation of a double-cantilever beam. Parameter identification was conducted with Dakota using results from asymmetric double-cantilever beam (ADCB) and end-notched-flexure (ENF) experiments conducted under Campaign-6 funding. Discretization convergence studies were also performed with respect to mesh size and time step, and an optimization study was completed for mode II delamination using the ENF geometry. Throughout this verification process, numerous SIERRA/SM bugs were found and reported, all of which have been fixed, leading to over a 10-fold increase in convergence rates. Finally, mixed-mode flexure experiments were performed for validation. One of the unexplained issues encountered was material property variability for ostensibly the same composite material. Since the variability is not fully understood, it is difficult to accurately assess uncertainty when performing predictions.
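As a hedged illustration of the kind of traction-separation relation used in cohesive zone models (a common bilinear form, not necessarily the specific law implemented in SIERRA/SM), the cohesive traction $T$ as a function of separation $\delta$ can be written

$$ T(\delta) = \begin{cases} T_{\max}\,\dfrac{\delta}{\delta_0}, & 0 \le \delta \le \delta_0, \\[4pt] T_{\max}\,\dfrac{\delta_f - \delta}{\delta_f - \delta_0}, & \delta_0 < \delta \le \delta_f, \\[4pt] 0, & \delta > \delta_f, \end{cases} \qquad G_c = \tfrac{1}{2}\,T_{\max}\,\delta_f, $$

where $\delta_0$ is the separation at peak traction $T_{\max}$, $\delta_f$ is the separation at complete failure, and the fracture energy $G_c$ is the area under the curve.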
Accurate knowledge of the thermophysical properties of concrete is essential for developing meaningful models of scenarios in which the concrete is rapidly heated. Tests of solid propellant burns on concrete samples from Launch Complex 17 at Cape Canaveral show spallation and fragmentation. In response to the need to accurately model these observations, an experimental program was developed to determine the permeability and thermal properties of the concrete. Room-temperature gas permeability measurements of Launch Complex 17 concrete dried at 50°C yield a mean permeability estimate of 0.07 mD, and the thermal properties (thermal conductivity, diffusivity, and specific heat) were found to vary with temperature from room temperature to 300°C. Thermal conductivity ranges from 1.7-1.9 W/m·K at 50°C to 1.0-1.15 W/m·K at 300°C, thermal diffusivity ranges from 0.75-0.96 mm²/s at 50°C to 0.44-0.58 mm²/s at 300°C, and specific heat ranges from 1.76-2.32 MJ/m³·K at 50°C to 2.00-2.50 MJ/m³·K at 300°C.
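The three reported thermal properties are related through the standard definition of thermal diffusivity, which provides a consistency check on the reported ranges (the rounded mid-range values below are illustrative):

$$ \alpha = \frac{k}{\rho c_p} \approx \frac{1.8\ \mathrm{W/m\,K}}{2.0\ \mathrm{MJ/m^3\,K}} = 0.9\ \mathrm{mm^2/s}\ \text{at}\ 50^{\circ}\mathrm{C}, $$

which falls within the measured 0.75-0.96 mm²/s diffusivity range, where $\rho c_p$ is the volumetric specific heat.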
The Gamma Detector Response and Analysis Software-Detector Response Function (GADRAS-DRF) application computes the response of gamma-ray detectors to incoming radiation. This manual provides step-by-step procedures to acquaint new users with the use of the application. The capabilities include characterization of detector response parameters, plotting and viewing measured and computed spectra, and analyzing spectra to identify isotopes or to estimate flux profiles. GADRAS-DRF can compute and provide detector responses quickly and accurately, giving researchers and other users the ability to obtain usable results in a timely manner (a matter of seconds or minutes).