Publications

National Hurricane Program Metrics Framework

Hernandez, Patricia M.; Endo, Ashley; Burks, Lynne S.; Heimer, Brandon W.; John, Charles J.; Miller, Trisha H.; Teclemariam, Nerayo P.

The need for metrics for planning and response measures was identified as a key gap to be addressed in the National Hurricane Program's (NHP) Technology Modernization effort. This document proposes a framework for defining a set of planning and response metrics to be implemented in the NHP products of hurricane evacuation studies (HES) and post-storm assessments (PSA). To determine the feasibility of this framework, a survey of current HESs and PSAs was carried out and then used to determine whether the proposed metrics are currently captured. While data availability and detail vary widely, implementing these metrics is not only feasible but presents an opportunity to improve on current practices. Final implementation of this framework will require ongoing feedback from local, state, tribal, and federal stakeholders.

National Hurricane Program Hurricane Evacuation Study Tool End-User Engagement and Usability Analysis

Hernandez, Patricia M.; Burks, Lynne S.; John, Charles J.; Miller, Trisha H.; Teclemariam, Nerayo P.

The Hurricane Evacuation Study (HES) Tool prototype is a key component of the Federal Emergency Management Agency (FEMA) National Hurricane Program (NHP) Technology Modernization (TM) effort. To ensure the HES Tool captured the necessary capabilities and functionality, engagement with potential end-users and key stakeholders was a priority throughout development. Pilot studies with representatives from North Carolina and New York City were conducted to validate the HES Tool process against their current HES efforts. These pilot studies led to the development of additional capabilities and provided feedback on the needs of diverse regions. A usability study was then carried out through individualized sessions with key stakeholders identified by NHP leadership. The results showed the value of the HES Tool compared to the current process, as well as key issues that must be addressed to ensure a successful final transition.

Deep Borehole Disposal Concept: Development of Universal Canister Concept of Operations

Rigali, Mark J.; Price, Laura L.

This report documents key elements of the conceptual design for deep borehole disposal of radioactive waste to support the development of a universal canister concept of operations. A universal canister is a canister designed to store, transport, and dispose of radioactive waste without having to be reopened to treat or repackage the waste. This report focuses on the conceptual design for disposal of radioactive waste contained in a universal canister in a deep borehole. The general deep borehole disposal concept consists of drilling a borehole into crystalline basement rock to a depth of about 5 km, emplacing waste packages (WPs) in the lower 2 km of the borehole, and sealing and plugging the upper 3 km. Research and development programs for deep borehole disposal have been ongoing for several years in the United States and the United Kingdom; these studies have shown that deep borehole disposal of radioactive waste could be safe, cost-effective, and technically feasible. The design concepts described in this report are workable solutions based on expert judgment and are intended to guide follow-on design activities. Both preclosure and postclosure safety were considered in the development of the reference design concept. The requirements and assumptions that form the basis for the deep borehole disposal concept include WP performance requirements, radiological protection requirements, surface handling and transport requirements, and emplacement requirements. The key features of the reference disposal concept include borehole drilling and construction concepts, WP designs, and waste handling and emplacement concepts. These features are supported by engineering analyses.

Hurricane Evacuation Study (HES) Process Requirements

Burks, Lynne S.; Hernandez, Patricia M.; Miller, Trisha H.

The Federal Emergency Management Agency's (FEMA) National Hurricane Program (NHP) helps protect communities and residents from hurricane hazards by providing evacuation preparedness technical assistance to state, local, and tribal governments. The Department of Homeland Security (DHS) Science and Technology Directorate (S&T) and FEMA are co-sponsoring an effort to modernize technology components of the NHP, including the Hurricane Evacuation Study (HES), which addresses planning and impact assessments for coastal regions. The current HES process is manual, financially costly, and can take several years to complete. To streamline this process, the NHP Technology Modernization Program is developing an automated HES Tool that will reduce the cost and time requirements of the HES process by up to 70%. This document outlines the requirements of the current HES process and explains how the HES Tool can be used to fulfill those requirements. It also contains a detailed list of the modeling and simulation capabilities available within the HES Tool.

Overview of Neutron Diagnostic Measurements for MagLIF Experiments on the Z Accelerator

Hahn, Kelly; Chandler, Gordon A.; Ruiz, Carlos L.; Cooper, Gary; Gomez, Matthew R.; Slutz, Stephen A.; Sefkow, Adam B.; Sinars, Daniel; Hansen, Stephanie B.; Knapp, P.F.; Schmit, Paul; Harding, Eric H.; Jennings, Christopher A.; Awe, Thomas J.; Geissel, Matthias; Rovang, Dean C.; Torres, Jose; Bur, James A.; Cuneo, Michael E.; Glebov, V.Y.; Harvey-Thompson, Adam J.; Hess, Mark H.; Johns, Owen; Jones, Brent M.; Lamppa, Derek C.; Lash, Joel S.; Martin, Matthew R.; Mcbride, Ryan; Peterson, K.J.; Porter, John L.; Reneker, Joseph; Robertson, G.K.; Rochau, G.A.; Savage, Mark E.; Smith, Ian C.; Styron, Jedediah D.; Vesey, Roger A.

Abstract not provided.

Inferring NetFlow Data

Roth, Harrison L.

NetFlow is a specific format for recording network traffic, and many security analyses are easy to run against it. Cloud service providers, such as Amazon Web Services and Box.com, produce logs that do not contain all of the content that NetFlow data contains. However, the fields that appear in NetFlow data but are absent from cloud service provider logs may be inferable from the fields that are present. By inferring the missing fields, the same analyses that can be run on NetFlow data could be run on the logs produced by cloud service providers. There are multiple types of cloud services, and each provider handles them differently and produces different logs: IaaS (infrastructure as a service), SaaS (software as a service), and PaaS (platform as a service) each provide a different use for the user.
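
As a hedged illustration of the inference idea, the sketch below maps a sparse cloud-log record onto NetFlow-style fields, filling in missing ones with simple heuristics. The field names, the port-based protocol guess, and the average-packet-size heuristic are illustrative assumptions, not any provider's actual schema or the paper's method.

```python
# A minimal sketch (field names are illustrative assumptions, not any
# provider's actual schema) of inferring NetFlow-style fields from a
# sparser cloud-service log record.
def to_netflow_like(cloud_record):
    """Map a cloud log record to a NetFlow-style dict, inferring fields
    the provider does not emit."""
    flow = {
        "srcaddr": cloud_record["src_ip"],
        "dstaddr": cloud_record["dst_ip"],
        "srcport": cloud_record.get("src_port"),
        "dstport": cloud_record.get("dst_port"),
    }
    # Inference: if the log omits the IP protocol, guess from the port.
    proto = cloud_record.get("protocol")
    if proto is None:
        proto = 17 if flow["dstport"] in (53, 123) else 6  # UDP for DNS/NTP, else TCP
    flow["prot"] = proto
    # Inference: approximate packet count from bytes when absent,
    # assuming an average packet size (a rough, tunable heuristic).
    if "packets" in cloud_record:
        flow["dPkts"] = cloud_record["packets"]
    else:
        flow["dPkts"] = max(1, cloud_record.get("bytes", 0) // 500)
    flow["dOctets"] = cloud_record.get("bytes", 0)
    return flow
```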

Sandia Review of High Bridge Associates Report: Comparison of Plutonium Disposition Alternatives: WIPP Diluted Plutonium Storage and MOX Fuel Irradiation

Shoemaker, Paul E.; Hardin, Ernest; Park, Heeho D.; Brady, Patrick V.; Rechard, Robert P.

The subject report from High Bridge Associates (HBA) was issued on March 2, 2016, in reaction to a U.S. Department of Energy (DOE) program decision to pursue down-blending of surplus Pu and geologic disposal at the Waste Isolation Pilot Plant (WIPP). Sandia National Laboratories was requested by the DOE to review the technical arguments presented in the HBA report. Specifically, this review is organized around three technical topics: criticality safety, radiological release limits, and thermal impacts. Questions raised by the report pertaining to legal and regulatory requirements, safeguards and security, international agreements, and costing of alternatives are beyond the scope of this review.

Instrumentation and Data Analysis Supporting the Transient Rod Pneumatic System Design Study at the Annular Core Research Reactor

Arnold, James F.

The Sandia National Laboratories Annular Core Research Reactor (ACRR) has been experiencing intermittent failures of its pneumatically operated transient rod system. The most frequent failures have been to the dashpot rod where it connects to the upper aluminum connecting rod. Of the three systems, Transient Rod (TR) A fails most frequently. To conduct pulse operations, the transient rods are pneumatically driven to quickly remove "poison" from the reactor core. During these pulse operations, increased forces and vibrations are placed on the TR system. In an effort to fix these problems, Sandia's TA-V Nuclear Reactor and Systems Engineering Department (Engineering Department) has started an extensive Transient Rod Pneumatic System Design Study. The goal of this study is to determine the causes of the elevated failure rates in TR A, which requires an increased understanding of the dynamics of the TR systems. This report covers the approach taken and the reasoning behind the choices of instrumentation, calibration, data collection, and data analysis used to diagnose these failures. To determine the types and specifications of the sensors to be installed, estimates of TR system performance were needed. Using known and assumed parameters, mechanical and thermodynamic models were developed, and calculations were performed to estimate the ranges of displacement, speed, and acceleration experienced by the transient rod system. After some preliminary data collection, the thermodynamic models were refined for better accuracy. With the full suite of sensors installed, data can be collected and analyzed to determine the cause of the increased failures.
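
As a sketch of one analysis step the abstract implies (our illustration, with assumed data formats and purely illustrative numbers, not the study's code), accelerometer samples can be numerically integrated to estimate rod speed and displacement during a pulse:

```python
# A minimal sketch (assumed data format) of integrating accelerometer
# samples to estimate transient rod velocity and displacement.
import numpy as np

def integrate_motion(accel_ms2, dt):
    """Trapezoidal integration of acceleration (m/s^2) sampled at dt (s)."""
    accel = np.asarray(accel_ms2)
    vel = np.concatenate([[0.0], np.cumsum(0.5 * (accel[1:] + accel[:-1]) * dt)])
    disp = np.concatenate([[0.0], np.cumsum(0.5 * (vel[1:] + vel[:-1]) * dt)])
    return vel, disp

# Example: a synthetic 50 ms pneumatic drive pulse sampled at 10 kHz.
dt = 1e-4
t = np.arange(0.0, 0.05, dt)
accel = 200.0 * np.exp(-t / 0.01)  # decaying thrust in m/s^2 (illustrative)
vel, disp = integrate_motion(accel, dt)
print(f"peak speed {vel.max():.2f} m/s, travel {disp[-1] * 100:.1f} cm")
```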

An investigation of DTOcean foundation and anchor systems

Gomez, Steven P.; Jensen, Richard P.; Heath, Jason E.

This memo documents the mechanical loading analysis performed on the second set of DTOcean program WP4 foundation and anchor systems submodule design iterations [4]. Finite Element Analysis (FEA) simulations were performed to validate design requirements defined by Python-based analytic simulations of the WP4 program Naval Facilities Engineering Command (NAVFAC) tool. This FEA procedure focuses on worst-case loading scenarios for the shallow gravity foundation and pile anchor designs produced by WP4. The models include a steel casing and a steel anchor, respectively, with soft clay surrounding the steel components.

TITANS Strategic Intern Pipeline: Best Practices and Resource Guide

Kuykendall, Tommie G.; Porter, Cherriel; Hanselmann, Kathryn D.; Robertson, Kathy

This guide is a compilation of best practices and resources from three of Sandia's leading strategic intern programs, which make up the Technical Internships to Advance National Security (TITANS) program. TITANS consists of the Center for Analysis Systems and Applications (CASA), the Center for Cyber Defenders (CCD), and the Monitoring Systems & Technology Intern Center (MSTIC).

Filtration testing for enhanced performance of Radionuclide Monitoring Stations for Nuclear Treaty Verification

Hubbard, Joshua A.

A study of aerosol filtration was conducted to improve U.S. Radionuclide Monitoring Station (RMS) performance for nuclear treaty verification. The primary objectives of this study were to improve system operability and maintainability, reduce power consumption and operational cost, and reduce baseline radionuclide sensitivity. To meet these goals, Sandia National Laboratories (SNL) studied the performance of alternate filter materials and aerosol collection technologies that could be engineered into U.S. Radionuclide Monitoring Stations. Laboratory-scale filtration experiments were conducted with Filter Material 1, FM1 (the current filter), Filter Material 2, FM2, and Filter Material 3, FM3. All three materials employ electrostatically charged filter fibers to enhance nanoparticle collection. FM2 and FM3 both had higher air permeability than FM1, which is advantageous for high-volume collection and power savings. Particle pre-charging, a well-established industrial technique used in electrostatic precipitators, was tested to see if electrostatically charging particles prior to filtration could enhance filter performance. We found that particle pre-charging did enhance aerosol collection efficiencies for materials that would not have otherwise satisfied RMS performance requirements. Laboratory-scale testing indicated it may be possible to reduce the baseline radionuclide sensitivity to approximately 60% of its current value by increasing the volume of air sampled in 24 hours to 2.5 times the current air volume. Improvements to geolocation may also be possible with shorter air samples (e.g., 12 hours). A new methodology was developed at SNL to assess filter performance using established RMS certification procedures. We coined these tests “mid-scale” since they bridged the gap between laboratory-scale and full-scale RMS testing. Four filter specimens were drawn from the same atmospheric aerosol. Gamma spectroscopy was used to assess radiological backgrounds due to radon progeny and other naturally occurring radionuclides. Direct comparisons between the four filters allowed SNL to quantify the relative change in baseline sensitivity from altering air flow rate, filter material, and particle pre-charging. Mid-scale results agreed with laboratory-scale results: alterations to the RMS configuration (filter, flow, and particle charge) may result in baseline sensitivities approximately 55–60% of their current values. Finally, an assessment of scalability was performed to determine if the technical approaches used in laboratory-scale and mid-scale testing could be engineered into full-scale Radionuclide Monitoring Stations. Results suggested that particle pre-charging is a viable technical approach if reductions in baseline sensitivity or power consumption are desired.
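
As a rough consistency check on the quoted figure (our back-of-envelope illustration, not the report's analysis): if the minimum detectable concentration scales as the square root of the background counts divided by the sampled air volume, and the collected background scales with volume, then

$$\mathrm{MDC} \propto \frac{\sqrt{B}}{V}, \qquad B \propto V \;\Rightarrow\; \mathrm{MDC} \propto \frac{1}{\sqrt{V}}, \qquad \frac{1}{\sqrt{2.5}} \approx 0.63,$$

consistent with a reduction to roughly 60% of the current baseline sensitivity when 2.5 times the air volume is sampled.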

Radiation Effects Science: Advancing the Frontiers of Science and Engineering

Barbour, J.C.

Recent improvements in the radiation effects science capability at Sandia have achieved the following: Record gamma-ray outputs at HERMES enable qualification for Life Extension Programs (LEPs) and evaluation of electromagnetic pulse (EMP) effects on electric grid elements; Completed blind comparison for model validation of cavity system generated EMP in advanced computer simulations; For the first time, new diagnostics provide a high resolution photon energy spectrum on Z (Grand Challenge LDRD); ACRR completed record number of operations, two years in a row, to support design and qualification; Installed the nation’s largest GHz electromagnetic test chamber for design and qualification.

NMSBA - Twist Resist - Rotational Exercise Module

Reece, Blake D.; Berger, Jason E.; Guido, Steven F.; Linker, Taylor

This report contains a summary of the work completed to develop a modular, rotational exercise device. The report includes images, diagrams, and explanations of the efforts contributed to the project since its inception. The purpose of this document is to provide a walk-through of the progress on this project, from the initial design concepts to the final design and work done, so that the customer (Twist Resist) or individuals and firms who work on this project in the future will have a springboard of ideas and concepts to work from.

Convergence Study in Global Sensitivity Analysis

Harmon, Rebecca; Khalil, Mohammad; Najm, Habib N.; Safta, Cosmin

Monte Carlo (MC) sampling is a common method used to randomly sample a range of scenarios. The associated error follows a predictable convergence rate of $1/\sqrt{N}$, such that quadrupling the sample size halves the error. This method is often employed in global sensitivity analysis, which computes sensitivity indices measuring the fractional contributions of uncertain model inputs to the total output variance. In this study, several models are used to observe the rate of decay of the MC error in the estimation of the conditional variance, the total output variance, and the global sensitivity indices. The purpose is to examine the rate of convergence of the error in existing specialized, albeit MC-based, sampling methods for estimating the sensitivity indices. It was found that the conditional variances and sensitivity indices all follow the $1/\sqrt{N}$ convergence rate. Future work will test the convergence of observables from more complex models, such as ignition time in combustion.
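
A minimal sketch (our illustration, not the study's code) of the reported behavior: estimating a first-order sensitivity index with a Saltelli-style pick-freeze MC estimator on the standard Ishigami test function, where the error roughly halves each time $N$ quadruples. The test function, estimator, and sample sizes are assumptions for illustration.

```python
# Demonstrates the 1/sqrt(N) Monte Carlo convergence of a first-order
# Sobol sensitivity index using a pick-freeze (Saltelli-style) estimator.
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

def first_order_index(n, rng, i=0):
    # Two independent input samples; B_i is B with column i "frozen" from A.
    A = rng.uniform(-np.pi, np.pi, (n, 3))
    B = rng.uniform(-np.pi, np.pi, (n, 3))
    B_i = B.copy()
    B_i[:, i] = A[:, i]
    yA, yB, yBi = ishigami(A), ishigami(B), ishigami(B_i)
    return np.mean(yA * (yBi - yB)) / np.var(np.concatenate([yA, yB]))

rng = np.random.default_rng(0)
S1_exact = 0.3139  # analytic first-order index of x1 for the Ishigami function
for n in (10_000, 40_000, 160_000, 640_000):
    err = abs(first_order_index(n, rng) - S1_exact)
    print(f"N = {n:7d}  |error| = {err:.4f}")  # error roughly halves per 4x N
```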

Simulation of Distributed PV Power Output in Oahu Hawaii

Lave, Matt

Distributed solar photovoltaic (PV) power generation in Oahu has grown rapidly since 2008. For applications such as determining the value of energy storage, it is important to have PV power output timeseries. Since these timeseries are not typically measured, here we produce simulated distributed PV power output for Oahu. Simulated power output is based on (a) satellite-derived solar irradiance, (b) PV permit data by neighborhood, and (c) population data by census block. Permit and population data were used to model the locations of distributed PV, and irradiance data was then used to simulate power output. PV power output simulations are presented by sub-neighborhood polygons, by neighborhoods, and for the whole island of Oahu. Summary plots of annual PV energy and a sample week timeseries of power output are shown, and the files containing the entire timeseries are described.
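
A minimal sketch of the simulation step (assumed data structures and derate value; the neighborhood names and capacities in the example are hypothetical): per-neighborhood installed capacity, modeled from permit and population data, is scaled by satellite-derived irradiance to produce power timeseries.

```python
# A minimal sketch (assumed structure, not the report's code) of simulating
# distributed PV power from modeled capacity and satellite irradiance.
import numpy as np

def simulate_pv_power(irradiance_wm2, capacity_kw_by_hood, derate=0.77):
    """irradiance_wm2: dict hood -> 1D array of irradiance (W/m^2);
    capacity_kw_by_hood: dict hood -> installed kW modeled from permits.
    Returns per-neighborhood AC power (kW) and the island-wide total."""
    power = {}
    for hood, ghi in irradiance_wm2.items():
        kw = capacity_kw_by_hood.get(hood, 0.0)
        # Simple performance model: output scales with irradiance relative
        # to the 1000 W/m^2 rating condition, times a system derate factor.
        power[hood] = kw * derate * np.asarray(ghi, dtype=float) / 1000.0
    total = np.sum(list(power.values()), axis=0)
    return power, total

# Example: two hypothetical neighborhoods over three hourly steps.
ghi = {"Kahala": [0, 400, 800], "Mililani": [0, 350, 750]}
caps = {"Kahala": 1200.0, "Mililani": 2600.0}  # kW, modeled from permits
by_hood, island = simulate_pv_power(ghi, caps)
print(island)  # island-wide simulated PV power (kW) per timestep
```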

TDAAPS 2: Acoustic Wave Propagation in Attenuative Moving Media

Preston, Leiph

This report outlines recent enhancements to the TDAAPS algorithm first described by Symons et al., 2005. One of the primary additions to the code is the ability to specify an attenuative medium using standard linear fluid mechanisms to match reasonably general frequency-versus-loss curves, including common frequency-versus-loss curves for the atmosphere and seawater. Other improvements described here are the addition of improved numerical boundary conditions via various forms of Perfectly Matched Layers, enhanced accuracy near high-contrast media interfaces, and improved physics options.
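
For context, a single standard-linear relaxation mechanism (Zener model) with strain and stress relaxation times $\tau_\varepsilon$ and $\tau_\sigma$ gives the quality factor

$$Q^{-1}(\omega) = \frac{\omega\,(\tau_\varepsilon - \tau_\sigma)}{1 + \omega^{2}\,\tau_\varepsilon \tau_\sigma},$$

which peaks at $\omega = 1/\sqrt{\tau_\varepsilon \tau_\sigma}$; superposing several mechanisms with staggered relaxation times is the usual way to approximate a prescribed frequency-versus-loss curve over a band. This is a textbook result shown for orientation; TDAAPS's exact parameterization may differ.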

Implementation of Fast Emulator-based Code Calibration

Bowman, Nathaniel; Denman, Matthew R.

Calibration is the process of using experimental data to gain more precise knowledge of simulator inputs. This process commonly involves the use of Markov-chain Monte Carlo, which requires running a simulator thousands of times. If we can create a faster program, called an emulator, that mimics the outputs of the simulator for an input range of interest, then we can speed up the process enough to make it feasible for expensive simulators. To this end, we implement a Gaussian-process emulator capable of reproducing the behavior of various long-running simulators to within acceptable tolerance. This fast emulator can be used in place of a simulator to run Markov-chain Monte Carlo in order to calibrate simulation parameters to experimental data. As a demonstration, this emulator is used to calibrate the inputs of an actual simulator against two sodium-fire experiments.
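
A minimal sketch of the workflow (our illustration under stated assumptions: a toy one-parameter "simulator," an RBF kernel, a uniform prior, a Gaussian noise model, and scikit-learn's GP regressor; the report's sodium-fire models are far more involved): fit a Gaussian-process emulator to a small design of simulator runs, then run Metropolis MCMC that queries only the emulator.

```python
# Emulator-based calibration sketch: a GP trained on a few simulator runs
# stands in for the simulator inside a Metropolis MCMC loop.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulator(theta):                  # stand-in for an expensive code
    return theta + 0.3 * np.sin(3.0 * theta)

# 1. Train the emulator on a small design of simulator runs.
X_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
y_train = simulator(X_train).ravel()
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X_train, y_train)

# 2. Metropolis MCMC against observed data, querying only the emulator.
y_obs, sigma = simulator(np.array([1.3])), 0.05   # synthetic "experiment"
def log_post(theta):
    if not 0.0 <= theta <= 2.0:                   # uniform prior bounds
        return -np.inf
    mu = gp.predict(np.array([[theta]]))[0]
    return -0.5 * ((y_obs[0] - mu) / sigma) ** 2

rng = np.random.default_rng(1)
theta, lp = 1.0, log_post(1.0)
samples = []
for _ in range(5000):
    prop = theta + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)
print("posterior mean:", np.mean(samples[1000:]))  # should sit near 1.3
```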
