Improvised explosive device (IED) defeat operations can involve intricate tasks that exceed the capabilities of the grippers on board current bomb-squad robots. The Shadow Dexterous Hand from the Shadow Robot Company, or 'ShadowHand' for short (www.shadowrobot.com), is the first commercially available robot hand that realistically replicates the motion, degrees of freedom, and dimensions of a human hand (Figure 1). In this study we evaluate the potential for the ShadowHand to perform IED defeat tasks on a mobile platform.
The purpose of this LDRD was to generate data that could be used to populate and thereby reduce the uncertainty in global carbon cycle models. These efforts were focused on developing a system for determining the dissolution rate of biogenic calcite under oceanic pressure and temperature conditions and on carrying out a digital transcriptomic analysis of gene expression in response to changes in pCO2, and the consequent acidification of the growth medium.
We describe the current state of the art in Trusted Computing technologies, focusing mainly on Intel's Trusted Execution Technology (TXT). This document is based on existing documentation and tests of two existing TXT-based systems: Intel's Trusted Boot and Invisible Things Lab's Qubes OS. We describe what features are lacking in current implementations, describe what a mature system could provide, and present a list of developments to watch. Critical systems perform operation-critical computations on high-importance data. In such systems, the inputs, computation steps, and outputs may be highly sensitive. Sensitive components must be protected from both unauthorized release and unauthorized alteration: unauthorized users should neither access nor alter the sensitive input and output data; the intermediate data produced during computation carry the same requirements, and the algorithms executed should be neither known nor alterable by unauthorized parties. Due to various system requirements, such critical systems are frequently built from commercial hardware, employ commercial software, and require network access. These hardware, software, and network components increase the risk that sensitive input data, computation, and output data may be compromised.
When estimating parameters for a material model from experimental data collected during a separate-effects physics experiment, the quality of fit is only part of the required information. Also necessary is the uncertainty in the estimated parameters, so that uncertainty quantification and model validation can be performed at the full-system level. The uncertainty and quality of fit are often unavailable and should be considered when fitting the data to a specified model. Many techniques are available for fitting data to a material model; several are presented in this work using a simple acoustic emission dataset. The parameters and their associated uncertainties are estimated using a variety of techniques, and the results are compared.
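As a minimal illustration of the kind of workflow described above, the sketch below fits a hypothetical exponential-decay model to synthetic data and extracts one-sigma parameter uncertainties from the fit covariance. The model form, data, and noise level are assumptions for illustration, not the report's dataset or methods.

```python
# Minimal sketch: nonlinear least-squares fitting with parameter uncertainty
# estimated from the covariance matrix. The exponential-decay model and the
# synthetic "acoustic emission" data are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def model(t, amplitude, decay_rate):
    """Hypothetical material response: exponentially decaying signal."""
    return amplitude * np.exp(-decay_rate * t)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 200)
data = model(t, 2.0, 1.3) + 0.05 * rng.standard_normal(t.size)  # noisy observations

# curve_fit returns the estimates and their covariance; square roots of the
# diagonal give one-sigma uncertainties to carry into system-level UQ.
popt, pcov = curve_fit(model, t, data, p0=[1.0, 1.0])
perr = np.sqrt(np.diag(pcov))
for name, est, err in zip(["amplitude", "decay_rate"], popt, perr):
    print(f"{name}: {est:.3f} +/- {err:.3f}")
```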
The shear webs and laminates of core panels of wind turbine blades must be designed to avoid panel buckling while minimizing blade weight. Typically, buckling resistance is evaluated by considering the load-deflection behavior of a blade using finite element analysis (FEA) or by full-scale static loading of a blade to failure under a simulated extreme loading condition. This paper examines an alternative means of evaluating blade buckling resistance using non-destructive modal tests or FEA. In addition, panel resonances can be utilized for structural health monitoring by observing changes in the modal parameters of these resonances, which are active only in a portion of the blade that is susceptible to failure. Additionally, panel resonances are considered for updating panel laminate model parameters by correlation with test data. During blade modal tests conducted at Sandia National Laboratories, a series of panel modes with increasing complexity was observed. This paper reports on the findings of these tests, describes potential ways to utilize panel resonances for blade evaluation, health monitoring, and design, and reports recent numerical results evaluating panel resonances for use in blade structural health assessment.
Through the vehicle of a case study, this paper describes in detail how the guidance found in the suite of IPC (Association Connecting Electronics Industries) publications can be applied to develop a high level of design assurance that flexible printed boards intended for continuous flexing applications will satisfy specified lifetime requirements.
The excitement surrounding the marriage of biosensors and nanotechnology is palpable even from a cursory examination of the scientific literature. Indeed, the word “nano” might be in danger of being overused and reduced to a cliché, although it is probably essential for publishing papers or securing research funding. The biosensor literature is littered with clever or catchy acronyms, birds apparently being favored (“CANARY”, “SPARROW”), quite apart from “electronic tongue,” “electronic nose,” and so on. Although biosensors have been around since glucose monitors were commercialized in the 1970s, the transition of laboratory research and innumerable research papers on biosensors into the world of commerce has lagged. There are several reasons for this, including the infamous “valley of death” afflicting entrepreneurs emerging from the academic environment into the industrial world, where the rules for success can be radically different. In this context, musings on biosensors, and especially nanobiosensors, in an open-access journal such as the Journal of Biosensors and Bioelectronics are topical and appropriate, especially since market surveys of biosensors are prohibitively expensive, sometimes running into thousands of dollars for a single copy. The contents and market-share predictions in these reports also change every time a report is published. Moreover, the market-share projections for biosensors differ considerably among the various reports. An editorial provides the opportunity to offer personal opinions and perhaps stimulate debate on a particular topic. In this sense, editorials are a departure from the rigor of a research paper, and this editorial is no exception. With this preamble, it is worthwhile to stop and ponder the status of commercial biosensors and nanobiosensors.
The electrical power industry is facing the prospect of integrating a significant amount of variable generation in the next several decades, primarily from wind and solar facilities. Overall, transmission and generation reserve levels are decreasing, and power system infrastructure in general is aging. To maintain grid reliability, modernization and expansion of the power system, as well as more optimized use of existing resources, will be required. Conventional and pumped-storage hydroelectric facilities can provide an increasingly significant contribution to power system reliability by providing energy, capacity, and other ancillary services. However, the potential role of hydroelectric power will be affected by another transition the industry is currently experiencing: the evolution and expansion of electricity markets. This evolution toward market-based acquisition of generation resources and grid management is taking place in a heterogeneous manner. Some North American regions are moving toward full-featured markets, other regions operate without formal markets, and yet other U.S. regions are partially evolved. This report examines the current structure of electric industry acquisition of energy and ancillary services in regions organized along different structures, reports on the current role of hydroelectric facilities in various regions, and attempts to identify features of market and scheduling areas that either promote or thwart an increased future role for hydroelectric power. This report is part of a larger effort led by the Electric Power Research Institute with the purpose of examining the potential for hydroelectric facilities to play a greater role in balancing the grid in an era of greater penetration of variable renewable energy technologies. Other topics to be addressed in this larger effort include industry case studies of specific conventional and pumped-storage hydroelectric facilities, systemic operating constraints on hydroelectric resources, and production cost simulations aimed at quantifying the increased role of hydroelectric power.
Although interdisciplinary research attracts increasing interest and effort, the benefits of this type of research are not always realized. To understand when expertise diversity will have positive or negative effects on research efforts, we examine how expertise diversity and diversity salience affect task conflict and idea sharing in interdisciplinary research groups. Using data from 148 researchers in 29 academic research labs, we provide evidence on the importance of social categorization states (i.e., expertise diversity salience) in understanding both the information processes (i.e., task conflict) and the creativity processes (i.e., idea sharing) in groups with expertise diversity. We show that expertise diversity can either increase or decrease task conflict depending on the salience of group members' expertise, in a curvilinear way: the moderating effect of diversity salience is strongest at a medium level of expertise diversity. Furthermore, enriched group work design can strengthen the benefits of task conflict for creative idea sharing only when expertise diversity salience is low. Finally, we show that idea sharing predicts group performance in interdisciplinary academic research labs over and above task conflict.
The process of rank aggregation is intimately intertwined with the structure of skew-symmetric matrices. We apply recent advances in the theory and algorithms of matrix completion to skew-symmetric matrices. This combination of ideas produces a new method for ranking a set of items. The essence of our idea is that a rank aggregation describes a partially filled skew-symmetric matrix. We extend an algorithm for matrix completion to handle skew-symmetric data and use it to extract a rank for each item. Our algorithm applies to both pairwise comparison and rating data. Because it is based on matrix completion, it is robust to both noise and incomplete data. We show a formal recovery result for the noiseless case and present a detailed study of the algorithm on synthetic data and Netflix ratings.
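The core idea lends itself to a small demonstration. The sketch below is an illustrative stand-in for the authors' algorithm, not a reproduction of it: it treats observed pairwise comparisons as entries of a skew-symmetric matrix of score differences, Y[i, j] ≈ s[i] − s[j], and recovers the score vector (hence the ranking) by least squares. The rank-2 model and synthetic data are assumptions.

```python
# A partially observed skew-symmetric matrix Y with Y[i, j] ~ s[i] - s[j] is
# "completed" by least-squares recovery of a score vector s, whose sort order
# gives the ranking. Simple rank-2 model (s e^T - e s^T) assumed.
import numpy as np

rng = np.random.default_rng(1)
n = 6
s_true = rng.standard_normal(n)

# Observe a random subset of pairwise comparisons with a little noise.
pairs = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < 0.6]
y = np.array([s_true[i] - s_true[j] + 0.05 * rng.standard_normal() for i, j in pairs])

# Build the linear system A s = y over observed entries (skew-symmetry means
# each comparison constrains s[i] - s[j]); pin the gauge freedom with sum(s) = 0.
A = np.zeros((len(pairs) + 1, n))
for row, (i, j) in enumerate(pairs):
    A[row, i], A[row, j] = 1.0, -1.0
A[-1, :] = 1.0
s_hat, *_ = np.linalg.lstsq(A, np.append(y, 0.0), rcond=None)

print("recovered ranking:", np.argsort(-s_hat))
print("true ranking:     ", np.argsort(-s_true))
```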
This work presents a methodology based on the concept of error in constitutive equations (ECE) for the inverse reconstruction of viscoelastic properties using steady-state dynamics. The ECE algorithm presented herein consists of two main steps. In the first step, kinematically admissible strains and dynamically admissible stresses are generated through two auxiliary forward problems. In the second step, a new update of the complex shear and bulk moduli as functions of frequency is obtained by minimizing an ECE functional that measures the discrepancy between the kinematically admissible strains and the dynamically admissible stresses. The feasibility of the methodology is demonstrated through two numerical experiments. It was found that the magnitude and phase of the complex shear modulus can be accurately reconstructed in the presence of noise, while the magnitude of the bulk modulus is more sensitive to noise and can in general be reconstructed with less accuracy than the shear modulus. Furthermore, the phase of the bulk modulus, which is related to energy dissipation, can be accurately reconstructed.
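For concreteness, a schematic form of such an ECE functional is sketched below; the notation is assumed for illustration and is not taken from the paper.

```latex
% Schematic ECE functional: u is a kinematically admissible displacement field,
% \sigma a dynamically admissible stress field, and C the complex
% frequency-dependent viscoelastic modulus tensor being updated.
\[
  \mathcal{E}(u,\sigma;C) \;=\; \frac{1}{2}\int_{\Omega}
  \bigl(\sigma - C:\varepsilon(u)\bigr) : C^{-1} :
  \bigl(\sigma - C:\varepsilon(u)\bigr)\,\mathrm{d}\Omega ,
\]
% which vanishes exactly when the constitutive law \sigma = C:\varepsilon(u)
% holds, so minimizing it over C drives the two admissible fields toward
% mutual consistency with the constitutive model.
```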
Situations occasionally arise in field measurements where direct optical access to the area of interest is not possible. In these cases the borescope is the standard method of imaging. Furthermore, if shape, displacement, or strain measurements are desired in these hidden locations, it would be advantageous to be able to perform digital image correlation (DIC) through the borescope. This paper presents the added complexities and errors associated with imaging through a borescope for DIC. Non-radial distortions and their effects on the measurements are discussed, along with a possible correction scheme.
A cantilever beam is released from an initial condition, and the velocity at the tip is recorded using a laser Doppler vibrometer. The ring-down time history is analyzed using the Hilbert transform, which gives the natural frequency and damping. An important issue with the Hilbert transform is its vulnerability to noise. The proposed method uses curve fitting to replace some of the time-differentiation and thereby suppress noise. Linear curve fitting gives very good results for linear beams with low damping; for nonlinear beams with higher damping, polynomial curve fitting captures the time variations. The method was used to estimate quality factors of a few shim metals and PZT bimorphs.
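A minimal sketch of this style of ring-down analysis follows, using synthetic data in place of vibrometer measurements (the sample rate, frequency, damping, and noise level are assumed): the Hilbert transform yields an analytic signal whose unwrapped phase and log-envelope are fit with straight lines, replacing noisy time-differentiation as the abstract suggests for the linear, lightly damped case.

```python
# Ring-down analysis sketch with assumed synthetic data.
import numpy as np
from scipy.signal import hilbert

fs = 10_000.0                        # sample rate, Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)
f0, zeta = 120.0, 0.002              # assumed natural frequency and damping ratio
v = np.exp(-zeta * 2 * np.pi * f0 * t) * np.sin(2 * np.pi * f0 * t)
v += 0.005 * np.random.default_rng(2).standard_normal(t.size)

analytic = hilbert(v)
sl = slice(100, -100)                # trim edge effects of the Hilbert transform
log_env = np.log(np.abs(analytic[sl]))
phase = np.unwrap(np.angle(analytic[sl]))

# Linear fits replace time-differentiation: the phase slope gives the natural
# frequency; the log-envelope slope gives the decay rate, hence the damping.
freq_hz = np.polyfit(t[sl], phase, 1)[0] / (2 * np.pi)
decay = -np.polyfit(t[sl], log_env, 1)[0]
print(f"frequency ~ {freq_hz:.1f} Hz, damping ratio ~ {decay/(2*np.pi*freq_hz):.4f}")
```

For the nonlinear, higher-damping case the abstract mentions, the two polyfit calls would simply use a higher polynomial degree.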
Several commercial computational fluid dynamics (CFD) codes now have the capability to analyze Eulerian two-phase flow using the Rohsenow nucleate boiling model. Analysis of boiling due to one-sided heating in plasma-facing components (PFCs) is now receiving attention during the design of water-cooled first wall panels for ITER that may encounter heat fluxes as high as 5 MW/m². Empirical thermal-hydraulic design correlations developed for long fission reactor channels are not reliable when applied to PFCs because fully developed flow conditions seldom exist. Star-CCM+ is one of the commercial CFD codes that can model two-phase flows. Like others, it implements the RPI model for nucleate boiling, but it also transitions seamlessly to a volume-of-fluid model for film boiling. By benchmarking the results of our 3-D models against recent experiments on critical heat flux for both smooth rectangular channels and hypervapotrons, we determined the six unique input parameters that accurately characterize the boiling physics for ITER flow conditions under a wide range of absorbed heat flux. We can now exploit this capability to predict the onset of critical heat flux in these components. In addition, the results clearly illustrate the production and transport of vapor and its effect on heat transfer in PFCs from nucleate boiling through the transition to film boiling. This article describes the boiling physics implemented in CCM+ and compares the computational results to the benchmark experiments carried out independently in the United States and Russia. Temperature distributions agreed to within 10 °C for a wide range of heat fluxes from 3 MW/m² to 10 MW/m² and flow velocities from 1 m/s to 10 m/s in these devices. Although the analysis is incapable of capturing the stochastic nature of critical heat flux (i.e., its time and location may depend on a local material defect or turbulence phenomenon), it is highly reliable in determining the heat flux at which boiling instabilities begin to dominate. Beyond this threshold, higher heat fluxes lead to the boiling crisis and eventual burnout. This predictive capability is essential in determining the critical-heat-flux margin for the design of complex 3-D components.
Data have been acquired from a spanwise array of fluctuating wall pressure sensors beneath a wind tunnel wall boundary layer at Mach 2; invoking Taylor's hypothesis then allows the temporal signals to be converted into a spatial map of the wall pressure field. Improvements to the measurement technique were developed to establish the veracity of earlier tentative conclusions. An adaptive filtering scheme using a reference sensor was implemented to cancel the effects of wind tunnel acoustic noise and vibration. Coherent structures in the pressure fields were identified using an improved thresholding algorithm that reduced the occurrence of broken contours and spurious signals. Analog filters with sharper frequency cutoffs than digital filters produced signals of greater spectral purity. Coherent structures were confirmed in the fluctuating wall pressure field that resemble similar structures known to exist in the velocity field, in particular by exhibiting a spanwise meander and merging of events. However, the pressure data lacked the common spanwise alternation of positive and negative events found in velocity data, and instead demonstrated a weak positive correlation in the spanwise direction.
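As a small illustration of the Taylor's-hypothesis step only (not the adaptive filtering or thresholding), the sketch below re-indexes the temporal record at each sensor as a streamwise spatial record via x = U_c t; the convection velocity, sample rate, and stand-in data are assumptions.

```python
# Taylor's (frozen-turbulence) hypothesis: a temporal record p(t) at a fixed
# sensor is re-indexed as a streamwise spatial record p(x) with x = U_c * t,
# where U_c is an assumed convection velocity. All numbers are placeholders.
import numpy as np

fs = 200_000.0                 # sample rate, Hz (assumed)
U_c = 500.0                    # convection velocity, m/s (assumed)
n_sensors, n_samples = 8, 4096
rng = np.random.default_rng(3)
p = rng.standard_normal((n_sensors, n_samples))   # stand-in wall-pressure data

t = np.arange(n_samples) / fs
x = U_c * t                    # each time sample maps to a streamwise location
print("pressure map shape (spanwise sensors x streamwise samples):", p.shape)
print(f"streamwise extent covered by the record: {x[-1]:.3f} m")
```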
International Defense and Homeland Security Simulation Workshop, DHSS 2011, Held at the International Mediterranean and Latin American Modeling Multiconference, I3M 2011
In the present paper, the act of learner reflection during training with an adaptive or predictive computer-based tutor is considered a learner-system interaction. Incorporating reflection and real-time evaluation of peer performance into adaptive and predictive computer-based tutoring can support the development of automated adaptation. Allowing learners to refine and inform student models through reflective practice with independent open learner models may improve overall accuracy and relevancy. Given the emphasis on self-directed peer learning with adaptive technology, learner and instructor modeling continue to be critical research areas for education and training technology.
The chemical industry is one of the largest industries in the United States and a vital contributor to global chemical supply chains. The U.S. Department of Homeland Security (DHS) Science and Technology Directorate has tasked Sandia National Laboratories (Sandia) with developing an analytical capability to assess the interdependencies and complexities of the nation's critical infrastructures on and with the chemical sector. This work is being performed to expand the infrastructure analytical capabilities of the National Infrastructure Simulation and Analysis Center (NISAC). To address this need, Sandia has focused on developing an agent-based methodology for simulating the domestic chemical supply chain and determining the economic impacts resulting from large-scale disruptions to the chemical sector. Modeling the chemical supply chain is unique because the flow of goods and services is guided by process thermodynamics and reaction kinetics. Sandia has integrated an agent-based microeconomic simulation tool, N-ABLE™, with various chemical industry datasets to abstract the chemical supply chain behavior. An enterprise design within N-ABLE™ consists of a collection of firms within a supply chain network; each firm interacts with others through chemical reactions, markets, and physical infrastructure. The supply and demand within each simulated network must be consistent with respect to mass balances of every chemical in the network. Production decisions at every time step are a set of constrained linear program (LP) solutions that minimize the difference between desired and actual outputs. We illustrate the methodology with examples of modeled petrochemical supply chains under an earthquake event. The supply chain impacts on upstream and downstream chemicals associated with organic intermediates after a short-term shutdown in the affected area are discussed.
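A hedged sketch of the flavor of per-firm production decision described above follows; the stoichiometry, demands, and supply numbers are invented for illustration, and this is not the N-ABLE™ formulation. Run levels z are chosen to minimize the L1 gap between desired and actual outputs, with feedstock mass balance as a constraint, by linearizing the absolute deviations with slack variables.

```python
# Per-firm, per-time-step LP sketch: choose process run levels z to minimize
# |desired - actual output| subject to an input (mass-balance) constraint.
import numpy as np
from scipy.optimize import linprog

outputs = np.array([[1.0, 0.0],     # process 1 makes chemical A
                    [0.5, 1.0]])    # both processes contribute to chemical B
inputs = np.array([[2.0, 1.0]])     # feedstock consumed per unit of each process
desired = np.array([10.0, 8.0])     # desired outputs of A and B
feed_supply = np.array([25.0])      # available feedstock this time step

# L1 deviation is linearized with slack s >= |outputs @ z - desired|:
# variables are [z (2), s (2)]; minimize sum(s).
c = np.concatenate([np.zeros(2), np.ones(2)])
A_ub = np.block([
    [outputs, -np.eye(2)],          #  outputs@z - s <= desired
    [-outputs, -np.eye(2)],         # -outputs@z - s <= -desired
    [inputs, np.zeros((1, 2))],     # feedstock use <= supply
])
b_ub = np.concatenate([desired, -desired, feed_supply])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
print("run levels:", res.x[:2], "-> outputs:", outputs @ res.x[:2])
```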
Vertically aligned InGaN/GaN nanorod light-emitting diode (LED) arrays were created from planar LED structures using a new top-down fabrication technique consisting of a plasma etch followed by an anisotropic wet etch. The wet etch results in straight, smooth, well-faceted nanorods with controllable diameters and removes the plasma etch damage. Of the nanorod LEDs, 94% are dislocation-free, and a reduced quantum-confined Stark effect is observed due to reduced piezoelectric fields. Despite these advantages, the internal quantum efficiency (IQE) of the nanorod LEDs measured by photoluminescence is comparable to that of the planar LED, perhaps due to inefficient thermal transport and enhanced nonradiative surface recombination.
We examine several methods to create a sheet of magnesium oxide (MgO) macroporous ceramic material via tape casting. These methods include the approach pioneered by Akartuna et al. [1], in which an oil/water emulsion is stabilized by surface-modified metal oxide particles at the droplet interfaces. Upon drying, a scaffold of the self-assembled particles is strong enough to be removed from the substrate material and sintered. We find that this method can be used with MgO particles surface-modified by short amphiphilic molecules. This approach is compared with two more traditional methods of inducing structure in a green ceramic: 1) creation of an MgO ceramic slip with added pore formers, and 2) sponge impregnation of a reticulated foam with the MgO slip. Green and sintered samples made using each method are hardness-tested, and results are compared for several densities of the final ceramics. Optical and SEM images of the materials are shown.
Low-temperature transport properties of high-mobility two-dimensional electron systems placed in a weak perpendicular magnetic field can be modified dramatically by microwave or dc electric fields. This paper surveys recent experimental developments, which include zero-differential resistance states, Hall field-induced resistance oscillations in tilted magnetic fields, nonlinear response of the Shubnikov-de Haas oscillations, and a novel microwave photoconductivity peak near the second harmonic of the cyclotron resonance.
Wind energy research activities at Sandia National Laboratories focus on developing large rotors that are lighter and more cost-effective than those designed with current technologies. Because gravity loads scale as the cube of the blade length, they become a constraining design factor for very large blades. Efforts to passively reduce turbulent loading have shown significant potential to reduce blade weight and capture more energy. Research in passive load reduction for wind turbines began at Sandia in the late 1990s and has moved from analytical studies to blade applications. This paper discusses the test results of two Sandia prototype research blades that incorporate load reduction techniques. The TX-100 is a 9-m blade that induces bend-twist coupling with the use of off-axis carbon in the skin. The STAR blade is a 27-m blade that induces bend-twist coupling by sweeping the blade geometrically.
Laser-induced incandescence measurements have recently been obtained from pool fires, 2 m in diameter, of blended fuels containing 10% and 30% toluene in methanol. Calibration of the instrument was performed using an ethylene/air laminar diffusion flame produced by a Santoro-type burner, which allowed the extraction of absolute soot volume fractions from these images. Performance of the optical probe was characterized using the laminar diffusion flame, and corrections were implemented for signal dependence upon detector gain, flat field, and location within the probe laser sheet when processing the images. Probability density functions of the soot volume fraction were constructed for the blended fuels used in this study, and the mean values were determined to be 0.0077 and 0.028 ppm for the 10% and 30% blends, respectively. Signal trapping was estimated for the two blends and was determined to be negligible for the 10% toluene/methanol blend and to require a ∼10% correction for the 30% toluene/methanol blend.
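The PDF construction step admits a compact illustration; the sketch below uses synthetic per-pixel soot volume fractions (the lognormal shape and parameters are assumptions, not the paper's data) to build a normalized histogram and compute its mean.

```python
# Build a probability density function of soot volume fraction from per-pixel
# values and report its mean. All data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(4)
f_v = rng.lognormal(mean=np.log(0.008), sigma=0.6, size=50_000)  # ppm, assumed

counts, edges = np.histogram(f_v, bins=100, density=True)        # the PDF
centers = 0.5 * (edges[:-1] + edges[1:])
mean_fv = np.trapz(centers * counts, centers)                    # PDF-weighted mean
print(f"mean soot volume fraction ~ {mean_fv:.4f} ppm")
```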
This article presents a generalized analysis of the significance of Galilean invariance in compressible flow computations with stabilized and variational multiscale methods. The understanding of the key issues and the development of general approaches to Galilean-invariant stabilization are facilitated by the use of a matrix-operator description of Galilean transformations. An analysis of invariance for discontinuity-capturing operators is also included.
Deformation bands in high-porosity sandstone are an important geological feature for geologists and petroleum engineers; however, the formation of these bands is not fully understood. The theoretical framework for deformation band formation in high-porosity geomaterials is well established. It suggests that the intermediate principal stress influences the predicted deformation band type; however, these predictions have yet to be fully validated through experiments. Therefore, this study investigates the influence of the intermediate principal stress on failure and the formation of deformation bands in Castlegate sandstone. Mean stresses for these tests range from 30 to 150 MPa, covering brittle to ductile behavior. Deformation band orientations are measured by external observation as well as through acoustic emission locations. Results of experiments conducted at Lode angles of 30 and 14.5 degrees show trends that qualitatively agree with localization theory. The band angle (between the band normal and the direction of maximum compression) decreases with increasing mean stress. For tests at the same mean stress, the band angle decreases with increasing Lode angle.
There is a long history of testing crushed salt as backfill for the Waste Isolation Pilot Plant program, but testing was typically done at 100°C or less. Future applications may involve backfilling crushed salt around heat-generating waste packages, where near-field temperatures could reach 250°C or hotter. A series of experiments was conducted to investigate the effects of hydrostatic stress on run-of-mine salt at temperatures up to 250°C and pressures up to 20 MPa. The results of these tests were compared with analogous modeling results. By comparing the modeling results at elevated temperatures to the experimental results, the adequacy of the current crushed salt reconsolidation model was evaluated. The model and experimental results both show an increase in the reconsolidation rate with temperature. The current crushed salt model predicts the experimental results well at a temperature of 100°C and matches the overall trends, but over-predicts the temperature dependence of the reconsolidation. Further development of the deformation mechanism activation energies would lead to a better prediction of the temperature dependence by the crushed salt reconsolidation model.
Proceedings of SPIE - The International Society for Optical Engineering
Anderson, Betty L.; Ho, James G.; Cowan, William D.; Spahn, Olga B.; Yi, Allen Y.; Flannery, Martin R.; Rowe, Delton J.; McCray, David L.; Rabb, David J.; Chen, Peter
We have demonstrated the ability to control the microstructure of PETN films deposited using physical vapor deposition by altering the interface between the film and the substrate. The evolution of surface morphology, average density, and surface roughness with film thickness was characterized using surface profilometry and scanning electron microscopy. While films on all of the substrates investigated showed a trend toward lower average density with increasing film thickness, there were significant variations in density, pore size, and surface morphology among films deposited on different substrates.
Density functional theory (DFT) has over the last few years emerged as an indispensable tool for understanding the behavior of matter under extreme conditions. DFT-based molecular dynamics (MD) simulations have, for example, confirmed experimental findings for shocked deuterium [1], enabled the first experimental evidence for a triple point in carbon above 850 GPa [2], and amended experimental data for constructing a global equation of state (EOS) for water, carrying implications for planetary physics [3]. The ability to perform high-fidelity calculations is even more important for cases where experiments are impossible to perform, dangerous, and/or prohibitively expensive. For solid explosives and other molecular crystals, similar success has been severely hampered by an inability to describe the materials at equilibrium. The binding mechanism of molecular crystals (van der Waals forces) is not well described within traditional DFT [4]. Among widely used exchange-correlation functionals, neither LDA nor PBE balances the strong intra-molecular chemical bonding and the weak inter-molecular attraction, resulting in incorrect equilibrium densities and negatively affecting the construction of EOS for undetonated high explosives. We are exploring a way of bypassing this problem by using the new Armiento-Mattsson 2005 (AM05) exchange-correlation functional [5, 6]. The AM05 functional is highly accurate for a wide range of solids [4, 7], in particular in compression [8]. In addition, AM05 does not include any van der Waals attraction [4], which can be advantageous compared to other functionals: correcting for a fictitious van der Waals-like attraction of unknown origin can be harder than correcting for a complete absence of all types of van der Waals attraction. We will show examples from other materials systems where van der Waals attraction plays a key role and this scheme has worked well [9], and discuss preliminary results for molecular crystals and explosives.
Material control and accounting (MC&A) safeguards operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC&A activities have many similar characteristics to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC&A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC&A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness for attacks by both outside and inside adversaries.
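To make the integration concrete, here is a toy illustration (all probabilities invented) of how MC&A activities can enter a probabilistic path analysis alongside traditional sensors: if each protection element along an adversary path detects independently with probability p_i, the path's cumulative detection probability is 1 − Π(1 − p_i).

```python
# Toy illustration (assumed numbers): along an adversary path, each protection
# element -- a physical sensor or an HRA-quantified MC&A activity -- contributes
# an independent detection probability; the path's overall detection
# probability is P = 1 - prod(1 - p_i).
path_elements = [
    ("portal sensor",         0.90),   # traditional PPS sensor
    ("item inventory check",  0.60),   # MC&A activity, HRA-derived probability
    ("material balance test", 0.75),   # MC&A activity, HRA-derived probability
]

p_miss = 1.0
for name, p_detect in path_elements:
    p_miss *= (1.0 - p_detect)
print(f"overall detection probability along path: {1.0 - p_miss:.3f}")
```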
Compaction waves in porous energetic materials have been shown to induce reaction under impact loading. In the past, simple two-state burn models such as the Arrhenius Burn model were developed to predict slapper initiation in hexanitrostilbene (HNS) pellets; however, a more sophisticated, fundamental approach is needed to predict the shock response during impact loading, especially in pellets that have been shown to have strong density gradients. The intergranular stress measures the resistance to bed compaction, i.e., to the removal of void space through particle packing and rearrangement. A constitutive model for the intergranular stress is needed for closure of the Baer-Nunziato (BN) multiphase mixture theory for reactive energetic materials. The intergranular stress was obtained from both quasi-static and dynamic compaction experiments. Additionally, historical data and more recently acquired data for porous pellets compacted to high densities under shock loading were used for model assessment. Predicted particle velocity profiles under dynamic compaction were generally in good agreement with the experimental data. Hence, a multiphase model of HNS has been developed that extends current predictive capability.
In the International HRA Empirical Study, human reliability analysis (HRA) method predictions for human failure events (HFEs) in steam generator tube rupture and loss-of-feedwater scenarios were compared against the performance of real crews in a nuclear power plant control room simulator. The comparisons examined both the qualitative and quantitative HRA method predictions. This paper discusses some of the lessons learned about HRA methods that have been identified to date. General strengths and weaknesses of HRA methods are addressed, along with the reasons for any limitations in the predictive results produced by the methods. However, the discussions of the lessons learned in this paper must be considered a "snapshot": while most of the data have been analyzed, more detailed analyses of the results from specific HRA methods are ongoing, and additional information may emerge.
There has been significant debate in the literature about the role of strain-induced martensite in hydrogen-assisted fracture of metastable austenitic stainless steels. It is clear that α'-martensite is not necessary for hydrogen-assisted fracture, since hydrogen affects the tensile ductility and fracture properties of stable austenitic stainless steels. Martensite, however, is believed to facilitate hydrogen transport in austenitic stainless steel, and numerous studies propose that martensite contributes to fracture. Yet conclusive evidence that strain-induced α'-martensite plays an important mechanistic role in fracture processes in the presence of hydrogen has not been clearly articulated in the literature. In this study, we report microstructural evidence suggesting that α'-martensite does not play a primary role in hydrogen-assisted fracture during tensile testing of metastable austenitic stainless steel. This microstructural evidence also suggests that thermal twin boundaries are susceptible sites for hydrogen-assisted fracture.
The U.S. Nuclear Regulatory Commission, in concert with industry, continues to explore the effects of fire on electrical cable and control circuit performance. The latest efforts, which are currently underway, are exploring issues related to fire-induced cable failure modes and effects for direct current (dc) powered electrical control circuits. An extensive series of small and intermediate scale fire tests has been performed. Each test induced electrical failure in copper-conductor cables of various types typical of those used by the U.S. commercial nuclear power industry. The cables in each test were connected to one of several surrogate dc control circuits designed to monitor and detect cable electrical failure modes and effects. The tested dc control circuits included two sets of reversing dc motor starters typical of those used in motor-operated valve (MOV) circuits, two small solenoid-operated valves (SOVs), one intermediate-size (1-inch (25.4 mm) diameter) SOV, a very large direct-acting valve coil, and a switchgear/breaker unit. Also included was a specialized test circuit designed specifically to monitor for electrical shorts between two cables (inter-cable shorting). Each of these circuits was powered from a nominal 125 V battery bank comprising 60 individual battery cells (nominal 2 V lead-acid cells with plates made from a lead-cadmium alloy). The total available short-circuit current at the terminals of the battery bank was estimated at 13,000 A. All of the planned tests have been completed; data analysis and reporting are currently being finalized. This paper briefly describes the test program, some of the preliminary test insights, and planned follow-on activities.
10th International Conference on Probabilistic Safety Assessment and Management 2010, PSAM 2010
Boring, Ronald L.; Forester, John A.; Bye, Andreas; Dang, Vinh N.; Lois, Erasmia
The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of HRA method predictions against the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects of the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers of crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to "translate" the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.
10th International Conference on Probabilistic Safety Assessment and Management 2010, PSAM 2010
Dang, Vinh N.; Massaiu, Salvatore; Bye, Andreas; Forester, John A.
In the International HRA Empirical Study, diverse Human Reliability Analysis (HRA) methods are assessed based on data from a dedicated simulator study, which examined the performance of licensed crews in nuclear power plant emergency scenarios. The HRA method assessments involve comparing the predictions obtained with the method with empirical reference data, in quantitative as well as qualitative terms. This paper discusses the assessment approach and criteria, the quantitative reference data, and the comparisons that use these data. Consistent with the expectations at the outset of the study, the statistical limitations of the data are a key issue. These limitations preclude concentrating solely on the failure counts defined by the Human Failure Event (HFE) success criteria and the failure probabilities based on these counts. In assessing quantitative predictive power, this study additionally uses a reference HFE difficulty (qualitative failure likelihood) ranking that accounts for qualitative observations in addition to the failure counts. Overall, the method assessment prioritizes qualitative comparisons, using the rich set of data collected on performance issues. Here, the quantitative predictions and data are used to determine the essential qualitative comparisons, demonstrating how quantitative and qualitative comparisons and criteria can be usefully combined in HRA method assessment.
10th International Conference on Probabilistic Safety Assessment and Management 2010, PSAM 2010
Dang, Vinh N.; Forester, John A.; Mosleh, Ali
The Office of Nuclear Regulatory Research (RES) of the U.S. Nuclear Regulatory Commission is sponsoring work in response to a Staff Requirements Memorandum (SRM) directing an effort to establish a single human reliability analysis (HRA) method for the agency, or guidance for the use of multiple methods. One motivation is the variability in human failure event (HFE) probabilities estimated by different analysts and methods. This work considers that a reduction of the variability in the HRA quantification outputs must address three sources: differences in the scope and implementation of qualitative analysis, the qualitative output-quantitative input interface, and the diversity of algorithms for estimating failure probabilities from these inputs. Two companion papers (Mosleh et al. and Hendrickson et al.) describe a proposed qualitative analysis approach. The development of the corresponding quantification approach considers a number of alternatives, including a module-based hybrid method and a data-driven quantification scheme. This paper presents ongoing work and the views of the contributors.
Cyber security analysis tools are necessary to evaluate the security, reliability, and resilience of networked information systems against cyber attack. It is common practice in modern cyber security analysis to separately utilize real systems (computers, routers, switches, firewalls), computer emulations (e.g., virtual machines), and simulation models to analyze the interplay between cyber threats and safeguards. In contrast, Sandia National Laboratories has developed new methods to combine these evaluation platforms into a cyber Live, Virtual, and Constructive (LVC) testbed. The combination of real, emulated, and simulated components enables the analysis of the security features and components of a networked information system. When performing cyber security analysis on a target system, it is critical to represent the subject security components realistically and in high fidelity. In some experiments, the security component may be the actual hardware and software, with all the surrounding components represented in simulation or with surrogate devices. Sandia National Laboratories has developed a cyber LVC testbed that combines modeling and simulation capabilities with virtual machines and real devices to represent, in varying fidelity, secure networked information system architectures and devices. Using this capability, secure networked information system architectures can be represented in our testbed on a single computing platform. This provides an "experiment-in-a-box" capability. The result is rapidly produced, large-scale, relatively low-cost, multi-fidelity representations of networked information systems. These representations enable analysts to quickly investigate cyber threats and test protection approaches and configurations.
The launch of nuclear materials requires special care to minimize the risk of adverse effects to human health and the environment. This paper describes the special sources of risk that are inherent to the launch of radioactive materials and provides insights into the analysis and control of these risks that have been gained through the experience of previous US launches. Historically, launch safety has been achieved by eliminating, to the greatest degree possible, the potential for energetic insults to affect the radioactive material. For those insults that cannot be precluded, designers minimize the likelihood, magnitude and duration of their interaction with the material. Finally, when a radioactive release cannot be precluded, designers limit the magnitude and spatial extent of its dispersal.
Phillips, Stan D.; Moen, Kurt A.; Najafizadeh, Laleh; Diestelhorst, Ryan M.; Sutton, Akil K.; Cressler, John D.; Vizkelethy, Gyorgy; Dodd, Paul E.; Marshall, Paul W.
Human Reliability Analysis (HRA) methods have been developed primarily to provide information for use in probabilistic risk assessments analyzing nuclear power plant (NPP) operations. Despite this historical focus on the control room, there has been growing interest in applying HRA methods to other NPP activities such as dry cask storage operations (DCSOs) in which spent fuel is transferred into dry cask storage systems. This paper describes a successful application of aspects of the "A Technique for Human Event Analysis" (ATHEANA) HRA approach [1, 2] in performing qualitative HRA activities that generated insights on the potential for dropping a spent fuel cask during DCSOs. This paper provides a description of the process followed during the analysis, a description of the human failure event (HFE) scenario groupings, discussion of inferred human performance vulnerabilities, a detailed examination of one HFE scenario and illustrative approaches for avoiding or mitigating human performance vulnerabilities that may contribute to dropping a spent fuel cask.