Publications

Results 73526–73550 of 99,299

Limitations imposed on fire PRA methods as the result of incomplete and uncertain fire event data

Nowlen, Steven P.

Fire probabilistic risk assessment (PRA) methods use data and insights gained from actual fire events in a variety of ways. For example, fire occurrence frequencies, manual fire fighting effectiveness and timing, and the distribution of fire events by fire source and plant location are all based directly on the historical experience base. Other factors are either derived indirectly or supported qualitatively based on insights from the event data. These factors include the general nature and intensity of plant fires, insights into operator performance, and insights into fire growth and damage behaviors. This paper will discuss the potential methodology improvements that could be realized if more complete fire event reporting information were available. Areas discussed in the paper that could benefit from more complete event reporting include fire event frequency analysis, analysis of fire detection and suppression system performance including incipient detection systems, analysis of manual fire fighting performance, treatment of fire growth from incipient stages to fully-involved fires, operator response to fire events, the impact of smoke on plant operations and equipment, and the impact of fire-induced cable failures on plant electrical circuits.

More Details

Unstructured discontinuous Galerkin for seismic inversion

Collis, Samuel S.; Ober, Curtis C.; Van Bloemen Waanders, Bart

This abstract explores the potential advantages of discontinuous Galerkin (DG) methods for the time-domain inversion of media parameters within the earth's interior. In particular, DG methods enable local polynomial refinement to better capture localized geological features within an area of interest while also allowing the use of unstructured meshes that can accurately capture discontinuous material interfaces. We describe our initial findings when using DG methods combined with Runge-Kutta time integration and adjoint-based optimization algorithms for full-waveform inversion. Our initial results suggest that DG methods allow great flexibility in matching media characteristics (faults, the ocean bottom, and salt structures) while also providing higher-fidelity representations in target regions. Time-domain inversion on unstructured meshes with local polynomial refinement is shown to better resolve localized geological features and discontinuous material interfaces, providing the ability to surgically refine representations in order to improve predicted models for specific geological features. Our future work will entail automated extensions that directly incorporate local refinement and adaptive unstructured meshes within the inversion process.
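
As a schematic of the adjoint-based optimization component, the sketch below computes the misfit gradient for a 1D time-domain waveform inversion using the adjoint-state method. A simple finite-difference discretization stands in for the unstructured DG / Runge-Kutta solver described here, and the grid, source wavelet, velocity models, and function names are illustrative placeholders rather than the paper's implementation.

```python
# Schematic 1D adjoint-state gradient for time-domain waveform inversion.
# NOTE: a plain finite-difference discretization stands in for the paper's
# unstructured DG / Runge-Kutta solver; all names and values are illustrative.
import numpy as np

nx, nt = 101, 600          # grid points, time steps
dx, dt = 10.0, 1.0e-3      # 1 km domain, 0.6 s record
src_ix, rec_ix = 20, 80    # source and receiver indices

def laplacian(u):
    out = np.zeros_like(u)
    out[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return out

def ricker(t, f0=8.0, t0=0.15):
    a = (np.pi * f0 * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def forward(m):
    """Explicit time stepping of u_tt = m * u_xx + s, with m = c^2."""
    u_prev, u_curr = np.zeros(nx), np.zeros(nx)
    fields, trace = [], []
    for n in range(nt):
        s = np.zeros(nx)
        s[src_ix] = ricker(n * dt)
        u_next = 2.0 * u_curr - u_prev + dt**2 * (m * laplacian(u_curr) + s)
        fields.append(u_curr.copy())
        trace.append(u_curr[rec_ix])
        u_prev, u_curr = u_curr, u_next
    return np.array(fields), np.array(trace)

def misfit_and_gradient(m, d_obs):
    """Adjoint-state gradient of J = 0.5 * sum_n (u^n[rec] - d^n)^2 w.r.t. m."""
    fields, trace = forward(m)
    res = trace - d_obs
    lam_next = np.zeros(nx)   # lambda^{n+1}
    lam_nn = np.zeros(nx)     # lambda^{n+2}
    grad = np.zeros(nx)
    for n in range(nt - 1, -1, -1):
        # accumulate dJ/dm_i = -dt^2 * sum_n lambda^{n+1}_i * (Lap u^n)_i
        grad += -dt**2 * lam_next * laplacian(fields[n])
        # backward recursion with the data residual injected at the receiver
        lam = 2.0 * lam_next - lam_nn + dt**2 * laplacian(m * lam_next)
        lam[rec_ix] -= res[n]
        lam_nn, lam_next = lam_next, lam
    return 0.5 * np.sum(res**2), grad

# "True" model with a low-velocity anomaly, and a homogeneous starting model.
m_true = np.full(nx, 2000.0**2)
m_true[45:60] = 1500.0**2
_, d_obs = forward(m_true)
m0 = np.full(nx, 2000.0**2)
J, g = misfit_and_gradient(m0, d_obs)
print(f"misfit {J:.3e}, peak |gradient| at x = {np.argmax(np.abs(g)) * dx:.0f} m")
```

In a full inversion this gradient would drive a line-searched descent or quasi-Newton update, with the DG discretization supplying both the forward and adjoint solves on the unstructured mesh.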

More Details

Radioactive iodine separations and waste forms development

Krumhansl, James L.; Nenoff, Tina M.; Garino, Terry J.; Rademacher, David X.

Reprocessing nuclear fuel releases gaseous radio-iodine-containing compounds, which must be captured and stored for prolonged periods. Ag-loaded mordenites are the leading candidates for scavenging both organic and inorganic radioiodine-containing compounds directly from reprocessing off-gases. Alternately, the principal off-gas contaminant, I2, and iodine-containing acids (HI, HIO3, etc.) may be scavenged using caustic soda solutions, which are then treated with bismuth to put the iodine into an insoluble form. Our program is focused on using state-of-the-art materials science technologies to develop materials with high loadings of iodine plus high long-term mechanical and thermal stability. In particular, we present results from research into two materials areas: (1) zeolite-based separations and glass encapsulation, and (2) in-situ precipitation of Bi-I-O waste forms. Ag-loaded mordenite is either commercially available or can be prepared via a simple Ag+ ion exchange process. Research using an Ag+-loaded mordenite zeolite (MOR, LZM-5 supplied by UOP Corp.) has revealed that I2 is scavenged in one of three forms: as micron-sized AgI particles, as molecular (AgI)x clusters in the zeolite pores, and as elemental I2 vapor. It was found that only a portion of the sorbed iodine is retained after heating at 95 °C for three months. Furthermore, we show that even when the Ag-MOR is saturated with I2 vapor, only roughly half of the silver reacted to form stable AgI compounds. However, the iodine can be further retained if the AgI-MOR is then encapsulated in a low-temperature glass binder. Follow-on studies are now focused on the sorption and waste-form development of iodine from more complex streams, including organo-iodine compounds (CH3I). Bismuth-iodate layered phases have been prepared from caustic waste stream simulant solutions; they serve as a low-cost alternative to ceramic waste forms. Novel compounds have been synthesized, and solubility studies have been completed using competing groundwater anions (HCO3-, Cl-, and SO4^2-). Distinct variations in solubility were found that relate to the structures of the materials.

More Details

Computational screening of large molecule adsorption by metal-organic frameworks

Greathouse, Jeffery A.; Allendorf, Mark

Grand canonical Monte Carlo simulations were performed to investigate trends in low-pressure adsorption of a broad range of organic molecules by a set of metal-organic frameworks (MOFs). The organic analytes considered here are relevant to applications in chemical detection: small aromatics (o-, m-, and p-xylene), polycyclic aromatic hydrocarbons (naphthalene, anthracene, phenanthrene), explosives (TNT and RDX), and chemical warfare agents (GA and VM). The framework materials included several Zn-MOFs (IRMOFs 1-3, 7, and 8), a Cr-MOF (CrMIL-53lp), and a Cu-MOF (HKUST-1). Many of the larger organics were significantly adsorbed by the target MOFs at low pressure, which is consistent with the exceptionally high isosteric heats of adsorption (25-60 kcal/mol) for this range of analytes. At a higher loading pressure of 101 kPa, the Zn-MOFs show a much higher volumetric uptake than either CrMIL-53lp or HKUST-1 for all types of analyte. Within the Zn-MOF series, analyte loading is proportional to free volume, and loading decreases with increasing analyte size due to molecular packing effects. CrMIL-53lp showed the highest adsorption energy for all analytes, suggesting that this material may be suitable for low-level detection of organics.
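
The grand canonical Monte Carlo move structure behind such simulations can be illustrated with a toy model. The sketch below uses a non-interacting lattice gas with Metropolis insertion and deletion moves (not the MOF/analyte force-field simulations reported here) to show how the acceptance rules set the equilibrium loading at a given chemical potential, and it checks the result against the analytic Langmuir-type occupancy. The site count and reduced chemical potential are arbitrary illustrative values.

```python
# Toy grand canonical Monte Carlo: non-interacting lattice gas with M sites.
# Illustrates the insertion/deletion acceptance rules used in GCMC adsorption
# studies; the paper's simulations use full MOF/analyte force fields instead.
import numpy as np

rng = np.random.default_rng(0)
M = 500                    # number of adsorption sites
beta_mu = -1.0             # reduced chemical potential (beta * mu), illustrative
occupied = np.zeros(M, dtype=bool)

n_steps, n_equil = 200_000, 50_000
samples = []
for step in range(n_steps):
    site = rng.integers(M)
    if not occupied[site]:
        # attempt insertion: accept with prob min(1, exp(beta*mu)), no interactions
        if rng.random() < np.exp(beta_mu):
            occupied[site] = True
    else:
        # attempt deletion: accept with prob min(1, exp(-beta*mu))
        if rng.random() < np.exp(-beta_mu):
            occupied[site] = False
    if step >= n_equil:
        samples.append(occupied.sum())

theory = np.exp(beta_mu) / (1.0 + np.exp(beta_mu))   # Langmuir-type occupancy
print(f"simulated coverage {np.mean(samples) / M:.3f}  vs  analytic {theory:.3f}")
```

In the MOF screening itself, the acceptance probabilities additionally involve the analyte-framework interaction energy and the usual volume and thermal-wavelength factors, but the insertion/deletion move structure is the same.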

More Details

Statistical language analysis for automatic exfiltration event detection

Robinson, David G.

This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic, or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
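
As a rough illustration of the kind of pipeline described above, the sketch below tokenizes a handful of hypothetical network-log events, fits scikit-learn's LatentDirichletAllocation, and scores each event by how far its topic mixture sits from the corpus average. The events, the tokenization, and the distance-based score are illustrative assumptions, not the model or risk measure developed in the paper.

```python
# Sketch: topic-model-based anomaly scoring for network log events with LDA.
# The corpus, tokenization, and distance-based anomaly score below are
# illustrative assumptions, not the specific method described in the paper.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Each "document" is one network event rendered as a bag of tokens
# (protocol, port, binned byte count, destination class, and so on).
events = [
    "tcp port_443 bytes_small dst_internal http_get",
    "tcp port_443 bytes_small dst_internal http_get",
    "udp port_53 bytes_small dst_external dns_query",
    "tcp port_22 bytes_huge dst_external off_hours scp_put",   # exfiltration-like
    # ... many more events in practice
]

vec = CountVectorizer(token_pattern=r"\S+")
X = vec.fit_transform(events)

lda = LatentDirichletAllocation(n_components=3, random_state=0)
theta = lda.fit_transform(X)          # per-event topic mixtures (rows sum to 1)

# One simple anomaly score: distance of an event's topic mixture from the
# corpus-average mixture; events far from "normal" traffic score highest.
centroid = theta.mean(axis=0)
scores = np.linalg.norm(theta - centroid, axis=1)
for rank in np.argsort(scores)[::-1][:3]:
    print(f"{scores[rank]:.3f}  {events[rank]}")
```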

More Details

Electron-interface scattering in thin metal films

Hopkins, Patrick E.

Electron-interface scattering during electron-phonon nonequilibrium in thin films creates another pathway for electron-system energy loss as characteristic lengths of thin films continue to decrease. As power densities in nanodevices increase, excitations of electrons from sub-conduction-band energy levels will become more probable. These sub-conduction-band electronic excitations significantly affect the material's thermophysical properties. In this work, the effects of d-band electronic excitations are considered in electron energy transfer processes in thin metal films. In thin films with thicknesses less than the electron mean free path, ballistic electron transport leads to electron-interface scattering. The ballistic component of electron transport, leading to electron-interface scattering, is studied by a ballistic-diffusive approximation of the Boltzmann transport equation. The effect of d-band excitations on electron-interface energy transfer is analyzed during electron-phonon nonequilibrium after short-pulsed laser heating in thin films.
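
For context on the electron-phonon nonequilibrium that follows short-pulsed laser heating, the sketch below integrates the standard lumped two-temperature model with SciPy. It deliberately omits the d-band excitation and ballistic electron-interface scattering terms analyzed in this work, and the gold-like parameter values are order-of-magnitude placeholders only.

```python
# Sketch: lumped two-temperature model for electron-phonon nonequilibrium after
# a short laser pulse.  This baseline omits the d-band excitation and ballistic
# electron-interface terms analyzed in the paper; parameter values are
# illustrative, gold-like orders of magnitude only.
import numpy as np
from scipy.integrate import solve_ivp

gamma = 70.0        # electron heat capacity coefficient, J m^-3 K^-2 (C_e = gamma*T_e)
C_p   = 2.5e6       # lattice heat capacity, J m^-3 K^-1
G     = 2.5e16      # electron-phonon coupling factor, W m^-3 K^-1
t0, sigma = 1e-12, 100e-15          # pulse center and width, s
S0 = 1e21                           # peak absorbed power density, W m^-3

def source(t):
    return S0 * np.exp(-0.5 * ((t - t0) / sigma) ** 2)

def rhs(t, y):
    Te, Tp = y
    dTe = (-G * (Te - Tp) + source(t)) / (gamma * Te)
    dTp = (G * (Te - Tp)) / C_p
    return [dTe, dTp]

sol = solve_ivp(rhs, (0.0, 20e-12), [300.0, 300.0], method="Radau", max_step=50e-15)
print(f"peak electron temperature ~ {sol.y[0].max():.0f} K; "
      f"final Te-Tp gap {sol.y[0, -1] - sol.y[1, -1]:.1f} K")
```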

More Details

Executive summary for assessing the near-term risk of climate uncertainty: interdependencies among the U.S. states

Backus, George A.

Policy makers will most likely need to make decisions about climate policy before climate scientists have resolved all relevant uncertainties about the impacts of climate change. This study demonstrates a risk-assessment methodology for evaluating uncertain future climatic conditions. We estimate the impacts of climate change on U.S. state- and national-level economic activity from 2010 to 2050. To understand the implications of uncertainty on risk and to provide a near-term rationale for policy interventions to mitigate the course of climate change, we focus on precipitation, one of the most uncertain aspects of future climate change. We use results of the climate-model ensemble from the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) as a proxy for representing climate uncertainty over the next 40 years, map the simulated weather from the climate models hydrologically to the county level to determine the physical consequences for economic activity at the state level, and perform a detailed 70-industry analysis of economic impacts among the interacting lower-48 states. We determine the industry-level contribution to gross domestic product and employment impacts at the state level, as well as interstate population migration, effects on personal income, and consequences for the U.S. trade balance. We show that the mean risk of damage to the U.S. economy from climate change, at the national level, is on the order of $1 trillion over the next 40 years, with losses in employment equivalent to nearly 7 million full-time jobs.

More Details

Importance sampling: promises and limitations

Swiler, Laura P.

Importance sampling is an unbiased sampling method used to sample random variables from densities other than those originally defined. These importance sampling densities are constructed to pick 'important' values of input random variables to improve the estimation of a statistical response of interest, such as a mean or probability of failure. Conceptually, importance sampling is very attractive: for example, one wants to generate more samples in a failure region when estimating failure probabilities. In practice, however, importance sampling can be challenging to implement efficiently, especially in a general framework that will allow solutions for many classes of problems. We are interested in the promises and limitations of importance sampling as applied to computationally expensive finite element simulations that are treated as 'black-box' codes. In this paper, we present a customized importance sampler that is meant to be used after an initial set of Latin Hypercube samples has been taken, to help refine a failure probability estimate. The importance sampling densities are constructed based on kernel density estimators. We examine importance sampling with respect to two main questions: Is importance sampling efficient and accurate for situations where we can only afford small numbers of samples? And does importance sampling require the use of surrogate methods to generate a sufficient number of samples so that it actually increases the accuracy of the failure probability estimate? We present various case studies to address these questions.
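
A minimal sketch of the two-stage idea described above follows: an initial space-filling sample locates failure points, a Gaussian kernel density estimator built on those points serves as the importance density, and importance weights f(x)/q(x) correct the failure-probability estimate. The limit-state function, the sample sizes, and the use of plain Monte Carlo in place of Latin Hypercube sampling are illustrative assumptions rather than the paper's case studies.

```python
# Sketch: failure-probability estimation with a KDE-based importance density.
# The limit-state function g(x) and the sample sizes are illustrative stand-ins
# for an expensive black-box finite element simulation.
import numpy as np
from scipy import stats

dim = 2
f = stats.multivariate_normal(mean=np.zeros(dim))      # nominal input density

def g(x):
    """Hypothetical limit state: failure when g(x) < 0."""
    return 3.5 - x.sum(axis=-1)

# Stage 1: initial space-filling sample (plain Monte Carlo standing in for LHS).
x0 = f.rvs(size=2000, random_state=1)
fail0 = x0[g(x0) < 0.0]
print("initial failures found:", len(fail0))

# Stage 2: build the importance density q as a Gaussian KDE on the failure
# points, then estimate p_f = E_q[ 1{g(x) < 0} * f(x)/q(x) ].
kde = stats.gaussian_kde(fail0.T)
n_is = 5000
xs = kde.resample(n_is, seed=2).T
w = f.pdf(xs) / kde.evaluate(xs.T)
p_hat = np.mean((g(xs) < 0.0) * w)

exact = stats.norm.sf(3.5 / np.sqrt(dim))              # analytic reference
print(f"IS estimate {p_hat:.2e}  vs  exact {exact:.2e}")
```

The same weighting scheme applies when a surrogate model, rather than the simulation itself, is evaluated at the importance samples, which is the trade-off the paper's case studies examine.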

More Details