Publications

Results 76201–76300 of 99,299

Optical properties of semiconductor quantum dots

2008 IEEE PhotonicsGlobal at Singapore, IPGC 2008

Chow, Weng W.

An important step towards realizing the advantages of quantum dots in electro-optic applications is to understand the excitation dependences of optical properties. This paper discusses results obtained using a microscopic theory. The calculations uncovered complicated carrier density and electronic structure influences on absorption, gain and refractive index that can be attributed to a delicate balancing of electronic-structure and many-body effects in a coupled quantum-dot-quantum-well system.

Influence of surface morphology on the wettability of microstructured ZnO-based surfaces

Journal of Physical Chemistry C

Piech, Martin; Sounart, Thomas L.; Liu, Jun

The effect of sample microstructure on water dynamic wetting behavior was examined for superhydrophobic ZnO films. Surface morphology ranging from needle arrays to overlapping platelets was controlled through judicious choice of hydrothermal reaction conditions. Structure modification with alkyl and perfluoroalkyl chains yielded films characterized by advancing contact angles that ranged from 159° to 171°. Contact angle hysteresis was less than 2° with needles (tip diameter <30 nm) and less than 11° for rods (diameter <250 nm). Relatively thick (diameter ∼600 nm) structures were still characterized by advancing contact angles exceeding 165° and hysteresis <30°. Formation of nanometer-scale roughness on top of the microstructure via silica deposition significantly enhanced the surface superhydrophobicity. Similarly, following perfluoro-alkane treatment, all examined microstructures exhibited advancing contact angles > 169° and hysteresis < 7°. © 2008 American Chemical Society.
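
As background for these measurements (general context, not part of the paper's abstract): apparent contact angles on composite solid/air interfaces are commonly interpreted with the Cassie-Baxter relation, which shows why added nanoscale roughness, by lowering the wetted solid fraction f, pushes the apparent angle toward superhydrophobic values.

```latex
% Cassie-Baxter relation: \theta^* is the apparent contact angle on the
% rough surface, \theta the intrinsic angle on a smooth surface of the
% same chemistry, and f the solid fraction wetted by the drop.
\cos\theta^{*} = f\,(\cos\theta + 1) - 1
```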

EPRI/NRC fire human reliability analysis guidelines

American Nuclear Society - International Topical Meeting on Probabilistic Safety Assessment and Analysis, PSA 2008

Cooper, Susan E.; Hill, Kendra; Julius, Jeff; Grobbelaar, Jan; Kohlhepp, Kaydee; Forester, John A.; Hendrickson, Stacey M.; Hannaman, Bill; Najafi, Bijan

During the 1990s the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in preparing responses to Generic Letter 88-20, Supplement 4, "Individual Plant Examination - External Events" (IPEEE). This effort produced a fire risk assessment methodology for at-power operations that was used by the majority of US Nuclear Power Plants (NPPs) in support of the IPEEE program, as well as by several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for Risk-Informed/Performance-Based (RI/PB) applications. In 2001 EPRI and the NRC Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state of the art in fire risk assessment to support this new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled "Fire PRA Methodology for Nuclear Power Facilities," which addresses fire risk for at-power operations. This report developed: 1) the process for identification and inclusion of post-fire Human Failure Events (HFEs), 2) the methodology for assigning quantitative screening values to these HFEs, and 3) the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate Human Error Probabilities (HEPs). However, that document does not describe a methodology for developing these best-estimate HEPs given the PSFs and fire-related effects. In 2007 EPRI and NRC's RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human error events under fire-generated conditions, building upon existing human reliability analysis (HRA) methods. This paper describes the progress to date on the development and testing of the fire HRA methodology, which includes addressing the range of fire procedures used in existing plants, the range of strategies for main control room abandonment, and the potential impact of fire-induced spurious electrical effects on crew performance. In addition to developing a detailed HRA approach, one goal of the project is to develop a fire HRA scoping quantification approach that allows derivation of more realistic HEPs than those in the screening approach of NUREG/CR-6850 (EPRI 1011989), while requiring fewer analytic resources than a detailed HRA. In this approach, detailed HRA will be used only for the more complex actions that cannot meet the criteria of the scoping approach.

Controlling across complex networks - Emerging links between networks and control

Annual Reviews in Control

Clauset, A.; Tanner, H.G.; Abdallah, C.T.; Byrne, R.H.

A family of energy minimizing coarse spaces for overlapping Schwarz preconditioners

Lecture Notes in Computational Science and Engineering

Dohrmann, Clark R.; Klawonn, Axel; Widlund, Olof B.

A simple and effective approach is presented to construct coarse spaces for overlapping Schwarz preconditioners. The approach is based on energy minimizing extensions of coarse trace spaces, and can be viewed as a generalization of earlier work by Dryja, Smith, and Widlund. The use of these coarse spaces in overlapping Schwarz preconditioners leads to condition numbers bounded by C(1 + H/δ)(1 + log(H/h)) for certain problems when coefficient jumps are aligned with subdomain boundaries. For problems without coefficient jumps, it is possible to remove the log(H/h) factor in this bound by a suitable enrichment of the coarse space. Comparisons are made with the coarse spaces of two other substructuring preconditioners. Numerical examples are also presented for a variety of problems.
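
For orientation, the two-level additive Schwarz operator to which such bounds apply has the standard textbook form below (generic notation, not reproduced from the paper): R_i restricts to subdomain i, A_i is the local problem, and the coarse solve A_0 is built on the coarse space.

```latex
M^{-1} = R_0^{T} A_0^{-1} R_0 + \sum_{i=1}^{N} R_i^{T} A_i^{-1} R_i,
\qquad
\kappa\!\left(M^{-1} A\right) \le C\left(1 + \frac{H}{\delta}\right)\left(1 + \log\frac{H}{h}\right),
% with H the subdomain diameter, h the mesh size, and \delta the overlap width.
```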

Strip transect sampling to estimate object abundance in homogeneous and non-homogeneous Poisson fields: A simulation study of the effects of changing transect width and number

Progress in Geomathematics

Coburn, Timothy C.; Mckenna, Sean A.; Saito, Hirotaka

This paper investigates the use of strip transect sampling to estimate object abundance when the underlying spatial distribution is assumed to be Poisson. A design- rather than model-based approach to estimation is investigated through computer simulation, with both homogeneous and non-homogeneous fields representing individual realizations of spatial point processes being considered. Of particular interest are the effects of changing the number of transects and transect width (or, alternatively, coverage percent or fraction) on the quality of the estimate. A specific application to the characterization of unexploded ordnance (UXO) in the subsurface at former military firing ranges is discussed. The results may be extended to the investigation of outcrop characteristics as well as subsurface geological features. © 2008 Springer-Verlag Berlin Heidelberg.
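
A minimal sketch of the design-based estimator under study: scatter a homogeneous Poisson field, count objects inside systematically placed strip transects, and scale by the coverage fraction. The intensity, transect counts, and widths below are illustrative values, not the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(1)

# Homogeneous Poisson field on the unit square (intensity is an assumption).
intensity = 500.0                                 # expected objects per unit area
n = rng.poisson(intensity)
x, y = rng.uniform(size=n), rng.uniform(size=n)

def strip_transect_estimate(x, n_transects, width):
    """Design-based abundance estimate from vertical strip transects."""
    starts = np.linspace(0.0, 1.0 - width, n_transects)   # systematic layout
    in_strip = np.zeros_like(x, dtype=bool)
    for s in starts:
        in_strip |= (x >= s) & (x < s + width)
    coverage = n_transects * width                # sampled fraction of the domain
    return in_strip.sum() / coverage              # scale the count up by coverage

# Vary transect number and width, as in the simulation study.
for n_t, w in [(5, 0.02), (10, 0.02), (5, 0.05)]:
    print(n_t, w, strip_transect_estimate(x, n_t, w))
```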

Reduced-volume horn antennas with integrated high-impedance electromagnetic surfaces

Proceedings of 2008 Asia Pacific Microwave Conference, APMC 2008

Forman, Michael F.

Several antennas with integrated high-impedance surfaces are presented. The high-impedance surface is implemented as a composite right/left-handed (CRLH) metamaterial fabricated from a periodic structure characterized by a substrate, filled with an array of vertical vias and capped by capacitive patches. Omnidirectional antennas placed in close proximity to the high-impedance surface radiate hemispherically with an increase in boresight far-field pattern gain of up to 10 dB and a front-to-back ratio as high as 13 dB at 2.45 GHz. Several TEM rectangular horn antennas are realized by replacing conductor walls with high-impedance surfaces. The TEM horn antennas are capable of operating below the TE₁₀ cutoff frequency of a standard all-metal horn antenna, enabling a reduction in antenna volume. Above the cutoff frequency the TEM horn antennas function similarly to standard rectangular horn antennas.
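
The volume-reduction argument rests on a standard waveguide fact (general background, not from the paper): an all-metal rectangular horn cannot propagate below the TE₁₀ cutoff of its guide, whereas a TEM mode has no cutoff.

```latex
% TE10 cutoff of a rectangular guide with broad-wall width a:
f_{c,\mathrm{TE}_{10}} = \frac{c}{2a},
% while a TEM mode has f_c = 0, so a TEM horn can remain electrically
% small and still operate below c/(2a).
```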

Modeling an unstructured driving domain: A comparison of two cognitive frameworks

2008 BRIMS Conference - Behavior Representation in Modeling and Simulation

Best, Bradley J.; Dixon, Kevin R.; Speed, Ann; Fleetwood, Michael D.

This paper outlines a comparison between two cognitive modeling frameworks: Atomic Components of Thought - Rational (ACT-R; Anderson & Lebiere, 1998) and a framework under development at Sandia National Laboratories. Both frameworks are based on the cognitive psychological literature, although they represent different theoretical perspectives on cognition, with ACT-R being a production-rule-based system and the Sandia framework being a dynamical-systems or connectionist-type approach. This comparison involved a complex driving domain in which both the car being driven and the driver were equipped with sensors that provided information to each framework. The output of each framework was a classification of the real-world situation that the driver was in, e.g., being overtaken on the autobahn. Comparisons between the two frameworks included validation against human ratings of the driving situations via videotapes of driving sessions, along with twelve creation and performance metrics regarding the method and ease of framework population, processor requirements, and maximum real-time data sampling rate.

Discussion tracking in enron email using PARAFAC

Survey of Text Mining II: Clustering, Classification, and Retrieval

Bader, Brett W.; Berry, Michael W.; Browne, Murray

In this chapter, we apply a nonnegative tensor factorization algorithm to extract and detect meaningful discussions from electronic mail messages for a period of one year. For the publicly released Enron electronic mail collection, we encode a sparse term-author-month array for subsequent three-way factorization using the PARAllel FACtors (or PARAFAC) three-way decomposition first proposed by Harshman. Using nonnegative tensors, we preserve natural data nonnegativity and avoid subtractive basis vector and encoding interactions present in techniques such as principal component analysis. Results in thread detection and interpretation are discussed in the context of published Enron business practices and activities, and benchmarks addressing the computational complexity of our approach are provided. The resulting tensor factorizations can be used to produce Gantt-like charts for assessing the duration, order, and dependencies of focused discussions against the progression of time. © 2008 Springer-Verlag London.
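
A minimal numpy sketch of nonnegative PARAFAC via multiplicative alternating updates on a dense stand-in for the term-author-month array; the tensor dimensions and rank are invented, the real Enron array is large and sparse, and the chapter's algorithm may differ in detail.

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Khatri-Rao product of B (J x R) and C (K x R) -> (J*K x R)."""
    R = B.shape[1]
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, R)

def nn_parafac(X, rank, n_iter=200, eps=1e-9):
    """Nonnegative CP decomposition of a 3-way array by multiplicative updates."""
    I, J, K = X.shape
    rng = np.random.default_rng(0)
    A = rng.uniform(size=(I, rank))
    B = rng.uniform(size=(J, rank))
    C = rng.uniform(size=(K, rank))
    X1 = X.reshape(I, -1)                     # mode-1 unfolding
    X2 = X.transpose(1, 0, 2).reshape(J, -1)  # mode-2 unfolding
    X3 = X.transpose(2, 0, 1).reshape(K, -1)  # mode-3 unfolding
    for _ in range(n_iter):
        A *= (X1 @ khatri_rao(B, C)) / (A @ ((B.T @ B) * (C.T @ C)) + eps)
        B *= (X2 @ khatri_rao(A, C)) / (B @ ((A.T @ A) * (C.T @ C)) + eps)
        C *= (X3 @ khatri_rao(A, B)) / (C @ ((A.T @ A) * (B.T @ B)) + eps)
    return A, B, C

# Toy stand-in for the term-author-month array (dimensions are made up).
X = np.random.default_rng(1).uniform(size=(40, 20, 12))
A, B, C = nn_parafac(X, rank=5)
print(A.shape, B.shape, C.shape)   # factor matrices: terms, authors, months
```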

Finite element solution of optimal control problems arising in semiconductor modeling

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Bochev, Pavel B.; Ridzal, Denis

Optimal design, parameter estimation, and inverse problems arising in the modeling of semiconductor devices lead to optimization problems constrained by systems of PDEs. We study the impact of different state equation discretizations on optimization problems whose objective functionals involve flux terms. Galerkin methods, in which the flux is a derived quantity, are compared with mixed Galerkin discretizations where the flux is approximated directly. Our results show that the latter approach leads to more robust and accurate solutions of the optimization problem, especially for highly heterogeneous materials with large jumps in material properties. © 2008 Springer.
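
A generic model problem of the kind described, in hedged form (the paper's exact functional and state equations may differ): a flux-matching objective constrained by an elliptic state equation, where the Galerkin approach recovers the flux from u_h while the mixed approach approximates σ directly.

```latex
% Generic flux-matching PDE-constrained optimization model problem:
\min_{u,\,g}\; \frac{1}{2}\int_{\Gamma} \lvert -a\,\nabla u\cdot n - q_{d}\rvert^{2}\,ds
   + \frac{\alpha}{2}\,\lVert g\rVert^{2}
\quad\text{s.t.}\quad -\nabla\cdot(a\,\nabla u) = f + g.
% Galerkin: flux is the derived quantity \sigma_h = -a\,\nabla u_h.
% Mixed Galerkin: \sigma is an independent unknown in H(\mathrm{div}):
%   \sigma + a\,\nabla u = 0, \qquad \nabla\cdot\sigma = f + g.
```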

New applications of the Verdict library for standardized mesh verification pre, post, and end-to-end processing

Proceedings of the 16th International Meshing Roundtable, IMR 2007

Pébay, Philippe P.; Thompson, David; Shepherd, Jason F.; Knupp, Patrick K.; Lisle, Curtis; Magnotta, Vincent A.; Grosland, Nicole M.

Verdict is a collection of subroutines for evaluating the geometric qualities of triangles, quadrilaterals, tetrahedra, and hexahedra using a variety of functions. A quality is a real number assigned to one of these shapes depending on its particular vertex coordinates. These functions are used to evaluate the input to finite element, finite volume, boundary element, and other types of solvers that approximate the solution to partial differential equations defined over regions of space. This article describes the most recent version of Verdict and provides a summary of the main properties of the quality functions offered by the library. Finally, it demonstrates the versatility and applicability of Verdict by illustrating its use in several scientific applications that pertain to pre-, post-, and end-to-end processing.
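
To make "quality" concrete, here is one standard tetrahedron quality function of the kind such libraries provide: a real number computed from vertex coordinates, normalized to 1 for the regular tetrahedron. This mean-ratio-style measure illustrates the idea and is not claimed to match Verdict's exact definitions.

```python
import numpy as np

def tet_quality(p0, p1, p2, p3):
    """Quality = 6*sqrt(2)*V / l_rms^3: 1 for a regular tet, -> 0 as it degenerates."""
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    volume = np.dot(p1 - p0, np.cross(p2 - p0, p3 - p0)) / 6.0
    edges = [p1 - p0, p2 - p0, p3 - p0, p2 - p1, p3 - p1, p3 - p2]
    l_rms = np.sqrt(np.mean([np.dot(e, e) for e in edges]))
    return 6.0 * np.sqrt(2.0) * volume / l_rms**3

# Regular tetrahedron scores ~1.0; a flattened sliver scores near 0.
reg = [(0, 0, 0), (1, 0, 0), (0.5, np.sqrt(3)/2, 0),
       (0.5, np.sqrt(3)/6, np.sqrt(6)/3)]
print(round(tet_quality(*reg), 3))                                       # ~1.0
print(round(tet_quality((0, 0, 0), (1, 0, 0), (0, 1, 0), (0.3, 0.3, 0.05)), 3))
```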

A selective approach to conformal refinement of unstructured hexahedral finite element meshes

Proceedings of the 16th International Meshing Roundtable, IMR 2007

Parrish, Michael; Borden, Michael; Staten, Matthew; Benzley, Steven

Hexahedral refinement increases the density of an all-hexahedral mesh in a specified region, improving numerical accuracy. Previous research based solely on sheet refinement theory was computationally expensive and unable to effectively handle concave refinement regions and self-intersecting hex sheets. The Selective Approach method is a new procedure that combines two diverse methodologies to create an efficient and robust algorithm able to handle the above-stated problems. These two refinement methods are: 1) element-by-element refinement and 2) directional refinement. In element-by-element refinement, the three inherent directions of a hexahedron are refined in one step using one of seven templates. Because of its computational superiority over directional refinement, but its inability to handle concavities, element-by-element refinement is used in all areas of the specified region except regions local to concavities. The directional refinement scheme refines the three inherent directions of a hexahedron separately on a hex-by-hex basis. This differs from sheet refinement, which refines hexahedra using hex sheets. Directional refinement is able to correctly handle concave refinement regions. A ranking system and propagation scheme allow directional refinement to work within the confines of the Selective Approach algorithm.

Methods and applications of generalized sheet insertion for hexahedral meshing

Proceedings of the 16th International Meshing Roundtable, IMR 2007

Merkley, Karl; Ernst, Corey; Shepherd, Jason F.; Borden, Michael J.

This paper presents methods and applications of sheet insertion in a hexahedral mesh. A hexahedral sheet is dual to a layer of hexahedra in a hexahedral mesh. Because of symmetries within a hexahedral element, every hexahedral mesh can be viewed as a collection of these sheets. It is possible to insert new sheets into an existing mesh, and these new sheets can be used to define new mesh boundaries, refine the mesh, or in some cases can be used to improve quality in an existing mesh. Sheet insertion has a broad range of possible applications including mesh generation, boundary refinement, R-adaptivity and joining existing meshes. Examples of each of these applications are demonstrated.

Performance of a pulsed ion beam with a renewable cryogenically cooled ion source

Laser and Particle Beams

Renk, T.J.; Mann, Gregory A.; Torres, G.A.

For operation of an ion source in an intense ion beam diode, it is desirable to form a localized and robust source of high purity. A cryogenically operated ion source has great promise, since the ions are formed from a condensed high-purity gas, which has been confined to a relatively thin ice layer on the anode surface. Previous experiments have established the principles of operation of such an ion source, but have been limited in repetitive duration due to the use of short-lived liquid He cooling of the anode surface. We detail here the successful development of a Cryo-Diode in which the cooling was achieved with a closed-cycle cryo-pump. This results in an ion source design that can potentially be operated for an indefinite duration. Time-of-flight measurements with Faraday cups indicate that the resultant ion beam is of high purity and is composed of singly charged ions formed from the gas frozen onto the anode surface. © 2008 Cambridge University Press.
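
The species identification behind those measurements is elementary time-of-flight kinematics (a generic relation, not quoted from the paper): an ion of mass m and charge q accelerated through the diode voltage V arrives after a drift time proportional to √(m/q), so masses and charge states separate at the Faraday cup.

```latex
qV = \tfrac{1}{2}\,m v^{2}
\quad\Longrightarrow\quad
t = \frac{L}{v} = L\,\sqrt{\frac{m}{2\,q\,V}},
% with L the drift length from anode to Faraday cup.
```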

Comparison of laboratory-scale solute transport visualization experiments with numerical simulation using cross-bedded sandstone

Advances in Water Resources

Tidwell, Vincent C.; Mckenna, Sean A.

Using a slab of Massillon Sandstone, laboratory-scale solute tracer experiments were carried out to test numerical simulations using the Advection-Dispersion Equation (ADE). While studies of a similar nature exist, our work differs in that we combine: (1) experimentation in naturally complex geologic media, (2) X-ray absorption imaging to visualize and quantify two-dimensional solute transport, (3) high resolution transport property characterization, with (4) numerical simulation. The simulations use permeability, porosity, and solute concentration measured to sub-centimeter resolution. While bulk breakthrough curve characteristics were adequately matched, large discrepancies exist between the experimental and simulated solute concentration fields. Investigation of potential experimental errors suggests that the failure to fit solute concentration fields may lie in loss of intricate connectivity within the cross-bedded sandstone occurring at scales finer than our property characterization measurements (i.e., sub-centimeter). © 2008 Elsevier Ltd. All rights reserved.
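
For reference, the governing model being tested is the advection-dispersion equation, in its standard form below (the study's parameterization is spatially variable at sub-centimeter resolution).

```latex
% Advection-dispersion equation for solute concentration C(x, t):
\frac{\partial C}{\partial t}
  = \nabla\cdot\bigl(\mathbf{D}\,\nabla C\bigr) - \mathbf{v}\cdot\nabla C,
% with \mathbf{D} the dispersion tensor and \mathbf{v} the pore-water
% velocity field derived from the measured permeability and porosity.
```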

Application-specific compression: final report

Melgaard, David K.; Lewis, Phillip; Lee, David S.; Carlson, Jeffrey; Byrne, Raymond H.; Harrison, Carol D.

With the continuing development of more capable data-gathering sensors comes increased demand on the bandwidth for transmitting larger quantities of data. To help counteract that trend, a study was undertaken to determine appropriate lossy data compression strategies for minimizing their impact on target detection and characterization. The survey of current compression techniques led us to the conclusion that wavelet compression was well suited for this purpose. Wavelet analysis essentially applies a low-pass and a high-pass filter to the data, converting the data into related coefficients that maintain spatial information as well as frequency information. Wavelet compression is achieved by zeroing the coefficients that pertain to the noise in the signal, i.e., the high-frequency, low-amplitude portion. This approach is well suited to our goal because it reduces the noise in the signal with only minimal impact on the larger, lower-frequency target signatures. The resulting coefficients can then be encoded using lossless techniques with higher compression levels because of the lower entropy and the significant number of zeros. No significant signal degradation or difficulties in target characterization or detection were observed or measured when wavelet compression was applied to simulated and real data, even when over 80% of the coefficients were zeroed. While the exact level of compression will be data-set dependent, for the data sets we studied, compression factors over 10 were found to be satisfactory where conventional lossless techniques achieved levels of less than 3.
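
A minimal sketch of the described pipeline — transform, zero the small detail coefficients, reconstruct — using the PyWavelets package; the wavelet family, decomposition level, and threshold are illustrative assumptions, not the report's settings.

```python
import numpy as np
import pywt  # PyWavelets; an assumed choice of wavelet toolkit

rng = np.random.default_rng(0)

# Synthetic "sensor" trace: a broad, low-frequency target signature plus noise.
t = np.linspace(0.0, 1.0, 1024)
signal = np.exp(-((t - 0.5) / 0.05) ** 2) + 0.05 * rng.standard_normal(t.size)

# Decompose; keep the low-frequency approximation, zero small detail coefficients.
approx, *details = pywt.wavedec(signal, 'db4', level=5)
all_details = np.concatenate([d.ravel() for d in details])
threshold = np.quantile(np.abs(all_details), 0.85)   # zero ~85% of detail coeffs
details = [np.where(np.abs(d) >= threshold, d, 0.0) for d in details]
recon = pywt.waverec([approx] + details, 'db4')[: signal.size]

rmse = np.sqrt(np.mean((signal - recon) ** 2))
print(f"zeroed ~85% of detail coefficients; reconstruction RMSE {rmse:.4f}")
```

The sparse, low-entropy coefficient arrays are what a lossless entropy coder would then compress, which is where the stated factors over 10 come from.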

Neutral atom traps

Pack, Michael P.

This report describes progress in designing a neutral atom trap capable of trapping sub-millikelvin atoms in a magnetic trap and shuttling the atoms across the atom chip from a collection area to an optical cavity. The numerical simulation and atom chip design are discussed. Also discussed are preliminary calculations of quantum noise sources in Kerr nonlinear optics measurements based on electromagnetically induced transparency. These types of measurements may be important for quantum nondemolition measurements at the few-photon limit.

Homeland security R&D roadmapping: risk-based methodological options

Brandt, Larry D.

The Department of Energy (DOE) National Laboratories support the Department of Homeland Security (DHS) in the development and execution of a research and development (R&D) strategy to improve the nation's preparedness against terrorist threats. Current approaches to planning and prioritization of DHS research decisions are informed by risk assessment tools and processes intended to allocate resources to programs that are likely to have the highest payoff. Early applications of such processes have faced challenges in several areas, including characterization of the intelligent adversary and linkage to strategic risk management decisions. The risk-based analysis initiatives at Sandia Laboratories could augment the methodologies currently being applied by the DHS and could support more credible R&D roadmapping for national homeland security programs. Implementation and execution issues facing homeland security R&D initiatives within the national laboratories emerged as a particular concern in this research.

Enhanced Geothermal Systems (EGS) Well Construction Technology Evaluation Report

Polsky, Yarom; Knudsen, Steven D.; Raymond, David W.

This report provides an assessment of well construction technology for EGS, with two primary objectives:

1. Determining the ability of existing technologies to develop EGS wells.
2. Identifying critical well construction research lines and development technologies that are likely to enhance prospects for EGS viability and improve overall economics.

Parallel tetrahedral mesh refinement with MOAB

Thompson, David; Pebay, Philippe P.

This report details work done to implement parallel, edge-based tetrahedral refinement in MOAB. The theoretical basis for this work is contained in [PT04, PT05, TP06], while information on design, performance, and operation specific to MOAB is contained herein. As MOAB is intended mainly for use in pre-processing and simulation (as opposed to the post-processing bent of the previous papers), the primary use case is different: rather than refining elements with non-linear basis functions, the goal is to increase the number of degrees of freedom in some region in order to more accurately represent the solution to some system of equations that cannot be solved analytically. Also, MOAB has a unique mesh representation, which impacts the algorithm. This introduction contains a brief review of streaming edge-based tetrahedral refinement. The remainder of the report is broken into three sections: design and implementation, performance, and conclusions. Appendix A contains instructions for end users (simulation authors) on how to employ the refiner.
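
For intuition about edge-based tetrahedral refinement, the sketch below implements only the classic uniform 1:8 midpoint template (four corner tets plus four tets from the interior octahedron); MOAB's refiner additionally handles partial edge marking, streaming, and parallel consistency, and its vertex-ordering conventions will differ.

```python
import numpy as np

def refine_tet_uniform(verts):
    """Uniform 1:8 refinement of one tetrahedron.

    verts: (4, 3) array of corner coordinates. Returns a list of eight
    (4, 3) arrays. Child orientation/handedness is not normalized here.
    """
    v = np.asarray(verts, dtype=float)
    m = {(i, j): 0.5 * (v[i] + v[j])
         for i in range(4) for j in range(i + 1, 4)}   # 6 edge midpoints
    # Four corner tets, one per original vertex.
    children = [
        np.array([v[0], m[0, 1], m[0, 2], m[0, 3]]),
        np.array([v[1], m[0, 1], m[1, 2], m[1, 3]]),
        np.array([v[2], m[0, 2], m[1, 2], m[2, 3]]),
        np.array([v[3], m[0, 3], m[1, 3], m[2, 3]]),
    ]
    # Interior octahedron split into four tets along the (m02, m13) diagonal.
    d0, d1 = m[0, 2], m[1, 3]
    ring = [m[0, 1], m[0, 3], m[2, 3], m[1, 2]]        # equatorial cycle
    for a, b in zip(ring, ring[1:] + ring[:1]):
        children.append(np.array([d0, d1, a, b]))
    return children

tet = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(len(refine_tet_uniform(tet)))   # 8
```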

Advanced engineering environment collaboration project

Dankiewicz, Robert J.; Dutra, Edward G.; Kiba, Grant W.; Lamph, Jane A.; Marburger, Scot J.

The Advanced Engineering Environment (AEE) is a model for an engineering design and communications system that will enhance project collaboration throughout the nuclear weapons complex (NWC). Sandia National Laboratories and Parametric Technology Corporation (PTC) worked together on a prototype project to evaluate the suitability of a portion of PTC's Windchill 9.0 suite of data management, design, and collaboration tools as the basis for an AEE. The AEE project team implemented Windchill 9.0 development servers in both classified and unclassified domains and used them to test and evaluate the Windchill tool suite against the needs of the NWC using weapons project use cases. A primary deliverable was the development of a new real-time collaborative desktop design and engineering process using PDMLink (data management tool), Pro/Engineer (mechanical computer-aided design tool), and ProductView Lite (visualization tool). Additional project activities included evaluations of PTC's electrical computer-aided design, visualization, and engineering calculations applications. This report documents the AEE project work to share information and lessons learned with other NWC sites. It also provides PTC with recommendations for improving their products for NWC applications.

Use of ceragenins to create novel biofouling resistant water-treatment membranes

Altman, Susan J.; Hibbs, Michael; Jones, Howland D.T.; Fellows, Benjamin D.

Scoping studies have demonstrated that ceragenins, when linked to water-treatment membranes, have the potential to create biofouling-resistant water-treatment membranes. Ceragenins are synthetically produced molecules that mimic antimicrobial peptides. Evidence includes measurements of CSA-13 inhibiting the growth of, and killing, planktonic Pseudomonas fluorescens. In addition, imaging of biofilms that were in contact with a ceragenin showed more dead cells relative to live cells than in a biofilm that had not been treated with a ceragenin. This work has demonstrated that ceragenins can be attached to polyamide reverse osmosis (RO) membranes, though further work is needed to improve the uniformity of the attachment. Finally, methods have been developed to use hyperspectral imaging with multivariate curve resolution to view ceragenins attached to the RO membrane. Future work will be conducted to better attach the ceragenins to the RO membranes and to more completely test the biocidal effectiveness of the ceragenins on the membranes.

A dual neutron/gamma source for the Fissmat Inspection for Nuclear Detection (FIND) system

Antolak, Arlyn J.; Doyle, B.L.; King, Michael; Provencio, P.N.; Raber, Thomas

Shielded special nuclear material (SNM) is very difficult to detect and new technologies are needed to clear alarms and verify the presence of SNM. High-energy photons and neutrons can be used to actively interrogate for heavily shielded SNM, such as highly enriched uranium (HEU), since neutrons can penetrate gamma-ray shielding and gamma-rays can penetrate neutron shielding. Both source particles then induce unique detectable signals from fission. In this LDRD, we explored a new type of interrogation source that uses low-energy proton- or deuteron-induced nuclear reactions to generate high fluxes of mono-energetic gammas or neutrons. Accelerator-based experiments, computational studies, and prototype source tests were performed to obtain a better understanding of (1) the flux requirements, (2) fission-induced signals, background, and interferences, and (3) operational performance of the source. The results of this research led to the development and testing of an axial-type gamma tube source and the design/construction of a high power coaxial-type gamma generator based on the ¹¹B(p,γ)¹²C nuclear reaction.

A surety engineering framework to reduce cognitive systems risks

Peercy, David E.

Cognitive science research investigates the advancement of human cognition and neuroscience capabilities. Addressing risks associated with these advancements can counter potential program failures, legal and ethical issues, constraints to scientific research, and product vulnerabilities. Survey results, focus group discussions, cognitive science experts, and surety researchers concur that technical risks exist that could impact cognitive science research in areas such as medicine, privacy, human enhancement, law and policy, military applications, and national security (SAND2006-6895). This SAND report documents a surety engineering framework and a process for identifying cognitive system technical, ethical, legal, and societal risks and applying appropriate surety methods to reduce such risks. The framework consists of several models: Specification, Design, Evaluation, Risk, and Maturity. Two detailed case studies are included to illustrate the use of the process and framework. Several appendices provide detailed information on existing cognitive system architectures; ethical, legal, and societal risk research; surety methods and technologies; and educing information research with a case study vignette. The process and framework provide a model for how cognitive systems research and full-scale product development can apply surety engineering to reduce perceived and actual risks.

Science, Technology, Engineering, and Mathematics (STEM) career attractiveness system dynamics modeling

Kelic, Andjelka; Zagonel, Aldo A.

A system dynamics model was developed in response to the apparent decline in STEM candidates in the United States and a pending shortage. The model explores the attractiveness of STEM and STEM careers focusing on employers and the workforce. Policies such as boosting STEM literacy, lifting the H-1B visa cap, limiting the offshoring of jobs, and maintaining training are explored as possible solutions. The system is complex, with many feedbacks and long time delays, so solutions that focus on a single point of the system are not effective and cannot solve the problem. A deeper understanding of parts of the system that have not been explored to date is necessary to find a workable solution.
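
A toy sketch of the stock-and-flow structure such models use — a student stock feeding a workforce stock through a training delay, with attractiveness as a feedback signal — integrated by Euler stepping. All stocks, rates, and delays are invented for illustration and are not the model's actual structure or calibration.

```python
# Toy STEM-pipeline system dynamics: enrollment responds to perceived career
# attractiveness, which falls as the workforce gap closes; training acts as
# a first-order delay between the two stocks.
dt, years = 0.25, 40
students, workforce = 200.0, 1000.0
demand, train_time, career_len = 1200.0, 4.0, 30.0   # assumed parameters

for _ in range(int(years / dt)):
    gap = max(demand - workforce, 0.0)
    attractiveness = gap / demand                 # crude perceived signal
    enrollment = 50.0 + 100.0 * attractiveness    # inflow to student stock
    graduation = students / train_time            # delayed outflow to workforce
    retirement = workforce / career_len
    students += dt * (enrollment - graduation)
    workforce += dt * (graduation - retirement)

print(f"workforce after {years} y: {workforce:.0f} (demand {demand:.0f})")
```

Even this toy version shows why single-point interventions lag: enrollment changes take a training delay to reach the workforce stock, during which the attractiveness signal keeps shifting.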
