An important step toward realizing the advantages of quantum dots in electro-optic applications is understanding how optical properties depend on excitation. This paper discusses results obtained using a microscopic theory. The calculations uncovered complicated carrier-density and electronic-structure influences on absorption, gain, and refractive index that can be attributed to a delicate balance of electronic-structure and many-body effects in a coupled quantum-dot/quantum-well system.
During the 1990s the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in preparing responses to Generic Letter 88-20, Supplement 4, "Individual Plant Examination - External Events" (IPEEE). This effort produced an at-power fire risk assessment methodology that was used by the majority of US nuclear power plants (NPPs) in support of the IPEEE program, as well as by several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for Risk-Informed/Performance-Based (RI/PB) applications. In 2001 EPRI and the NRC Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state of the art in fire risk assessment in support of this new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled "Fire PRA Methodology for Nuclear Power Facilities," which addresses fire risk for at-power operations. That document developed: 1) the process for identifying and including post-fire Human Failure Events (HFEs), 2) the methodology for assigning quantitative screening values to these HFEs, and 3) initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate Human Error Probabilities (HEPs). However, it does not describe a methodology for developing these best-estimate HEPs given the PSFs and fire-related effects. In 2007 EPRI and NRC RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire-generated conditions, building upon existing human reliability analysis (HRA) methods.
This paper describes the progress to date on the development and testing of the fire HRA methodology, which includes addressing the range of fire procedures used in existing plants, the range of strategies for main control room abandonment, and the potential impact of fire-induced spurious electrical effects on crew performance. In addition to developing a detailed HRA approach, one goal of the project is to develop a fire HRA scoping quantification approach that allows derivation of more realistic HEPs than those of the screening approach in NUREG/CR-6850 (EPRI 1011989), while requiring fewer analytical resources than a detailed HRA. In this approach, detailed HRA is reserved for the more complex actions that cannot meet the criteria of the scoping approach.
A simple and effective approach is presented to construct coarse spaces for overlapping Schwarz preconditioners. The approach is based on energy minimizing extensions of coarse trace spaces, and can be viewed as a generalization of earlier work by Dryja, Smith, and Widlund. The use of these coarse spaces in overlapping Schwarz preconditioners leads to condition numbers bounded by C(1 + H/δ)(1 + log(H/h)) for certain problems when coefficient jumps are aligned with subdomain boundaries. For problems without coefficient jumps, it is possible to remove the log(H/h) factor in this bound by a suitable enrichment of the coarse space. Comparisons are made with the coarse spaces of two other substructuring preconditioners. Numerical examples are also presented for a variety of problems.
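The effect being quantified above can be sketched numerically. The example below is a generic one-level additive Schwarz preconditioner on a 1D Poisson problem, written from scratch for illustration; it does not include the energy-minimizing coarse space that is the paper's contribution, and all problem sizes and overlap widths are arbitrary choices:

```python
import numpy as np

# 1D Poisson (Laplacian) stiffness matrix on n interior grid points.
n = 63
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# Partition the grid into overlapping subdomains (illustrative sizes).
num_sub, delta = 4, 2  # number of subdomains, overlap in grid points
bounds = np.linspace(0, n, num_sub + 1).astype(int)
subdomains = [np.arange(max(0, bounds[s] - delta), min(n, bounds[s + 1] + delta))
              for s in range(num_sub)]

def apply_preconditioner(r):
    """One-level additive Schwarz: M^{-1} r = sum of local subdomain solves."""
    z = np.zeros_like(r)
    for idx in subdomains:
        A_local = A[np.ix_(idx, idx)]
        z[idx] += np.linalg.solve(A_local, r[idx])
    return z

# Assemble M^{-1} column by column and compare spectral condition numbers.
M_inv = np.column_stack([apply_preconditioner(e) for e in np.eye(n)])
eigs = np.linalg.eigvals(M_inv @ A).real  # real and positive: A, M are SPD
cond_A = np.linalg.cond(A)
cond_MA = eigs.max() / eigs.min()
print(cond_A, cond_MA)  # expect a much smaller condition number for M^{-1}A
```

Even without a coarse space, the local solves sharply reduce the condition number; the coarse spaces discussed in the abstract are what keep the bound independent of the number of subdomains.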
Several antennas with integrated high-impedance surfaces are presented. The high-impedance surface is implemented as a composite right/left-handed (CRLH) metamaterial fabricated from a periodic structure characterized by a substrate, filled with an array of vertical vias and capped by capacitive patches. Omnidirectional antennas placed in close proximity to the high-impedance surface radiate hemispherically with an increase in boresight far-field pattern gain of up to 10 dB and a front-to-back ratio as high as 13 dB at 2.45 GHz. Several TEM rectangular horn antennas are realized by replacing conductor walls with high-impedance surfaces. The TEM horn antennas are capable of operating below the TE₁₀ cutoff frequency of a standard all-metal horn antenna, enabling a reduction in antenna volume. Above the cutoff frequency the TEM horn antennas function similarly to standard rectangular horn antennas.
2008 BRIMS Conference - Behavior Representation in Modeling and Simulation
Best, Bradley J.; Dixon, Kevin R.; Speed, Ann; Fleetwood, Michael D.
This paper outlines a comparison between two cognitive modeling frameworks: Atomic Components of Thought - Rational (ACT-R; Anderson & Lebiere, 1998) and a framework under development at Sandia National Laboratories. Both frameworks are based on the cognitive psychological literature, although they represent different theoretical perspectives on cognition, with ACT-R being a production-rule-based system and the Sandia framework being a dynamical-systems or connectionist-type approach. This comparison involved a complex driving domain in which both the car being driven and the driver were equipped with sensors that provided information to each framework. The output of each framework was a classification of the real-world situation that the driver was in, e.g., being overtaken on the autobahn. Comparisons between the two frameworks included validation against human ratings of the driving situations via videotapes of driving sessions, along with twelve creation and performance metrics regarding the method and ease of framework population, processor requirements, and maximum real-time data sampling rate.
Verdict is a collection of subroutines for evaluating the geometric quality of triangles, quadrilaterals, tetrahedra, and hexahedra using a variety of functions. A quality is a real number assigned to one of these shapes depending on its particular vertex coordinates. These functions are used to evaluate the input to finite element, finite volume, boundary element, and other types of solvers that approximate the solution to partial differential equations defined over regions of space. This article describes the most recent version of Verdict and provides a summary of the main properties of the quality functions offered by the library. Finally, it demonstrates the versatility and applicability of Verdict by illustrating its use in several scientific applications spanning pre-, post-, and end-to-end processing.
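To make the notion of a quality function concrete, the sketch below implements one classic metric, the radius ratio of a triangle. This is a generic textbook metric written from scratch, not Verdict's actual API or one of its named functions:

```python
import math

def triangle_radius_ratio(p0, p1, p2):
    """Quality in (0, 1]: inradius over circumradius, scaled so an
    equilateral triangle scores exactly 1 and a degenerate one scores 0.
    (Illustrative stand-in for a Verdict-style quality function.)"""
    a = math.dist(p1, p2)
    b = math.dist(p0, p2)
    c = math.dist(p0, p1)
    s = 0.5 * (a + b + c)                              # semi-perimeter
    area = math.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))  # Heron
    if area == 0.0:
        return 0.0                                     # collinear vertices
    r_in = area / s                                    # inradius
    r_circ = a * b * c / (4.0 * area)                  # circumradius
    return 2.0 * r_in / r_circ

equilateral = triangle_radius_ratio((0, 0), (1, 0), (0.5, math.sqrt(3) / 2))
sliver = triangle_radius_ratio((0, 0), (1, 0), (0.5, 0.01))
```

A solver front end would evaluate such a function on every element of a mesh and flag elements whose quality falls below a tolerance.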
Proceedings of the 16th International Meshing Roundtable, IMR 2007
Parrish, Michael; Borden, Michael; Staten, Matthew; Benzley, Steven
Hexahedral refinement increases the density of an all-hexahedral mesh in a specified region, improving numerical accuracy. Previous research based solely on sheet refinement theory produced implementations that were computationally expensive and unable to handle concave refinement regions and self-intersecting hex sheets effectively. The Selective Approach method is a new procedure that combines two diverse methodologies to create an efficient and robust algorithm able to handle the problems stated above. These two refinement methods are: 1) element-by-element refinement and 2) directional refinement. In element-by-element refinement, the three inherent directions of a hexahedron are refined in one step using one of seven templates. Because of its computational superiority over directional refinement, but its inability to handle concavities, element-by-element refinement is used in all areas of the specified region except those local to concavities. The directional refinement scheme refines the three inherent directions of a hexahedron separately on a hex-by-hex basis. This differs from sheet refinement, which refines hexahedra using hex sheets. Directional refinement is able to correctly handle concave refinement regions. A ranking system and propagation scheme allow directional refinement to work within the confines of the Selective Approach algorithm.
This paper presents methods and applications of sheet insertion in a hexahedral mesh. A hexahedral sheet is dual to a layer of hexahedra in a hexahedral mesh. Because of symmetries within a hexahedral element, every hexahedral mesh can be viewed as a collection of these sheets. It is possible to insert new sheets into an existing mesh, and these new sheets can be used to define new mesh boundaries, refine the mesh, or in some cases can be used to improve quality in an existing mesh. Sheet insertion has a broad range of possible applications including mesh generation, boundary refinement, R-adaptivity and joining existing meshes. Examples of each of these applications are demonstrated.
With the continuing development of more capable data-gathering sensors comes an increased demand on the bandwidth needed to transmit larger quantities of data. To help counteract that trend, a study was undertaken to determine lossy data compression strategies that minimize the impact of compression on target detection and characterization. A survey of current compression techniques led us to conclude that wavelet compression was well suited for this purpose. Wavelet analysis essentially applies a low-pass and a high-pass filter to the data, converting the data into related coefficients that maintain spatial as well as frequency information. Wavelet compression is achieved by zeroing the coefficients that pertain to the noise in the signal, i.e., the high-frequency, low-amplitude portion. This approach is well suited to our goal because it reduces the noise in the signal with only minimal impact on the larger, lower-frequency target signatures. The resulting coefficients can then be encoded using lossless techniques at higher compression levels because of the lower entropy and significant number of zeros. No significant signal degradation or difficulty in target characterization or detection was observed or measured when wavelet compression was applied to simulated and real data, even when over 80% of the coefficients were zeroed. While the exact level of compression is data-set dependent, for the data sets we studied, compression factors over 10 were found to be satisfactory, whereas conventional lossless techniques achieved levels of less than 3.
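The filter-and-threshold idea can be sketched with a single-level Haar transform. The signal, noise level, and threshold below are hypothetical; the study's actual wavelet family, decomposition depth, and encoder are not specified here:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar wavelet transform: low-pass (pairwise sums)
    and high-pass (pairwise differences), both orthonormally scaled."""
    lo = (x[0::2] + x[1::2]) / np.sqrt(2)  # smooth approximation
    hi = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail coefficients
    return lo, hi

def inverse_haar_step(lo, hi):
    """Exact inverse of haar_step."""
    x = np.empty(2 * lo.size)
    x[0::2] = (lo + hi) / np.sqrt(2)
    x[1::2] = (lo - hi) / np.sqrt(2)
    return x

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024)
signal = np.exp(-((t - 0.5) / 0.05) ** 2)             # broad "target" feature
noisy = signal + 0.02 * rng.standard_normal(t.size)   # low-amplitude noise

lo, hi = haar_step(noisy)
hi_thresh = np.where(np.abs(hi) < 0.05, 0.0, hi)      # zero small detail coeffs
recon = inverse_haar_step(lo, hi_thresh)              # denoised reconstruction
```

Because the smooth target feature lives almost entirely in the low-pass coefficients, the vast majority of detail coefficients can be zeroed (and then compressed losslessly at high ratios) while the reconstruction stays close to the underlying signal.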
This report describes progress in designing a neutral atom trap capable of trapping sub-millikelvin atoms in a magnetic trap and shuttling the atoms across the atom chip from a collection area to an optical cavity. The numerical simulation and atom chip design are discussed. Also discussed are preliminary calculations of quantum noise sources in Kerr nonlinear optics measurements based on electromagnetically induced transparency. These types of measurements may be important for quantum nondemolition measurements at the few-photon limit.
The Department of Energy (DOE) National Laboratories support the Department of Homeland Security (DHS) in the development and execution of a research and development (R&D) strategy to improve the nation's preparedness against terrorist threats. Current approaches to planning and prioritization of DHS research decisions are informed by risk assessment tools and processes intended to allocate resources to programs that are likely to have the highest payoff. Early applications of such processes have faced challenges in several areas, including characterization of the intelligent adversary and linkage to strategic risk management decisions. The risk-based analysis initiatives at Sandia National Laboratories could augment the methodologies currently being applied by the DHS and could support more credible R&D roadmapping for national homeland security programs. Implementation and execution issues facing homeland security R&D initiatives within the national laboratories emerged as a particular concern in this research.
This report provides an assessment of well construction technology for enhanced geothermal systems (EGS), with two primary objectives: 1) determining the ability of existing technologies to develop EGS wells, and 2) identifying critical well construction research and technology development areas that are likely to enhance prospects for EGS viability and improve overall economics.
This report details the novel functionality of parallel, edge-based, tetrahedral mesh refinement that we have implemented in MOAB. The theoretical basis for this work is contained in [PT04, PT05, TP06], while information on design, performance, and operation specific to MOAB is contained herein. As MOAB is intended mainly for use in pre-processing and simulation (as opposed to the post-processing bent of the previous papers), the primary use case is different: rather than refining elements with non-linear basis functions, the goal is to increase the number of degrees of freedom in some region in order to more accurately represent the solution to some system of equations that cannot be solved analytically. Also, MOAB has a unique mesh representation, which impacts the algorithm. This introduction contains a brief review of streaming edge-based tetrahedral refinement. The remainder of the report is broken into three sections: design and implementation, performance, and conclusions. Appendix A contains instructions for end users (simulation authors) on how to employ the refiner.
The Advanced Engineering Environment (AEE) is a model for an engineering design and communications system that will enhance project collaboration throughout the nuclear weapons complex (NWC). Sandia National Laboratories and Parametric Technology Corporation (PTC) worked together on a prototype project to evaluate the suitability of a portion of PTC's Windchill 9.0 suite of data management, design, and collaboration tools as the basis for an AEE. The AEE project team implemented Windchill 9.0 development servers in both classified and unclassified domains and used them to test and evaluate the Windchill tool suite relative to the needs of the NWC, using weapons project use cases. A primary deliverable was the development of a new real-time collaborative desktop design and engineering process using PDMLink (data management tool), Pro/Engineer (mechanical computer-aided design tool), and ProductView Lite (visualization tool). Additional project activities included evaluations of PTC's electrical computer-aided design, visualization, and engineering calculations applications. This report documents the AEE project work to share information and lessons learned with other NWC sites. It also provides PTC with recommendations for improving their products for NWC applications.
Scoping studies have demonstrated that ceragenins, when linked to water-treatment membranes, have the potential to render those membranes resistant to biofouling. Ceragenins are synthetically produced molecules that mimic antimicrobial peptides. Evidence includes measurements of CSA-13 inhibiting the growth of, and killing, planktonic Pseudomonas fluorescens. In addition, imaging of biofilms that were in contact with a ceragenin showed more dead cells relative to live cells than in a biofilm that had not been treated with a ceragenin. This work has demonstrated that ceragenins can be attached to polyamide reverse osmosis (RO) membranes, though further work is needed to improve the uniformity of the attachment. Finally, methods have been developed to use hyperspectral imaging with multivariate curve resolution to view ceragenins attached to the RO membrane. Future work will be conducted to better attach the ceragenins to the RO membranes and to more completely test the biocidal effectiveness of the ceragenins on the membranes.
Shielded special nuclear material (SNM) is very difficult to detect, and new technologies are needed to clear alarms and verify the presence of SNM. High-energy photons and neutrons can be used to actively interrogate for heavily shielded SNM, such as highly enriched uranium (HEU), since neutrons can penetrate gamma-ray shielding and gamma rays can penetrate neutron shielding. Both source particles then induce unique detectable signals from fission. In this LDRD, we explored a new type of interrogation source that uses low-energy proton- or deuteron-induced nuclear reactions to generate high fluxes of mono-energetic gammas or neutrons. Accelerator-based experiments, computational studies, and prototype source tests were performed to obtain a better understanding of (1) the flux requirements, (2) fission-induced signals, background, and interferences, and (3) operational performance of the source. The results of this research led to the development and testing of an axial-type gamma tube source and the design/construction of a high-power coaxial-type gamma generator based on the ¹¹B(p,γ)¹²C nuclear reaction.
Cognitive science research investigates the advancement of human cognition and neuroscience capabilities. Addressing risks associated with these advancements can counter potential program failures, legal and ethical issues, constraints to scientific research, and product vulnerabilities. Survey results and focus-group discussions with cognitive science experts and surety researchers indicate that technical risks exist that could impact cognitive science research in areas such as medicine, privacy, human enhancement, law and policy, military applications, and national security (SAND2006-6895). This SAND report documents a surety engineering framework and a process for identifying cognitive system technical, ethical, legal, and societal risks and applying appropriate surety methods to reduce such risks. The framework consists of several models: Specification, Design, Evaluation, Risk, and Maturity. Two detailed case studies are included to illustrate the use of the process and framework. Several appendices provide detailed information on existing cognitive system architectures; ethical, legal, and societal risk research; surety methods and technologies; and educing-information research with a case study vignette. The process and framework provide a model for how cognitive systems research and full-scale product development can apply surety engineering to reduce perceived and actual risks.
A system dynamics model was developed in response to the apparent decline in STEM candidates in the United States and a pending shortage. The model explores the attractiveness of STEM education and STEM careers, focusing on employers and the workforce. Policies such as boosting STEM literacy, lifting the H-1B visa cap, limiting the offshoring of jobs, and maintaining training are explored as possible solutions. The system is complex, with many feedbacks and long time delays, so solutions that focus on a single point of the system are ineffective and cannot solve the problem. A deeper understanding of the parts of the system that have not yet been explored is necessary to find a workable solution.
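The kind of delay-driven behavior that defeats single-point policies can be illustrated with a minimal stock-and-flow sketch. All stocks, flows, and parameter values below are entirely hypothetical, not taken from the calibrated model described above:

```python
# Minimal system-dynamics sketch: a training pipeline feeding a workforce.
# A shortage boosts enrollment, but graduates arrive only after a long
# training delay, so the workforce overshoots and oscillates instead of
# settling directly at the demanded level. (Illustrative parameters only.)
years, dt = 40, 0.25
train_delay = 6.0    # years from enrollment to workforce entry
career = 30.0        # average working career length, years
demand = 12.0        # employer demand (arbitrary units of workers)

students, workers = 1.0, 10.0   # initial stocks
hist = []
for _ in range(int(years / dt)):
    shortage = max(demand - workers, 0.0)
    enroll = 0.2 + 0.3 * shortage     # attractiveness rises with shortage
    graduate = students / train_delay # first-order training delay
    retire = workers / career
    students += dt * (enroll - graduate)
    workers += dt * (graduate - retire)
    hist.append(workers)
```

Even this two-stock toy overshoots its own equilibrium and rings for decades before settling, which is the basic reason a policy tuned to any single point of such a system keeps missing its target.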