With the build-out of large transport networks utilizing optical technologies, more and more capacity is being made available. Innovations in Dense Wavelength Division Multiplexing (DWDM) and the elimination of optical-electrical-optical conversions have enabled advances in communication speeds as we move into 10 Gigabit Ethernet and above. Of course, there is a need to encrypt data on these optical links as the data traverse public and private network backbones. Unfortunately, as the communications infrastructure becomes increasingly optical, advances in encryption (done electronically) have failed to keep pace. This project examines the use of optical logic for implementing encryption in the photonic domain to achieve the requisite encryption rates. To realize photonic encryption designs, technology developed for electrical logic circuits must be translated to the photonic regime. This paper examines two classes of all-optical logic (SEED and gain competition) and how each discrete logic element can be interconnected and cascaded to form an optical circuit. Because there is no known software that can model these devices at a circuit level, the functionality of the SEED and gain competition devices in an optical circuit was modeled in PSpice. PSpice allows modeling of the macro characteristics of the devices in the context of a logic element, as opposed to device-level computational modeling. By representing light intensity as voltage, 'black box' models are generated that accurately represent the intensity response and logic levels in both technologies. By modeling the behavior at the systems level, one can incorporate systems design tools and a simulation environment to aid in the overall functional design. Each black box model of a SEED or gain competition device takes certain parameters (reflectance, intensity, input response) and models the optical ripple and time delay characteristics of the device.
These 'black box' models are interconnected and cascaded in an encrypting/scrambling algorithm based on a study of candidate encryption algorithms. We found that a low-gate-count, cascadable encryption algorithm is most feasible given device and processing constraints. The modeling and simulation of optical designs using these components is proceeding in parallel with efforts to perfect the physical devices and their interconnects. We have applied these techniques to the development of a 'toy' algorithm that may pave the way for more robust optical algorithms. These design, modeling, and simulation techniques are now ready to be applied to larger optical designs in advance of our ability to implement such systems in hardware.
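The black-box idea can be sketched in ordinary code: a device is reduced to an intensity transfer curve (with light intensity standing in for voltage), a logic threshold, a ripple band, and nothing else, so gates can be cascaded without device-level physics. The sketch below is purely illustrative; the function names, threshold, logic levels, and ripple values are assumptions for demonstration, not measured SEED or gain-competition parameters.

```python
# Hypothetical 'black box' optical gate: intensity is treated as a voltage
# level, and the device is characterized only by a threshold, two logic
# levels, and an output ripple band. All numbers are illustrative.

def seed_like_gate(i_in, threshold=0.5, v_high=1.0, v_low=0.1, ripple=0.05):
    """Map an input intensity (as a 'voltage') to an output (lo, hi) band."""
    level = v_high if i_in >= threshold else v_low
    # The ripple band models the intensity variation riding on the logic level.
    return (level - ripple, level + ripple)

def cascade(i_in, stages=3, **params):
    """Cascade identical gates, propagating the worst-case (lowest) output
    to check that logic levels survive several stages of interconnect."""
    lo = i_in
    for _ in range(stages):
        lo, hi = seed_like_gate(lo, **params)
    return lo, hi
```

Cascading with the worst-case (low) edge of the ripple band is the system-level question such models answer: whether a degraded '1' at one stage is still above threshold at the next.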
Prokaryotic single-cell microbes are the simplest of all self-sufficient living organisms. Yet microbes create and use much of the molecular machinery present in more complex organisms, and the macro-molecules in microbial cells interact in regulatory, metabolic, and signaling pathways that are prototypical of the reaction networks present in all cells. We have developed a simple simulation model of a prokaryotic cell that treats proteins, protein complexes, and other organic molecules as particles which diffuse via Brownian motion and react with nearby particles in accord with chemical rate equations. The code models protein motion and chemistry within an idealized cellular geometry. It has been used to simulate several simple reaction networks and compared to more idealized models which do not include spatial effects. In this report we describe an initial version of the simulation code that was developed with FY03 funding. We discuss the motivation for the model, highlight its underlying equations, and describe simulations of a 3-stage kinase cascade and a portion of the carbon fixation pathway in the Synechococcus microbe.
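The particle scheme described above can be illustrated with a minimal sketch: each particle takes a Gaussian Brownian step, and a bimolecular reaction A + B -> C fires when two particles approach within a capture radius. The function names, the capture-radius rule, and all parameter values are illustrative assumptions, not the actual FY03 simulation code.

```python
import math
import random

def brownian_step(pos, D, dt, rng=random):
    """Displace a 3-D position by a Gaussian step with variance 2*D*dt
    per axis, the standard discretization of Brownian diffusion."""
    s = math.sqrt(2.0 * D * dt)
    return tuple(x + rng.gauss(0.0, s) for x in pos)

def react(a_list, b_list, radius):
    """Pair up A and B particles closer than `radius` (a simple stand-in for
    a chemical-rate-based reaction test); return survivors and C products."""
    products, used_b, survivors_a = [], set(), []
    for a in a_list:
        hit = None
        for j, b in enumerate(b_list):
            if j not in used_b and math.dist(a, b) < radius:
                hit = j
                break
        if hit is None:
            survivors_a.append(a)
        else:
            used_b.add(hit)
            # Place the product at the midpoint of the reacting pair.
            products.append(tuple((x + y) / 2 for x, y in zip(a, b_list[hit])))
    survivors_b = [b for j, b in enumerate(b_list) if j not in used_b]
    return survivors_a, survivors_b, products
```

A full simulation would loop these two steps over all species, with reaction probabilities calibrated against the chemical rate equations so that the well-mixed limit reproduces the idealized (non-spatial) models mentioned above.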
Algorithms for higher order accuracy modeling of kinematic behavior within the ALEGRA framework are presented. These techniques improve the behavior of the code when kinematic errors are found, ensure orthonormality of the rotation tensor at each time step, and increase the accuracy of the Lagrangian stretch and rotation tensor update algorithm. The implementation of these improvements in ALEGRA is described. A short discussion of issues related to improving the accuracy of the stress update procedures is also included.
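The report's specific algorithm is not reproduced here, but the orthonormality requirement itself is easy to illustrate: accumulated update error lets the rows of the rotation tensor drift from unit length and mutual orthogonality, and a re-orthonormalization pass restores them. The sketch below uses one standard approach (modified Gram-Schmidt plus a cross product for the third row) as an assumed example, not necessarily the procedure implemented in ALEGRA.

```python
def reorthonormalize(R):
    """Restore a drifted 3x3 'rotation' matrix (list of three row vectors)
    to a right-handed orthonormal triad via Gram-Schmidt on its rows."""
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    def scale(u, s): return [a * s for a in u]
    def sub(u, v): return [a - b for a, b in zip(u, v)]
    # Normalize the first row.
    r0 = scale(R[0], 1.0 / dot(R[0], R[0]) ** 0.5)
    # Remove the r0 component from the second row, then normalize.
    r1 = sub(R[1], scale(r0, dot(R[1], r0)))
    r1 = scale(r1, 1.0 / dot(r1, r1) ** 0.5)
    # The cross product guarantees the third row completes a right-handed triad.
    r2 = [r0[1] * r1[2] - r0[2] * r1[1],
          r0[2] * r1[0] - r0[0] * r1[2],
          r0[0] * r1[1] - r0[1] * r1[0]]
    return [r0, r1, r2]
```

Applying such a pass at each time step prevents the small non-orthogonality introduced by each stretch/rotation update from compounding over a long Lagrangian calculation.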
A Simple Removable Epoxy Foam (SREF) decomposition chemistry model has been developed to predict the decomposition behavior of an epoxy foam encapsulant exposed to high temperatures. The foam is composed of an epoxy polymer, a blowing agent, and a surfactant. The model is based on a simple four-step mass loss model using distributed Arrhenius reaction rates. A single reaction was used to describe desorption of the blowing agent and surfactant (BAS). Three of the reactions were used to describe degradation of the polymer. The coordination number of the polymeric lattice was determined from the chemical structure of the polymer, and a lattice statistics model was used to describe the evolution of polymer fragments. The model lattice was composed of sites connected by octamethylcyclotetrasiloxane (OS) bridges, mixed product (MP) bridges, and bisphenol-A (BPA) bridges. The mixed products were treated as a single species, but are likely composed of phenols, cresols, and furan-type products. Eleven species are considered in the SREF model: (1) BAS, (2) OS, (3) MP, (4) BPA, (5) 2-mers, (6) 3-mers, (7) 4-mers, (8) nonvolatile carbon residue, (9) nonvolatile OS residue, (10) L-mers, and (11) XL-mers. The first seven of these species (VLE species) can be in either the condensed phase or the gas phase, as determined by a vapor-liquid equilibrium model based on the Rachford-Rice equation. The last four species always remain in the condensed phase. The 2-mers, 3-mers, and 4-mers are polymer fragments that contain two, three, or four sites, respectively. The residue can contain C, H, N, O, and/or Si. The L-mer fraction consists of polymer fragments that contain at least five sites (5-mers) up to a user-defined maximum mer size. The XL-mer fraction consists of polymer fragments greater than the user-specified maximum mer size and can contain the infinite lattice if the bridge population is less than the critical bridge population.
Model predictions are compared to 133 thermogravimetric analysis (TGA) experiments performed at 24 different conditions. The average RMS error between the model and the 133 experiments was 4.25%. The model was also used to predict the response of two other removable epoxy foams with different compositions, as well as the pressure rise in a constant-volume hot cell.
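The Rachford-Rice equation referenced above determines the vapor fraction V of the VLE species from feed mole fractions z_i and equilibrium ratios K_i by solving sum_i z_i*(K_i - 1)/(1 + V*(K_i - 1)) = 0 on [0, 1]. A minimal solver sketch follows; the bisection method and the two-component test values are generic illustrations, not SREF model parameters.

```python
def rachford_rice(z, K, tol=1e-10, itmax=200):
    """Solve sum_i z_i*(K_i - 1)/(1 + V*(K_i - 1)) = 0 for the vapor
    fraction V by bisection, clamping to 0 or 1 when the mixture is
    entirely liquid or entirely vapor."""
    def f(V):
        return sum(zi * (Ki - 1.0) / (1.0 + V * (Ki - 1.0))
                   for zi, Ki in zip(z, K))
    if f(0.0) <= 0.0:   # no positive root: all liquid
        return 0.0
    if f(1.0) >= 0.0:   # no root below 1: all vapor
        return 1.0
    lo, hi = 0.0, 1.0
    for _ in range(itmax):
        mid = 0.5 * (lo + hi)
        # f is monotonically decreasing in V, so f(mid) > 0 puts the root above mid.
        if f(mid) > 0.0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

For an equimolar two-component feed with K = (2.0, 0.5), the root is V = 0.5, which is a convenient sanity check for any implementation.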
Recyclable transmission lines (RTLs) are being studied as a means to repetitively drive z pinches to generate fusion energy. We have shown previously that the RTL mass can be quite modest. Minimizing the RTL mass reduces recycling costs and the impulse delivered to the first wall of a fusion chamber. Despite this reduction in mass, a few seconds will be needed to reload an RTL after each shot, in contrast to other inertial fusion approaches that expect to fire up to ten capsules per second. Thus a larger fusion yield is needed to compensate for the slower repetition rate in a z-pinch driven fusion reactor. We present preliminary designs of z-pinch driven fusion capsules that provide an adequate yield of 1-4 GJ. We also present numerical simulations of the effect of these fairly large fusion yields on the RTL and the first wall of the reactor chamber. These simulations were performed with and without a neutron-absorbing blanket surrounding the fusion explosion. We find that the RTL will be fully vaporized out to a radius of about 3 meters, assuming normal incidence. However, at large enough radius the RTL will remain in either the liquid or solid state, and this portion of the RTL could fragment and become shrapnel. We show that a dynamic fragmentation theory can be used to estimate the size of these fragmented particles. We discuss how proper design of the RTL can allow this shrapnel to be directed away from the sensitive mechanical parts of the reactor chamber.
A comprehensive settlement of the North Korean nuclear issue may involve military, economic, political, and diplomatic components, many of which will require verification to ensure reciprocal implementation. This paper sets out potential verification methodologies that might address a wide range of objectives. The inspection requirements set by the International Atomic Energy Agency form the foundation, first as defined at the time of the Agreed Framework in 1994, and now as modified by the events since revelation of the North Korean uranium enrichment program in October 2002. In addition, refreezing the reprocessing facility and the 5 MWe reactor, taking possession of possible weapons components, and destroying weaponization capabilities add many new verification tasks. The paper also considers several measures for the short-term freezing of the North's nuclear weapon program during the process of negotiations, should that process be protracted. New inspection technologies and monitoring tools are applicable to North Korean facilities and may offer improved approaches over those envisioned just a few years ago. These are noted, and potential bilateral and regional verification regimes are examined.
The distributed data problem is characterized by the desire to bring together semantically related data from syntactically unrelated portions of a term. Two strategic combinators, dynamic and transient, are introduced in the context of a classical strategic programming framework. The impact of the resulting system on instances of the distributed data problem is then explored.
Artificially structured photonic lattice materials are commonly investigated for their unique ability to block and guide light. However, an exciting aspect of photonic lattices which has received relatively little attention is the extremely high refractive index dispersion within the range of frequencies capable of propagating within the photonic lattice material. In fact, it has been proposed that a negative refractive index may be realized with the correct photonic lattice configuration. This report summarizes our investigation, both numerically and experimentally, into the design and performance of such photonic lattice materials intended to optimize the dispersion of refractive index in order to realize new classes of photonic devices.
Chemical synthesis methods are being developed as a future source of PZT 95/5 powder for neutron generator voltage bar applications. Laboratory-scale powder processes were established to produce PZT billets from these powders. The interactions between calcining temperature, sintering temperature, and pore former content were studied to identify the conditions necessary to produce PZT billets of the desired density and grain size. Several binder systems and pressing aids were evaluated for producing uniform sintered billets with low open porosity. The development of these processes supported the powder synthesis efforts and enabled comparisons between different chem-prep routes.
In this work we have demonstrated the fabrication of two different classes of devices that integrate simple MEMS structures with photonic structures. In the first class of devices, a suspended, movable Si waveguide was designed and fabricated. This waveguide was designed to be actuated so that it could be brought into close proximity to a ring resonator or similar structure. In the course of this work we also designed a technique to improve the input coupling to the waveguide. While these structures were successfully fabricated, post-fabrication handling and testing involved a significant amount of manipulation of the devices, and due to their relatively flimsy nature our structures could not readily survive this extra handling. As a result, we redesigned our devices so that instead of moving the waveguides themselves we moved a much smaller optical element into close proximity to the waveguides. Using this approach it was also possible to fabricate a much larger array of actively switched photonic devices: switches, ring resonators, couplers (which act as switches or splitters), and attenuators. We successfully fabricated all these structures and were able to demonstrate splitters, switches, and attenuators. The quality of the SiN waveguides fabricated in this work was found to be qualitatively comparable to that of waveguides made using semiconductor materials.
Solidification and blood flow seemingly have little in common, but each involves a fluid in contact with a deformable solid. In these systems, the solid-fluid interface moves as the solid advects and deforms, often traversing the entire domain of interest. Currently, these problems cannot be simulated without innumerable expensive remeshing steps, mesh manipulations or decoupling the solid and fluid motion. Despite the wealth of progress recently made in mechanics modeling, this glaring inadequacy persists. We propose a new technique that tracks the interface implicitly and circumvents the need for remeshing and remapping the solution onto the new mesh. The solid-fluid boundary is tracked with a level set algorithm that changes the equation type dynamically depending on the phases present. This novel approach to coupled mechanics problems promises to give accurate stresses, displacements and velocities in both phases, simultaneously.
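As a one-dimensional cartoon of the interface-tracking idea above (not the proposed multidimensional algorithm), a level set field phi can be advected with the local velocity, and the sign of phi tells each cell which phase it contains, and hence which equation type to solve there. All discretization choices below (first-order upwind, periodic grid) are illustrative assumptions.

```python
def advect_level_set(phi, u, dx, dt):
    """One first-order upwind step of phi_t + u*phi_x = 0 on a periodic
    1-D grid. The zero contour of phi marks the solid-fluid interface."""
    n = len(phi)
    new = [0.0] * n
    for i in range(n):
        if u >= 0.0:
            dphi = phi[i] - phi[i - 1]          # backward difference
        else:
            dphi = phi[(i + 1) % n] - phi[i]    # forward difference
        new[i] = phi[i] - u * dt / dx * dphi
    return new

def interface_cells(phi):
    """Indices of adjacent-cell sign changes in phi, i.e. where the
    interface sits and the governing equations switch type."""
    return [i for i in range(len(phi) - 1) if phi[i] * phi[i + 1] < 0.0]
```

Because the interface is carried implicitly by phi, the mesh itself never changes as the solid advects and deforms, which is what removes the remeshing and remapping steps criticized above.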
High throughput instruments and analysis techniques are required in order to make good use of the genomic sequences that have recently become available for many species, including humans. These instruments and methods must work with tens of thousands of genes simultaneously, and must be able to identify the small subsets of those genes that are implicated in the observed phenotypes or, for instance, in responses to therapies. Microarrays represent one such high throughput method and continue to find increasingly broad application. This project has improved microarray technology in several important areas. First, we developed the hyperspectral scanner, which has discovered and diagnosed numerous flaws in techniques broadly employed by microarray researchers. Second, we used a series of statistically designed experiments to identify and correct errors in our microarray data, dramatically improving the accuracy, precision, and repeatability of the microarray gene expression data. Third, our research developed new informatics techniques to identify genes with significantly different expression levels. Finally, natural language processing techniques were applied to improve our ability to make use of online literature annotating the important genes. In combination, this research has improved the reliability and precision of laboratory methods and instruments, while also enabling substantially faster analysis and discovery.
The pump and actuator systems designed and built in the SUMMiT{trademark} process (Sandia's surface-micromachining polysilicon MEMS, or Micro-Electro-Mechanical Systems, fabrication technology) under the previous campus executive program LDRD (SAND2002-0704P) with FSU/FAMU (Florida State University/Florida Agricultural and Mechanical University) were characterized in this LDRD. These results demonstrated that the device would pump liquid against the flow resistance of a microfabricated channel, but the devices were determined to be underpowered for reliable pumping. As a result, a new set of SUMMiT{trademark} pumps with actuators that generate greater torque will be designed and submitted for fabrication. In this document we report details of dry actuator/pump assembly testing, wet actuator/pump testing, channel resistance characterization, and new pump/actuator design recommendations.
In a superposition of quantum states, a bit can be in both the states '0' and '1' at the same time. This feature of the quantum bit, or qubit, has no parallel in classical systems. Currently, quantum computers consisting of 4 to 7 qubits in a 'quantum computing register' have been built. Innovative algorithms suited to quantum computing are now beginning to emerge, applicable to sorting, cryptanalysis, and other applications. A framework for overcoming slightly inaccurate quantum gate interactions and for causing quantum states to survive interactions with the surrounding environment is emerging, called quantum error correction. Thus there is the potential for rapid advances in this field. Although quantum information processing can be applied to secure communication links (quantum cryptography) and to crack conventional cryptosystems, the first few computing applications will likely involve a 'quantum computing accelerator', similar to a 'floating point arithmetic accelerator', interfaced to a conventional von Neumann computer architecture. This research is to develop a roadmap for applying Sandia's capabilities to the solution of some of the problems associated with maintaining quantum information, and with getting data into and out of such a 'quantum computing accelerator'. We propose to focus this work on 'quantum I/O technologies' by applying quantum optics on semiconductor nanostructures to leverage Sandia's expertise in semiconductor microelectronic/photonic fabrication techniques, as well as its expertise in information theory, processing, and algorithms. The work will be guided by an understanding of the practical requirements of computing and communication architectures. This effort will incorporate ongoing collaboration between organizations 9000, 6000, and 1000, and between junior and senior personnel. Follow-on work to fabricate and evaluate appropriate experimental nano/microstructures will be proposed as a result of this work.
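The superposition property mentioned above has a compact numeric form: a qubit state is a pair of amplitudes (a0, a1) with |a0|^2 + |a1|^2 = 1, and a Hadamard gate takes the definite state |0> into an equal superposition of |0> and |1>. The following sketch is a textbook illustration of that standard formalism, not code from this research effort.

```python
import math

def hadamard(state):
    """Apply a Hadamard gate to a qubit given as amplitude pair (a0, a1)."""
    a0, a1 = state
    r = 1.0 / math.sqrt(2.0)
    return (r * (a0 + a1), r * (a0 - a1))

def probabilities(state):
    """Measurement probabilities for outcomes '0' and '1' (Born rule)."""
    return tuple(abs(a) ** 2 for a in state)
```

Starting from |0> = (1, 0), one Hadamard yields measurement probabilities of one half for each outcome, the simplest concrete instance of a bit being 'in both states at once'.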
Present methods of air sampling for low concentrations of chemicals like explosives and bioagents involve noisy and power-hungry collectors with mechanical parts for moving large volumes of air. However, there are biological systems that are capable of detecting very low concentrations of molecules with no mechanical moving parts. An example is the silkworm moth antenna, a highly branched structure in which each of 100 branches contains about 200 sensory 'hairs' with dimensions of 2 microns wide by 100 microns long. The hairs contain about 3000 pores, which is where gas-phase molecules enter the aqueous (lymph) phase for detection. Simulations of diffusion of molecules indicate that this 'forest' of hairs is 'designed' to maximize the extraction of the vapor-phase molecules. Since typical molecules lose about 4 decades in diffusion constant upon entering the liquid phase, it is important to allow air diffusion to bring the molecule as close to the 'sensor' as possible. The moth acts on concentrations as low as 1000 molecules per cubic centimeter (about one part in 10^16). A 3-D collection system of these dimensions could be fabricated using micromachining techniques available at Sandia. This LDRD addresses the issues involved with extracting molecules from air onto micromachined structures and then delivering those molecules to microsensors for detection.
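The 4-decade drop in diffusion constant can be made concrete with a back-of-envelope estimate: the time to diffuse a distance L scales as t ~ L^2/(2D), so covering the same micron-scale distance in liquid takes about 10^4 times longer than in air. The diffusion coefficients below are typical textbook orders of magnitude for small molecules, assumed for illustration rather than taken from this project.

```python
# Typical orders of magnitude (assumed, not project measurements):
D_AIR = 1.0e-1      # cm^2/s, small molecule diffusing in air
D_LIQUID = 1.0e-5   # cm^2/s, small molecule in a water-like lymph

def diffusion_time(L_cm, D):
    """Characteristic 1-D diffusion time (s) over distance L: t = L^2/(2D)."""
    return L_cm ** 2 / (2.0 * D)

# Crossing a 2-micron distance (roughly a hair width) in each phase:
L = 2.0e-4  # 2 microns, in cm
t_air = diffusion_time(L, D_AIR)
t_liquid = diffusion_time(L, D_LIQUID)
```

The ratio t_liquid/t_air equals D_AIR/D_LIQUID, about 10^4, which is why the hair geometry should let air-phase diffusion deliver the molecule as close to the sensing element as possible before the slow liquid-phase transport takes over.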
Particle image velocimetry data have been acquired in the far field of the interaction generated by an overexpanded axisymmetric supersonic jet exhausting transversely from a flat plate into a subsonic compressible crossflow. Mean velocity fields were found in the streamwise plane along the flowfield centerline for different values of the crossflow Mach number M{sub {infinity}} and the jet-to-freestream dynamic pressure ratio J. The magnitude of the streamwise velocity deficit and the vertical velocity component both decay with downstream distance and were observed to be greater for larger J while M{sub {infinity}} remained constant. Jet trajectories derived independently using the maxima of each of these two velocity components are not identical, but show increasing jet penetration for larger J. Similarity in the normalized velocity field was found for constant J at two different transonic M{sub {infinity}}, but at two lower M{sub {infinity}} the jet appeared to interact with the wall boundary layer and data did not collapse. The magnitude and width of the peak in the vertical velocity component both increase with J, suggesting that the strength and size of the counter-rotating vortex pair increase and, thus, may have a stronger influence on aerodynamic surfaces despite further jet penetration from the wall.
The sea presents unique possibilities for implementing confidence building measures (CBMs) between India and Pakistan that are currently not available along the contentious land borders surrounding Jammu and Kashmir. This is due to the nature of maritime issues, the common military culture of naval forces, and a less contentious history of maritime interaction between the two nations. Maritime issues of mutual concern provide a strong foundation for more far-reaching future CBMs on land, while addressing pressing security, economic, and humanitarian needs at sea in the near-term. Although Indian and Pakistani maritime forces currently have stronger opportunities to cooperate with one another than their counterparts on land, reliable mechanisms to alleviate tension or promote operational coordination remain non-existent. Therefore, possible maritime CBMs, as well as pragmatic mechanisms to initiate and sustain cooperation, require serious examination. This report reflects the unique joint research undertaking of two retired Senior Naval Officers from both India and Pakistan, sponsored by the Cooperative Monitoring Center of the International Security Center at Sandia National Laboratories. Research focuses on technology as a valuable tool to facilitate confidence building between states having a low level of initial trust. Technical CBMs not only increase transparency, but also provide standardized, scientific means of interacting on politically difficult problems. Admirals Vohra and Ansari introduce technology as a mechanism to facilitate consistent forms of cooperation and initiate discussion in the maritime realm. 
They present technical CBMs capable of being acted upon as well as high-level political recommendations regarding the following issues: (1) Delimitation of the maritime boundary between India and Pakistan and its relationship to the Sir Creek dispute; (2) Restoration of full shipping links and the security of ports and cargos; (3) Fishing within disputed areas and resolution of issues relating to arrest and repatriation of fishermen from both sides; and (4) Naval and maritime agency interaction and possibilities for cooperation.
Buried landmines are often detected through their chemical signature in the thin air layer, or boundary layer, right above the soil surface by sensors or animals. Environmental processes play a significant role in the available chemical signature. Due to the shallow burial depth of landmines, the weather also influences the release of chemicals from the landmine, transport through the soil to the surface, and degradation processes in the soil. The effect of weather on the landmine chemical signature from a PMN landmine was evaluated with the T2TNT code for three different climates: Kabul, Afghanistan, Ft. Leonard Wood, Missouri, USA, and Napacala, Mozambique. Results for TNT gas-phase and solid-phase concentrations are presented as a function of time of the year.
SNLO is public-domain software developed at Sandia National Laboratories. It is intended to assist in the selection of the best nonlinear crystal for a particular application and to predict its performance. This paper briefly describes its functions and how to use them. Keywords: optical parametric mixing, optical parametric oscillator, nonlinear crystals, nonlinear optics software.
Various tools and techniques leveraged from the IC industry were used for the failure analysis and qualification of MEMS. Resistive contrast imaging (RCI) was employed to analyze a wide variety of MEMS technologies. Multi-functional analytical tools are able to examine several samples in parallel and extract structural, chemical, and electrical information.
MEMS processes and components are rapidly changing in device design, processing, and, most importantly, application. This paper will discuss the future challenges faced by MEMS failure analysis as the field of MEMS (fabrication, component design, and applications) grows. Specific areas of concern for the failure analyst will also be discussed.
Microelectromechanical Systems (MEMS) have gained acceptance as viable products for many commercial and government applications. MEMS are currently being used as displays for digital projection systems, sensors for airbag deployment systems, inkjet print head systems, and optical routers. This paper will discuss current and future MEMS applications.
MEMS components by their very nature have different and unique failure mechanisms than their macroscopic counterparts. This paper discusses failure mechanisms observed in various MEMS components and technologies. MEMS devices fabricated using bulk and surface micromachining process technologies are emphasized.
This report summarizes the accomplishments of the Laboratory Directed Research and Development (LDRD) project 26546 at Sandia, during the period FY01 through FY03. The project team visited four DoD depots that support extensive aircraft maintenance in order to understand critical needs for automation, and to identify maintenance processes for potential automation or integration opportunities. From the visits, the team identified technology needs and application issues, as well as non-technical drivers that influence the application of automation in depot maintenance of aircraft. Software tools for automation facility design analysis were developed, improved, extended, and integrated to encompass greater breadth for eventual application as a generalized design tool. The design tools for automated path planning and path generation have been enhanced to incorporate those complex robot systems with redundant joint configurations, which are likely candidate designs for a complex aircraft maintenance facility. A prototype force-controlled actively compliant end-effector was designed and developed based on a parallel kinematic mechanism design. This device was developed for demonstration of surface finishing, one of many in-contact operations performed during aircraft maintenance. This end-effector tool was positioned along the workpiece by a robot manipulator, programmed for operation by the automated planning tools integrated for this project. Together, the hardware and software tools demonstrate many of the technologies required for flexible automation in a maintenance facility.