Sandia National Laboratories performed vibration and shock testing on a Savannah River Hydride Transport Vessel (HTV) which is used for bulk shipments of tritium. This testing is required to qualify the HTV for transport in the H1616 shipping container. The main requirement for shipment in the H1616 is that the contents (in this case the HTV) have a tritium leak rate of less than 1 x 10^-7 cc/sec after being subjected to shock and vibration normally incident to transport. Helium leak tests performed before and after the vibration and shock testing showed that the HTV remained leaktight under the specified conditions. This report documents the tests performed and the test results.
Experiments were performed at SATURN, a high current z-pinch, to explore the feasibility of creating a hohlraum by imploding a tungsten wire array onto a low-density foam. Emission measurements in the 200--280 eV energy band were consistent with a 110--135 eV Planckian before the target shock heated, or stagnated, on-axis. Peak pinch radiation temperatures of nominally 160 eV were obtained. Measured early time x-ray emission histories and temperature estimates agree well with modeled performance in the 200--280 eV band using a 2D radiation magneto-hydrodynamics code. However, significant differences are observed in comparisons of the x-ray images and 2D simulations.
Field Jr., R.V.; Grigoriadis, K.M.; Bergman, L.A.; Skelton, R.E.
Random variations, whether they occur in the input signal or the system parameters, are phenomena that occur in nearly all engineering systems of interest. As a result, nondeterministic modeling techniques must somehow account for these variations to ensure validity of the solution. As might be expected, this is a difficult proposition and the focus of many current research efforts. Controlling seismically excited structures is one pertinent application of nondeterministic analysis and is the subject of the work presented herein. This overview paper is organized into two sections. First, techniques to assess system reliability, in a context familiar to civil engineers, are discussed. Second, and as a consequence of the first, active control methods that ensure good performance in this random environment are presented. It is the hope of the authors that these discussions will ignite further interest in the area of reliability assessment and design of controlled civil engineering structures.
Interpretation of compression stress-relaxation (CSR) experiments for elastomers in air is complicated by (1) the presence of both physical and chemical relaxation and (2) anomalous diffusion-limited oxidation (DLO) effects. For a butyl material, the authors first use shear relaxation data to indicate that physical relaxation effects are negligible during typical high temperature CSR experiments. They then show that experiments on standard CSR samples (~15 mm diameter when compressed) lead to complex non-Arrhenius behavior. By combining reaction kinetics based on the historic basic autoxidation scheme with a diffusion equation appropriate to disk-shaped samples, they derive a theoretical DLO model appropriate to CSR experiments. Using oxygen consumption and permeation rate measurements, the theory shows that important DLO effects are responsible for the observed non-Arrhenius behavior. To minimize DLO effects, they introduce a new CSR methodology based on the use of numerous small disk samples strained in parallel. Results from these parallel, minidisk experiments lead to Arrhenius behavior with an activation energy consistent with values commonly observed for elastomers, allowing more confident extrapolated predictions. In addition, excellent correlation is noted between the CSR force decay and the oxygen consumption rate, consistent with the expectation that oxidative scission processes dominate the CSR results.
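The practical payoff of recovering Arrhenius behavior is that the fitted activation energy permits extrapolation of high-temperature force-decay rates down to service temperatures. A minimal sketch of that standard fit and extrapolation is given below; the rate constants, temperatures, and service condition are hypothetical illustrative values, not data from the paper.

```python
import numpy as np

# Hypothetical CSR force-decay rate constants (1/day) measured at
# several accelerated-aging temperatures (K); illustrative values only.
T = np.array([383.0, 398.0, 413.0, 428.0])      # 110-155 C
k = np.array([2.1e-3, 6.5e-3, 1.8e-2, 4.6e-2])  # assumed rates

R = 8.314  # gas constant, J/(mol K)

# Arrhenius: k = A * exp(-Ea / (R T))  =>  ln k = ln A - (Ea/R) * (1/T)
slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R            # activation energy, J/mol
A = np.exp(intercept)      # pre-exponential factor, 1/day

# Extrapolate the rate to a lower service temperature, assuming the
# same (non-DLO-limited) mechanism still controls the decay.
T_service = 323.0          # 50 C
k_service = A * np.exp(-Ea / (R * T_service))
print(f"Ea = {Ea/1000:.0f} kJ/mol, extrapolated k(50 C) = {k_service:.2e} 1/day")
```

With these made-up numbers the fit returns an activation energy of roughly 90 kJ/mol, in the range commonly quoted for elastomer oxidation, which is the kind of consistency check the abstract refers to.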
The Optical Assembly (OA) for the Multispectral Thermal Imager (MTI) program has been fabricated, assembled, and successfully performance tested. It represents a major milestone toward completion of this earth-observing E-O imaging sensor, which is to be operated in low earth orbit. Along with its wide field of view (WFOV), 1.82° along-track and 1.38° cross-track, and a comprehensive on-board calibration system, the pushbroom imaging sensor employs a single mechanically cooled focal plane with 15 spectral bands covering a wavelength range from 0.45 to 10.7 µm. The OA has an off-axis three-mirror anastigmatic (TMA) telescope with a 36-cm unobscured clear aperture. The two key performance criteria, 80% enpixeled energy in the visible and radiometric stability of 1% (1σ) in the visible/near-infrared (VNIR) and short wavelength infrared (SWIR), 1.45% (1σ) in the medium wavelength infrared (MWIR), and 0.53% (1σ) in the long wavelength infrared (LWIR), as well as the low weight (less than 49 kg) and volume constraint (89 cm x 44 cm x 127 cm), drive the overall design configuration of the OA and its fabrication requirements.
With the increased use of public key cryptography, faster modular multiplication has become an important cryptographic issue. Almost all public key cryptography, including most elliptic curve systems, uses modular multiplication. Modular multiplication, particularly for the large public key moduli, is very slow. Increasing the speed of modular multiplication is almost synonymous with increasing the speed of public key cryptography. There are two parts to modular multiplication: multiplication and modular reduction. Though there are fast methods for multiplying and fast methods for doing modular reduction, they do not mix well. Most fast techniques require integers to be in a special form. These special forms are not related, and converting from one form to another is more costly than using the standard techniques. To date it has been better to use the fast modular reduction technique coupled with standard multiplication. Standard modular reduction is much more costly than standard multiplication. Fast modular reduction (Montgomery's method) reduces the reduction cost to approximately that of a standard multiply. Of the fast multiplication techniques, the redundant number system technique (RNS) is one of the most popular. It is simple, converting a large convolution (multiply) into many smaller independent ones. Not only do redundant number systems increase speed, but the independent parts allow for parallelization. RNS form implies working modulo another constant. Depending on the relationship between these two constants, reduction or division may be possible, but not both. This paper describes a new technique using ideas from both Montgomery's method and RNS. It avoids the special-form conversion problem and allows fast reduction and multiplication. Since RNS form is used throughout, it also allows the entire process to be parallelized.
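For reference, the reduction step the abstract credits with costing roughly one multiply is Montgomery's REDC operation. The sketch below shows the textbook form on ordinary integers, not the RNS-based variant the paper proposes; the modulus and operands are arbitrary example values.

```python
# Textbook Montgomery reduction (REDC). Values are kept in "Montgomery
# form" x*R mod N, where R = 2**k > N and gcd(R, N) = 1 (N odd).

def montgomery_setup(N, k):
    R = 1 << k
    # N' satisfies N * N' = -1 (mod R); the inverse exists because N is odd.
    N_prime = (-pow(N, -1, R)) % R
    return R, N_prime

def redc(T, N, R, N_prime, k):
    """Return T * R^-1 mod N for 0 <= T < R*N, using only shifts and multiplies."""
    m = ((T & (R - 1)) * N_prime) & (R - 1)   # m = (T mod R) * N' mod R
    t = (T + m * N) >> k                      # T + m*N is exactly divisible by R
    return t - N if t >= N else t

# Example: multiply a and b modulo an odd N.
N, k = 0xC1F7_3F45, 32                        # arbitrary odd modulus, R = 2**32
R, N_prime = montgomery_setup(N, k)
a, b = 123456789, 987654321
aR, bR = (a * R) % N, (b * R) % N             # convert inputs to Montgomery form
abR = redc(aR * bR, N, R, N_prime, k)         # product, still in Montgomery form
ab = redc(abR, N, R, N_prime, k)              # convert back to ordinary form
assert ab == (a * b) % N
```

The cost of keeping values in Montgomery form is the conversion at the start and end, which is why the combination with a second special representation (RNS) is the nontrivial part the paper addresses.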
The authors conducted perforation experiments with 4340 (Rc 38) and T-250 maraging steel long-rod projectiles and HY-100 steel target plates at striking velocities between 80 and 370 m/s. Flat-ended rod projectiles with lengths of 89 and 282 mm were machined to a nominal 30-mm diameter so they could be launched from a 30-mm powder gun without sabots. The target plates were rigidly clamped at a 305-mm diameter and had nominal thicknesses of 5.3 and 10.5 mm. Four sets of experiments were conducted to show the effects of rod length and plate thickness on the measured ballistic limit and residual velocities. In addition to measuring striking and residual projectile velocities, the authors obtained framing camera data on the back surfaces of several plates that clearly showed the plate deformation and plug ejection process. They also present a beam model that qualitatively exhibits the experimentally observed mechanisms.
This paper introduces a new configuration of parallel manipulator, called the Rotopod, which is constructed entirely from revolute joints. The Rotopod consists of two platforms connected by six legs and exhibits six Cartesian degrees of freedom. The Rotopod is first compared with other all-revolute-joint parallel manipulators to show its similarities and differences. The inverse kinematics for the mechanism are then developed and used to analyze its accessible workspace. Finally, optimization is performed to determine the Rotopod design configurations that maximize the accessible workspace subject to desirable functional constraints.
This paper investigates a new aspect of fine motion planning for the micro domain. As parts approach 1--10 µm or less in outside dimensions, interactive forces such as van der Waals and electrostatic forces become major factors that greatly change assembly sequences and path plans. It has been shown experimentally that assembly plans in the micro domain are not reversible: the motions required to pick up a part are not the reverse of the motions required to release it. This paper develops the mathematics required to determine the goal regions for pickup, holding, and release of a micro-sphere being handled by a rectangular tool.
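To illustrate why these surface forces dominate the planning problem, a rough comparison between the standard non-retarded sphere-plane van der Waals attraction, F ≈ A R / (6 d²), and the sphere's weight is sketched below. The Hamaker constant, contact separation, and density are generic assumed values, not parameters from the paper.

```python
import math

# Assumed, generic values for a silica-like micro-sphere near a flat tool face.
A = 1.0e-19        # Hamaker constant, J (typical order of magnitude)
R = 5.0e-6         # sphere radius, m (10 um diameter part)
d = 4.0e-10        # surface separation at contact, m (~0.4 nm)
rho = 2200.0       # density, kg/m^3
g = 9.81           # m/s^2

# Sphere-plane van der Waals attraction (non-retarded Hamaker form).
F_vdw = A * R / (6.0 * d ** 2)

# Gravitational weight of the same sphere, for comparison.
F_grav = rho * (4.0 / 3.0) * math.pi * R ** 3 * g

print(f"van der Waals force : {F_vdw:.2e} N")
print(f"weight              : {F_grav:.2e} N")
print(f"ratio vdW / weight  : {F_vdw / F_grav:.1e}")
```

Even with these rough numbers the adhesion force exceeds the part's weight by several orders of magnitude, which is why releasing a part requires a different motion strategy than picking it up.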
Sandia National Laboratories has developed a distributed, high-fidelity simulation system for training and planning small-team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to computer-generated forces (CGF), allowing scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing, and training assault operations.
This paper presents an analysis of the thermal effects on radioactive material (RAM) transportation packages of a fire in an adjacent compartment. The analysis assumes that the adjacent-hold fire is some type of engine room fire. Computational fluid dynamics (CFD) analysis tools were used so that convective heat transfer effects could be included. The analysis results were compared to experimental data gathered in a series of tests on the US Coast Guard ship Mayo Lykes located at Mobile, Alabama.
The Agile Manufacturing Prototyping System (AMPS) is being integrated at Sandia National Laboratories. AMPS consists of state of the industry flexible manufacturing hardware and software enhanced with Sandia advancements in sensor and model based control; automated programming, assembly and task planning; flexible fixturing; and automated reconfiguration technology. AMPS is focused on the agile production of complex electromechanical parts. It currently includes 7 robots (4 Adept One, 2 Adept 505, 1 Staubli RX90), conveyance equipment, and a collection of process equipment to form a flexible production line capable of assembling a wide range of electromechanical products. This system became operational in September 1995. Additional smart manufacturing processes will be integrated in the future. An automated spray cleaning workcell capable of handling alcohol and similar solvents was added in 1996 as well as parts cleaning and encapsulation equipment, automated deburring, and automated vision inspection stations. Plans for 1997 and out years include adding manufacturing processes for the rapid prototyping of electronic components such as soldering, paste dispensing and pick-and-place hardware.
The direct connection of information, captured in forms such as CAD databases, to the factory floor is enabling a revolution in manufacturing. Rapid response to very dynamic market conditions is becoming the norm rather than the exception. In order to provide economical rapid fabrication of small numbers of variable products, one must design with manufacturing constraints in mind. In addition, flexible manufacturing systems must be programmed automatically to reduce the time for product change over in the factory and eliminate human errors. Sensor based machine control is needed to adapt idealized, model based machine programs to uncontrolled variables such as the condition of raw materials and fabrication tolerances.
The Internet and the applications it supports are revolutionizing the way people work together. This paper presents four case studies in engineering collaboration that new Internet technologies have made possible. These cases include assembly design and analysis, simulation, intelligent machine system control, and systems integration. From these cases, general themes emerge that can guide the way people will work together in the coming decade.
Data authentication as provided by digital signatures is a well-known technique for verifying data sent over untrusted network links. Recent work has extended digital signatures to allow jointly generated signatures using threshold techniques. In addition, new proactive mechanisms have been developed to protect the joint private key over long periods of time and to allow each of the parties involved to verify the actions of the others. In this paper, the authors describe an application in which proactive digital signature techniques are a particularly valuable tool. They describe the proactive DSA protocol and discuss the underlying software tools that they found valuable in developing an implementation. Finally, they note the difficulties they experienced, and continue to experience, in implementing this complex cryptographic protocol.
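As general background on the proactive mechanism mentioned above (not the authors' proactive DSA protocol itself), the central idea is that the parties periodically re-randomize their Shamir shares of the joint private key by adding fresh shares of zero: the key is unchanged, but shares captured before a refresh become useless afterward. A minimal sketch over a toy prime field, with assumed parameters:

```python
import random

P = 2 ** 127 - 1          # toy prime field (not DSA's subgroup order)
T, N = 2, 5               # polynomial degree t and number of parties

def share(secret, t=T, n=N):
    """Shamir-share `secret` with a random degree-t polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]
    return {i: sum(c * pow(i, k, P) for k, c in enumerate(coeffs)) % P
            for i in range(1, n + 1)}

def reconstruct(points):
    """Lagrange interpolation at x = 0 from any t+1 shares."""
    total = 0
    for i, yi in points.items():
        num = den = 1
        for j in points:
            if j != i:
                num = num * (-j) % P
                den = den * (i - j) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

key = random.randrange(P)
shares = share(key)

# Proactive refresh round: every party deals a sharing of ZERO, and each
# share is updated with the sum of the zero-shares it receives. The joint
# key is unchanged, but pre-refresh shares no longer combine with new ones.
zero_sharings = [share(0) for _ in range(N)]
shares = {i: (s + sum(z[i] for z in zero_sharings)) % P for i, s in shares.items()}

subset = {i: shares[i] for i in random.sample(range(1, N + 1), T + 1)}
assert reconstruct(subset) == key
```

The real protocol layers DSA signing, commitments, and verifiability on top of this refresh step, which is where the implementation difficulties the authors mention arise.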
This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration and attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs, or costs representing the attacker's level of effort, graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
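The last point, turning "highest probability of success" into a shortest-path problem, works by weighting each arc with the negative log of its success probability, so the minimum-weight path maximizes the product of probabilities. A small sketch makes this concrete; the attack graph, node names, and probabilities below are hypothetical, not drawn from the tool's database.

```python
import heapq
import math

# Hypothetical attack graph: node -> {successor: probability of success}.
graph = {
    "internet":    {"dmz_web": 0.8, "vpn_gateway": 0.3},
    "dmz_web":     {"app_server": 0.6},
    "vpn_gateway": {"app_server": 0.9},
    "app_server":  {"db_admin": 0.5},
    "db_admin":    {},
}

def most_likely_path(graph, source, target):
    """Dijkstra on -log(p) edge weights: minimum total weight = maximum product of p."""
    dist = {source: 0.0}
    prev = {}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, math.inf):
            continue
        for v, p in graph[u].items():
            nd = d + (-math.log(p))
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if target not in dist:
        return None, 0.0
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return path[::-1], math.exp(-dist[target])

path, prob = most_likely_path(graph, "internet", "db_admin")
print(path, f"success probability ~ {prob:.2f}")
```

Replacing the -log(p) weights with attacker-effort costs turns the same search into a lowest-effort attack path query.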
This report provides a review of the Palisades submittal to the Nuclear Regulatory Commission requesting endorsement of their accumulated neutron fluence estimates based on a least squares adjustment methodology. This review highlights some minor issues in the applied methodology and provides some recommendations for future work. The overall conclusion is that the Palisades fluence estimation methodology provides a reasonable approach to a "best estimate" of the accumulated pressure vessel neutron fluence and is consistent with the state-of-the-art analysis as detailed in community consensus ASTM standards.
The original DAMP (DAta Manipulation Program) was written by Mark Hedemann of Sandia National Laboratories and used the CA-DISSPLA graphics package (available from Computer Associates International, Inc., Garden City, NY) as its engine. It was used to plot, modify, and otherwise manipulate one-dimensional data waveforms (data vs. time) from a wide variety of accelerators. With the waning of CA-DISSPLA and the increasing popularity of Unix-based workstations, a replacement was needed. This package uses the IDL software, available from Research Systems Incorporated in Boulder, Colorado, as its engine, and creates a set of widgets to manipulate the data in a manner similar to the original DAMP and earlier versions of xdamp. IDL is currently supported on a wide variety of platforms, including IBM, Hewlett Packard, and SUN Unix workstations, Microsoft Windows computers, Macintosh computers, and Digital Equipment Corporation VMS and Alpha systems. Thus, xdamp is portable across many platforms. The author has verified operation, albeit with some minor IDL bugs, on personal computers running Windows 95 and Windows NT; IBM Unix platforms; DEC Alpha and VMS systems; HP 9000/700 series workstations; and Macintosh computers, both regular and PowerPC versions. Version 3 adds the capability to manipulate images to the original xdamp capabilities.
In this work the authors report results for narrowband amplifiers designed for milliwatt and submilliwatt power consumption using JFET and pseudomorphic high electron mobility transistor (PHEMT) GaAs-based technologies. Enhancement-mode JFETs were used to design both a hybrid amplifier with off-chip matching and a monolithic microwave integrated circuit (MMIC) with on-chip matching. The hybrid amplifier achieved 8--10 dB of gain at 2.4 GHz and 1 mW. The MMIC achieved 10 dB of gain at 2.4 GHz and 2 mW. Submilliwatt circuits were also explored using 0.25 µm PHEMTs: power levels of 25 µW were achieved with 5 dB of gain for a 215 MHz hybrid amplifier. These results significantly lower the power consumption levels achievable relative to prior MESFET, heterostructure field effect transistor (HFET), or Si bipolar results from other laboratories.
The performance of vertical cavity surface emitting lasers (VCSELs) has improved greatly in recent years. Much of this improvement can be attributed to the use of native oxide layers within the laser structure, providing both electrical and optical transverse confinement. Understanding this optical confinement will be vital for the future realization of yet smaller lasers with ultralow threshold currents. Here the authors report the spectral and modal properties of small (0.5 µm to 5 µm current aperture) VCSELs and identify Joule heating as a dominant effect in the resonator properties of the smallest lasers.
The authors review the use of in-situ normal-incidence reflectance, combined with a virtual interface model, to monitor and control the growth of complex compound semiconductor devices. The technique is being used routinely on both commercial and research metal-organic chemical vapor deposition (MOCVD) reactors and in molecular beam epitaxy (MBE) to measure growth rates and high temperature optical constants of compound semiconductor alloys. The virtual interface approach allows one to extract the calibration information in an automated way without having to estimate the thickness or optical constants of the alloy, and without having to model underlying thin-film layers. The method has been used in a variety of data analysis applications collectively referred to as ADVISOR (Analysis of Deposition using Virtual Interfaces and Spectroscopic Optical Reflectance). This very simple and robust monitor, combined with the ADVISOR analysis, provides the equivalent of a real-time reflection high-energy electron diffraction (RHEED) tool for both MBE and MOCVD applications.
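As background on why normal-incidence reflectance yields growth rates at all: while a transparent layer grows, the reflectance oscillates with a thickness period of λ/(2n), so the oscillation period in time gives the rate directly. The virtual interface analysis in ADVISOR is more general than this simple Fabry-Perot picture, but the relation below (with assumed, illustrative numbers) conveys the basic measurement.

```python
# Simple Fabry-Perot estimate of growth rate from reflectance oscillations.
# Assumed, illustrative values -- not parameters from the ADVISOR work.
wavelength_nm = 633.0        # monitoring wavelength (e.g., a HeNe line)
n_film = 3.5                 # refractive index of the growing layer at growth temperature
period_s = 180.0             # measured time between successive reflectance maxima

thickness_per_period_nm = wavelength_nm / (2.0 * n_film)   # lambda / 2n per oscillation
growth_rate_nm_per_s = thickness_per_period_nm / period_s

print(f"{thickness_per_period_nm:.1f} nm per oscillation, "
      f"growth rate ~ {growth_rate_nm_per_s * 60:.1f} nm/min")
```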
The phase-out of the ozone-depleting solvents has forced industry to look to solvents such as alcohol, terpenes and other flammable solvents to perform the critical cleaning processes. These solvents are not as efficient as the ozone-depleting solvents in terms of soil loading, cleaning time and drying when used in standard cleaning processes such as manual sprays or ultrasonic baths. They also require special equipment designs to meet part cleaning specifications and operator safety requirements. This paper describes a cleaning system that incorporates the automated spraying of flammable solvents to effectively perform precision cleaning processes. Key to the project's success was the development of software that controls the robotic system and automatically generates robotic cleaning paths from three-dimensional CAD models of the items to be cleaned.
Deep high-aspect-ratio Si etching (HARSE) has shown potential application for passive self-alignment of dissimilar materials and devices on Si carriers or waferboards. The Si can be etched to specific depths and lateral dimensions to accurately place or locate discrete components (e.g., lasers, photodetectors, and fiber optics) on a Si carrier. It is critical to develop processes that maintain the dimensions of the mask, yield highly anisotropic profiles for deep features, and maintain the anisotropy at the base of the etched feature. In this paper the authors report process conditions for HARSE that yield etch rates exceeding 3 µm/min and well-controlled, highly anisotropic etch profiles. Examples of potential application to advanced packaging technologies will also be shown.
Moving commercial cargo across the US-Mexico border is currently a complex, paper-based, error-prone process that incurs expensive inspections and delays at several ports of entry in the Southwestern US. Improved information handling will dramatically reduce border dwell time, variation in delivery time, and inventories, and will give better control of the shipment process. The Border Trade Facilitation System (BTFS) is an agent-based collaborative work environment that assists geographically distributed commercial and government users with the transshipment of goods across the US-Mexico border. Software agents mediate the creation, validation, and secure sharing of shipment information and regulatory documentation over the Internet, using the World Wide Web to interface with human actors. Agents are organized into agencies, each representing a commercial or government organization. Agents perform four specific functions on behalf of their user organizations: (1) agents with domain knowledge elicit commercial and regulatory information from human specialists through forms presented via web browsers; (2) agents mediate information from forms with diverse ontologies, copying invariant data from one form to another and thereby eliminating the need for duplicate data entry; (3) cohorts of distributed agents coordinate the work flow among the various information providers and monitor the overall progress of the documentation and the location of the shipment to ensure that all regulatory requirements are met prior to arrival at the border; and (4) agents provide status information to human actors and attempt to influence them when problems are predicted.