Tonopah Test Range (TTR) in Nevada and Kauai Test Facility (KTF) in Hawaii are government-owned, contractor-operated facilities run by Sandia Corporation, a subsidiary of Lockheed Martin Corporation. The U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA), through the Sandia Site Office (SSO) in Albuquerque, NM, manages TTR and KTF operations. Sandia Corporation conducts operations at TTR in support of DOE/NNSA's Weapons Ordnance Program and has operated the site since 1957. Westinghouse Government Services subcontracts to Sandia Corporation to administer most of the environmental programs at TTR. Sandia Corporation operates KTF as a rocket preparation, launching, and tracking facility. This Annual Site Environmental Report (ASER) summarizes data and the compliance status of the environmental protection and monitoring program at TTR and KTF through Calendar Year (CY) 2004. Environmental regulations applicable at these sites include state and federal regulations governing air emissions, wastewater effluent, waste management, terrestrial surveillance, and Environmental Restoration (ER) cleanup activities. Sandia Corporation is responsible only for those environmental program activities related to its operations. The DOE/NNSA Nevada Site Office (NSO) retains responsibility for the cleanup and management of ER sites at TTR. Currently, there are no ER sites at KTF. Environmental monitoring and surveillance programs are required by DOE Order 450.1, Environmental Protection Program (DOE 2005), and DOE Order 231.1A, Environment, Safety, and Health Reporting (DOE 2004b).
When two electrodes are in close proximity in a dielectric liquid, application of a voltage pulse can produce a spark discharge between them, resulting in a small amount of material removal from both electrodes. Pulsed application of the voltage at discharge energies in the range of microjoules results in the continuous material removal process known as micro-electro-discharge machining (micro-EDM). Spark erosion by micro-EDM provides significant opportunities for producing small features and micro-components, such as nozzle holes, slots, shafts, and gears, in virtually any conductive material. If the speed and precision of micro-EDM processes can be significantly enhanced, they have the potential to be used for a wide variety of micro-machining applications, including fabrication of microelectromechanical system (MEMS) components. Toward this end, a better understanding of the impact of various machining parameters on material removal has been established through a single-discharge study of micro-EDM and a parametric study of small-hole making by micro-EDM. The main avenues for improving the speed and efficiency of the micro-EDM process are more controlled pulse generation in the power supply and more controlled positioning of the tool electrode during machining. Further investigation of the micro-EDM process in three dimensions leads to important design rules, in particular the smallest feature size attainable by the process.
Finned bodies of revolution firing lateral jets in flight may experience lower spin rates than predicted. This reduction in spin rate is a result of vortices generated by the interaction between the lateral jets and freestream air flowing past the body. The vortices change the pressure distribution on the fins, inducing a counter torque that opposes the desired spin. Wind tunnel data measuring roll torque and fin pressures were collected for a full-scale model at varying angle of attack, roll angle, airspeed, and jet strength. The current analysis builds upon previously written code that computes torque by integrating pressure over the fin surfaces at 0° angle of attack. The code was modified to investigate the behavior of counter torque at different angles of attack and roll angles as a function of J, the ratio of jet dynamic pressure to freestream dynamic pressure. Numerical error analysis was applied to all data to assist with interpretation of results. Results show that agreement between balance and fin pressure counter torque at 0° angle of attack was not as close as previously believed. Counter torque at 4° angle of attack was higher than at 0°, and agreement between balance and fin pressure counter torque was closer. Plots of differential fin pressure coefficient revealed a region of high pressure at the leading edge and an area of low pressure over the center and aft regions of the tapped surface. Large differences in the counter-torque coefficient were found between various freestream dynamic pressures, especially at Mach 0.95 and 1.1. Roll angle had a significant effect only for cases at angle of attack, where it caused counter torque to change unpredictably.
Energy planning represents an investment-decision problem. Investors commonly evaluate such problems using portfolio theory to manage risk and maximize portfolio performance under a variety of unpredictable economic outcomes. Energy planners need to similarly abandon their reliance on traditional, ''least-cost'' stand-alone technology cost estimates and instead evaluate conventional and renewable energy sources on the basis of their portfolio cost--their cost contribution relative to their risk contribution to a mix of generating assets. This report describes essential portfolio-theory ideas and discusses their application in the Western US region. It illustrates how electricity-generating mixes can benefit from additional shares of geothermal and other renewables. Compared to fossil-dominated mixes, efficient portfolios reduce generating cost while including greater renewables shares in the mix, which enhances energy security. Though counter-intuitive, the idea that adding more costly geothermal can actually reduce portfolio-generating cost is consistent with basic finance theory. An important implication is that in dynamic and uncertain environments, the relative value of generating technologies must be determined not by evaluating alternative resources, but by evaluating alternative resource portfolios. The optimal results for the Western US region indicate that, compared to the EIA target mixes, there exist generating mixes with larger geothermal shares at equal-or-lower expected cost and risk.
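The portfolio effect described above can be sketched with classic mean-variance arithmetic. The following Python sketch uses hypothetical cost and risk figures (illustrative only, not numbers from this report) for a fossil technology and geothermal, showing how adding an uncorrelated, near-riskless resource trades a small increase in expected cost for a large reduction in portfolio risk:

```python
import math

def portfolio_cost_risk(w, costs, sigmas, rho):
    """Expected generating cost and cost risk (standard deviation) of a
    two-technology mix; w is the share of technology 0, rho the
    correlation between the two technologies' cost fluctuations."""
    cost = w * costs[0] + (1 - w) * costs[1]
    var = ((w * sigmas[0]) ** 2 + ((1 - w) * sigmas[1]) ** 2
           + 2 * w * (1 - w) * rho * sigmas[0] * sigmas[1])
    return cost, math.sqrt(var)

# Hypothetical figures (cents/kWh): gas is cheaper but fuel-price
# volatile; geothermal costs more, but its cost barely fluctuates and
# is uncorrelated with fuel prices.
costs, sigmas, rho = [4.0, 5.0], [1.2, 0.2], 0.0
all_gas = portfolio_cost_risk(1.0, costs, sigmas, rho)
mixed = portfolio_cost_risk(0.7, costs, sigmas, rho)   # 30% geothermal
# The mix costs a bit more in expectation but carries far less risk --
# the sense in which renewables improve the generating portfolio.
```

Sweeping the share w traces out the cost-risk frontier; the efficient mixes discussed in the report are the points on that frontier not dominated in both cost and risk.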
IFP V4.0 is the fourth generation of an extraordinarily powerful and flexible image formation processor for spotlight mode synthetic aperture radar. It has been successfully utilized in processing phase histories from numerous radars and has been instrumental in the development of many new capabilities for spotlight mode SAR. This document provides a brief history of the development of IFP, a full exposition of the signal processing steps involved, and a short user's manual for the software implementing this latest iteration.
A linear structure is excited at multiple points with a stationary normal random process. The response of the structure is measured at multiple outputs. If the autospectral densities of the inputs are specified, the phase relationships between the inputs that will minimize or maximize the trace of the autospectral density matrix of the outputs are derived. If the autospectral densities of the outputs are specified, the phase relationships between the outputs that will minimize or maximize the trace of the input autospectral density matrix are derived. It is shown that other phase relationships and ordinary coherence less than one result in a trace intermediate between these extremes. Least-favorable response and some classes of critical response are special cases of the development. It is shown that the derivation for stationary random waveforms can also be applied to nonstationary random, transient, and deterministic waveforms.
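The trace extremization can be illustrated numerically at a single frequency line. In the sketch below (a hypothetical 2-input/2-output frequency-response matrix, not taken from this work), the output autospectral matrix is Gyy = H Gxx H^H; sweeping the phase of a fully coherent input cross-spectrum brackets the output trace, and an incoherent input (coherence zero) lands strictly between the two extremes:

```python
import numpy as np

def output_trace(H, Gxx):
    """Trace of the output autospectral matrix Gyy = H Gxx H^H."""
    return float(np.trace(H @ Gxx @ H.conj().T).real)

def input_matrix(g11, g22, phase, coherence):
    """2x2 input autospectral matrix with specified autospectra and a
    cross-spectrum of given phase and ordinary coherence."""
    g12 = coherence * np.sqrt(g11 * g22) * np.exp(1j * phase)
    return np.array([[g11, g12], [np.conj(g12), g22]])

# Hypothetical frequency-response matrix at one frequency line.
H = np.array([[1.0 + 0.5j, 0.3 - 0.2j],
              [0.2 + 0.1j, 0.8 + 0.4j]])

# Sweep the cross-spectrum phase at unit coherence: the extremes of the
# output trace occur at fully coherent phase relationships ...
phases = np.linspace(0.0, 2.0 * np.pi, 721)
traces = [output_trace(H, input_matrix(1.0, 1.0, p, 1.0)) for p in phases]
t_min, t_max = min(traces), max(traces)
# ... while an incoherent input yields a trace between the extremes.
t_incoherent = output_trace(H, input_matrix(1.0, 1.0, 0.0, 0.0))
```

Because the trace is linear in the cross-spectral term, it has the form A + 2 Re(b g12); full coherence with the right phase attains A ± 2|b|, and any coherence below one shrinks the attainable excursion, which is the intermediate-trace result stated above.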
In hostile ad hoc wireless communication environments, such as battlefield networks, end-node authentication is critical. In a wired infrastructure, this authentication service is typically facilitated by a centrally located ''authentication certificate generator'' such as a Certificate Authority (CA) server. This centralized approach is ill-suited to the needs of mobile ad hoc networks, such as those required by military systems, because of their unpredictable connectivity and dynamic routing. There is a need for a secure and robust approach to mobile node authentication. Current mechanisms either assign a pre-shared key (shared by all participating parties) or require that each node retain a collection of individual keys used to communicate with other individual nodes. Both of these approaches have scalability issues and allow a single compromised node to jeopardize the entire mobile node community. In this report, we propose replacing the centralized CA with a distributed CA whose responsibilities are shared among a set of select network nodes. To that end, we develop a protocol that relies on threshold cryptography to perform the fundamental CA duties in a distributed fashion. The protocol is meticulously defined and implemented in a series of detailed models. Using these models, mobile wireless scenarios were created on a communication simulator to test the protocol in an operational environment and to gather statistics on its scalability and performance.
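The report's protocol details are not reproduced here, but the threshold primitive that distributed-CA schemes of this kind typically build on is Shamir secret sharing, sketched below as a generic illustration (not the report's implementation): a CA signing secret is split so that any k of n nodes can jointly reconstruct it, while any fewer than k shares reveal nothing.

```python
import random

PRIME = 2**127 - 1  # Mersenne prime; all arithmetic is in GF(PRIME)

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it.
    A random degree-(k-1) polynomial with constant term `secret` is
    evaluated at x = 1..n (Shamir's scheme)."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field recovers
    the polynomial's constant term, i.e. the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # Modular inverse of den via Fermat's little theorem.
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(31337, k=3, n=5)   # 3-of-5 threshold
recovered = reconstruct(shares[:3])      # any 3 shares suffice
```

In a practical distributed CA the secret is never actually reassembled at one node; instead, each shareholder produces a partial signature that is combined, but the underlying k-of-n polynomial arithmetic is the same.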
SAR phase history data represents a polar array in the Fourier space of the scene being imaged. Polar format processing reformats the collected SAR data onto a Cartesian array for efficient processing and image formation. In a real-time system, this reformatting or ''re-gridding'' operation is the most processing intensive, consuming the majority of the processing time; it is also a source of error in the final image. Therefore, any effort that reduces processing time without degrading image quality is valuable. This document proposes a new way of implementing real-time polar format processing through a variation on the traditional interpolation/2-D Fast Fourier Transform (FFT) algorithm. The proposed change is based upon the frequency scaling property of the Fourier transform, which allows the interpolation to be performed after the azimuth FFT. A post-azimuth-FFT interpolation improves overall image quality and offers a potentially more efficient implementation of the polar format image formation process.
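The re-gridding step can be illustrated in one dimension (an illustrative toy, not the algorithm proposed in this document): a point target's phase history is sampled on a slightly warped frequency grid, standing in for the polar sampling, then interpolated onto a uniform grid and inverse-FFT'd, focusing the target at its expected bin.

```python
import numpy as np

N = 256
f_uniform = np.arange(N) / N                  # desired Cartesian grid
# Warped (non-uniform) collection grid, standing in for polar sampling.
f_polar = f_uniform + 0.002 * np.sin(6 * np.pi * f_uniform)
x0 = 20                                       # point-target position (bins)
history = np.exp(-2j * np.pi * f_polar * x0)  # samples as collected

# Re-grid by linear interpolation, real and imaginary parts separately.
regridded = (np.interp(f_uniform, f_polar, history.real)
             + 1j * np.interp(f_uniform, f_polar, history.imag))

image = np.abs(np.fft.ifft(regridded))
peak = int(np.argmax(image))                  # target focuses at bin 20
```

Skipping the interpolation and transforming the warped samples directly smears the target; real polar format processing performs this kind of re-gridding in both range and azimuth, which is why its cost dominates and why reordering it around the azimuth FFT matters.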
A series of three pressurized sulfuric acid decomposition tests was performed to (1) obtain data on the fraction of sulfuric acid catalytically converted to sulfur dioxide, oxygen, and water as a function of temperature and pressure, (2) demonstrate real-time measurements of acid conversion for use in process control, (3) obtain multiple measurements of conversion as a function of temperature within a single experiment, and (4) assess rapid quenching to minimize corrosion of metallic components by undecomposed acid. All four of these objectives were successfully accomplished. This report documents the completion of the NHI milestone on high-pressure H2SO4 decomposition tests for the Sulfur-Iodine (SI) thermochemical cycle project. All heated sections of the apparatus (i.e., the boiler, decomposer, and condenser) were fabricated from Hastelloy C276. A ceramic acid injection tube and a ceramic-sheathed thermocouple were used to minimize corrosion of the boiler surfaces by hot liquid acid. Negligible fracturing of the platinum-on-zirconia catalyst was observed in the high-temperature decomposer. Temperature measurements at the exit of the decomposer and at the entry of the condenser indicated that the hot acid vapors were rapidly quenched from about 400 C to less than 20 C within a 14 cm length of the flow path. Real-time gas flow rate measurements of the decomposition products provided a direct measurement of acid conversion. Pressure in the apparatus was preset by a pressure-relief valve that worked well at controlling the system pressure. However, these valves sometimes underwent abrupt transitions that resulted in rapidly varying gas flow rates with concomitant variations in the acid conversion fraction.
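The flow-rate-to-conversion relationship rests on simple stoichiometry: the overall decomposition H2SO4 -> SO2 + 1/2 O2 + H2O yields one mole of SO2 (and half a mole of O2) per mole of converted acid, so the measured SO2 flow over the acid feed rate gives the conversion fraction directly. A minimal sketch with illustrative numbers (not data from these tests):

```python
def conversion_fraction(acid_feed_mol_s, so2_mol_s):
    """Fraction of the fed H2SO4 that decomposed, from the measured SO2
    product flow: one mole of SO2 per mole of converted acid."""
    return so2_mol_s / acid_feed_mol_s

# Hypothetical reading: 0.010 mol/s of acid fed, 0.0062 mol/s SO2 seen.
x = conversion_fraction(0.010, 0.0062)   # 0.62, i.e. 62% conversion
# Stoichiometric consistency check: O2 flow should be half the SO2 flow.
expected_o2 = 0.0062 / 2
```

Comparing the measured O2 flow against this stoichiometric expectation is a quick sanity check that the gas analysis and flow measurements are mutually consistent.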
Semantic graphs offer one promising avenue for intelligence analysis in homeland security. They provide a mechanism for describing a wide variety of relationships between entities of potential interest. The vertices are nouns of various types, e.g., people, organizations, events, etc. Edges in the graph represent different types of relationships between entities, e.g., 'is friends with', 'belongs-to', etc. Semantic graphs offer a number of potential advantages as a knowledge representation system. They allow information of different kinds, and collected in differing ways, to be combined in a seamless manner. A semantic graph is a very compressed representation of the relationship information. It has been reported that a semantic graph can be two orders of magnitude smaller than the processed intelligence data. This allows much larger portions of the data universe to be resident in computer memory. Many intelligence queries that are relevant to the terrorist threat are naturally expressed in the language of semantic graphs. One example is the search for 'interesting' relationships between two individuals or between an individual and an event, which can be phrased as a search for short paths in the graph. Another example is the search for an analyst-specified threat pattern, which can be cast as an instance of subgraph isomorphism. It is important to note that many kinds of analysis are not relationship based, so these are not good candidates for semantic graphs. Thus, a semantic graph should always be used in conjunction with traditional knowledge representation and interface methods. Operations that involve looking for chains of relationships (e.g., friend of a friend) are not efficiently executable in a traditional relational database. However, the semantic graph can be thought of as a pre-join of the database, and it is ideally suited for these kinds of operations.
Researchers at Sandia National Laboratories are working to facilitate semantic graph analysis. Since intelligence datasets can be extremely large, the focus of this work is on the use of parallel computers. We have been working to develop scalable parallel algorithms that will be at the core of a semantic graph analysis infrastructure. Our work has involved two different thrusts, corresponding to two different computer architectures. The first architecture of interest is distributed-memory, message-passing computers. These machines are ubiquitous and affordable, but they are challenging targets for graph algorithms. Much of our distributed-memory work to date has been collaborative with researchers at Lawrence Livermore National Laboratory and has focused on finding short paths on distributed-memory parallel machines. Our implementation on 32K processors of BlueGene/L finds shortest paths between two specified vertices in just over a second for random graphs with 4 billion vertices.
The deployment of the Joint Technical Operations Team (JTOT) is evolving toward a lean and mobile response team. As a result, opportunities to support more rapid mobilization are being investigated. This study investigates three specific opportunities: (1) the potential of using standard firefighting equipment to support deployment of the aqueous foam concentrate (AFC-380); (2) the feasibility and needs for regional staging of equipment to reduce the inventory currently mobilized during a JTOT response; and (3) the feasibility and needs for development of the next-generation AFC-380 to reduce the volume of foam concentrate required for a response. This study supports the need to ensure that requirements for alternative deployment schemes are understood and in place to support improved response activities.
The Advanced Concepts Group of Sandia National Laboratories hosted a workshop, ''FOILFest: Community Enabled Security'', on July 18-21, 2005, in Albuquerque, NM. This was a far-reaching look into the future of physical protection, consisting of a series of structured brainstorming sessions focused on preventing and foiling attacks on public places and soft targets such as airports, shopping malls, hotels, and public events. These facilities are difficult to protect using traditional security devices, since they could easily be pushed out of business through the addition of arduous and expensive security measures. The idea behind this Fest was to explore how the public, which is vital to the function of these institutions, can be leveraged as part of a physical protection system. The workshop considered procedures, space design, and approaches for building community through technology. The workshop explored ways to make the ''good guys'' in public places feel safe and be vigilant while making potential perpetrators of harm feel exposed and convinced that they will not succeed. Participants in the Fest included operators of public places, social scientists, technology experts, representatives of government agencies including DHS and the intelligence community, and writers and media experts. Many innovative ideas were explored during the Fest, with most of the time spent on airports, including consideration of the local airport, the Albuquerque Sunport.
Some provocative ideas included: (1) sniffers installed in passage areas such as revolving doors and escalators; (2) a ''jumbotron'' showing current camera shots in the public space; (3) transparent portal screeners allowing viewing of the screening; (4) a layered open/funnel/open/funnel design, where open spaces are used to encourage a sense of ''communitas'' and take advantage of citizen ''sensing'', and funnels are technological tunnels of sensors (the tunnels of truth); (5) curved benches with blast-proof walls or backs; (6) making it easy for the public to report, even if the reporter is unsure or it is a ''non-event'' (e.g., ''I'm uncomfortable''), and processing those reports in aggregate rather than individually; (7) transforming the resident working population into a part-time undercover security/sensor force through more innovative training; and (8) adding ambassadors/security personnel who engage in unexpected conversation with the public. The group recommended pursuing the following ideas next: (a) a concept for a mobile sensor transport (JMP); (b) a follow-on workshop; (c) social experiments/activities to see how people would react to the concepts related to community and security; (d) further exploration of aesthetically pleasing, blast-resistant seating areas; and (e) the Art of Freedom (an educational, multi-media campaign).
The RoboHound™ Project was a three-year, multiphase project at Sandia National Laboratories to build and refine a working prototype trace explosive detection system as a tool for a commercial robot. The RoboHound system was envisioned as a tool for emergency responders to test suspicious items (i.e., packages or vehicles) for explosives while maintaining a safe distance. The project investigated combining Sandia's expertise in trace explosives detection with a wheeled robotic platform that could be programmed to interrogate suspicious items remotely for the presence of explosives. All of the RoboHound field tests were successful, especially with regard to the ability to collect and detect trace samples of RDX. The project progressed from remote sampling with human intervention to a fully automatic system that requires no human intervention until the robot returns from a sortie. A proposal is being made for additional work leading toward commercialization.
It is commonly believed that scale-free networks are robust to massive numbers of random node deletions. For example, Cohen et al. (1) study scale-free networks, including some that approximate the measured degree distribution of the Internet. Their results suggest that if each node in this network failed independently with probability 0.99, most of the remaining nodes would still be connected in a giant component. In this paper, we show that a large and important subclass of scale-free networks is not robust to massive numbers of random node deletions. In particular, we study scale-free networks that have a minimum node degree of 1 and a power-law degree distribution beginning with nodes of degree 1 (power-law networks). We show that, in a power-law network approximating the Internet's reported distribution, when the probability of deletion of each node is 0.5, only about 25% of the surviving nodes remain connected in a giant component, and the giant component does not persist beyond a critical failure rate of 0.9. The new result is partially due to improved analytical accommodation of the large number of degree-0 nodes that result after node deletions. Our results apply to power-law networks with a wide range of power-law exponents, including Internet-like networks. We give both analytical and empirical evidence that such networks are not generally robust to massive random node deletions.
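The qualitative claim can be checked with a small simulation. The sketch below is a generic configuration-model experiment with an assumed exponent of 2.1 (not the paper's exact construction or parameters): generate a power-law degree sequence with minimum degree 1, randomly pair degree stubs into edges, delete each node with a given failure probability, and measure the fraction of survivors in the largest connected component.

```python
import random

random.seed(1)

def power_law_degrees(n, alpha=2.1, dmin=1, dmax=1000):
    """Degree sequence with P(d) ~ d^-alpha and minimum degree 1,
    drawn by inverse-transform sampling."""
    degs = []
    while len(degs) < n:
        u = 1.0 - random.random()                    # u in (0, 1]
        d = int(dmin * u ** (-1.0 / (alpha - 1.0)))
        if d <= dmax:
            degs.append(d)
    if sum(degs) % 2:                                # need an even stub count
        degs[0] += 1
    return degs

def configuration_model(degs):
    """Random pairing of degree stubs (self-loops/multi-edges tolerated)."""
    stubs = [i for i, d in enumerate(degs) for _ in range(d)]
    random.shuffle(stubs)
    return list(zip(stubs[::2], stubs[1::2]))

def giant_fraction(n, edges, survives):
    """Fraction of surviving nodes in the largest component (union-find)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]            # path halving
            x = parent[x]
        return x
    for u, v in edges:
        if survives[u] and survives[v]:
            parent[find(u)] = find(v)
    sizes = {}
    for i in range(n):
        if survives[i]:
            r = find(i)
            sizes[r] = sizes.get(r, 0) + 1
    alive = sum(survives)
    return max(sizes.values()) / alive if alive else 0.0

n = 20000
edges = configuration_model(power_law_degrees(n))
fracs = {}
for p_fail in (0.0, 0.5, 0.9):
    survives = [random.random() >= p_fail for _ in range(n)]
    fracs[p_fail] = giant_fraction(n, edges, survives)
# The surviving giant-component fraction falls sharply as p_fail rises.
```

The abundance of degree-1 nodes is what drives the fragility: after deletions, many survivors are stranded in tiny fragments, which is the effect the paper's analysis accommodates.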
This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Advancements in Sensing and Perception using Structured Lighting Techniques''. There is an ever-increasing need for robust, autonomous ground vehicles for counterterrorism and defense missions. Although there has been nearly 30 years of government-sponsored research, it is undisputed that significant advancements in sensing and perception are still necessary. We developed an innovative, advanced sensing technology for national security missions serving the Department of Energy, the Department of Defense, and other government agencies. The principal goal of this project was to develop an eye-safe, robust, low-cost, lightweight, 3D structured lighting sensor for use in broad-daylight outdoor applications. The market for this technology is wide open due to the unavailability of such a sensor. Currently available laser scanners are slow, bulky, heavy, expensive, fragile, short-range, sensitive to vibration (highly problematic for moving platforms), and unreliable for outdoor use in bright sunlight. Eye-safety issues are a primary concern for currently available laser-based sensors. Passive stereo-imaging sensors are available for 3D sensing but suffer from several limitations: they are computationally intensive, require a lighted environment (a natural or man-made light source), and do not work for many scenes or regions lacking texture or with ambiguous texture. Our approach leveraged the advanced capabilities of modern CCD camera technology and Center 6600's expertise in 3D world modeling, mapping, and analysis using structured lighting. We have a diverse customer base for indoor mapping applications, and this research extends our current technology's lifecycle and opens a new market base for outdoor 3D mapping.
Applications include precision mapping, autonomous navigation, dexterous manipulation, surveillance and reconnaissance, part inspection, geometric modeling, laser-based 3D volumetric imaging, simultaneous localization and mapping (SLAM), aiding first responders, and supporting soldiers with helmet-mounted LADAR for 3D mapping in urban-environment scenarios. The technology developed in this LDRD overcomes the limitations of current laser-based 3D sensors and contributes to the realization of intelligent machine systems that reduce manpower needs.
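For reference, the depth measurement at the heart of any structured-lighting sensor is a triangulation between the camera and the projector. A minimal sketch under assumed geometry (camera at the origin looking along +z, projector offset by a known baseline along +x, axes parallel; all names and numbers hypothetical, not this project's calibration):

```python
import math

def depth_from_stripe(pixel_x, focal_px, baseline_m, proj_angle_rad):
    """Triangulated depth of a point lit by a projected stripe.

    `pixel_x` is the image column of the lit point, in pixels from the
    optical axis; `focal_px` is the camera focal length in pixels;
    the projector sits `baseline_m` along +x and casts a stripe plane
    at `proj_angle_rad` back toward the camera axis."""
    cam_angle = math.atan2(pixel_x, focal_px)        # camera ray angle
    # Camera ray: x = z*tan(cam_angle); stripe plane: baseline - x = z*tan(proj).
    # Solving for z:
    return baseline_m / (math.tan(cam_angle) + math.tan(proj_angle_rad))

# A point seen on the camera axis (pixel_x = 0), with a 0.1 m baseline
# and the stripe aimed to cross the axis 0.5 m out, triangulates to 0.5 m.
z = depth_from_stripe(0.0, 800.0, 0.1, math.atan(0.2))
```

Sweeping or coding the stripe angle across the scene turns this single-point computation into a full depth map, which is why a bright, eye-safe, sunlight-tolerant projector-camera pair is the key hardware problem the project addresses.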