Fast Z-pinch technology developed on the Z machine at Sandia National Laboratories can produce up to 230 TW of thermal x-ray power for applications in inertial confinement fusion (ICF) and weapons physics experiments. During implosion, these Z-pinches develop Rayleigh-Taylor (R-T) instabilities which are very difficult to diagnose and which degrade the overall pinch quality. The Power-Space-Time (PST) instrument is a newly configured diagnostic for measuring the pinch power as a function of both space and time in a Z-pinch. Positioned at 90 degrees from the Z-pinch axis, the PST provides a new capability for collecting experimental data on R-T characteristics and making meaningful comparisons to magneto-hydrodynamic computer models. This paper is a summary of the PST diagnostic design. By slit-imaging the Z-pinch x-ray emissions onto a linear scintillator/fiber-optic array coupled to a streak camera system, the PST can achieve ~100 µm spatial resolution and ~1.3 ns time resolution. Calculations indicate that a 20 µm thick scintillating detection element filtered by 1,000 Å of Al is theoretically linear in response to Planckian x-ray distributions corresponding to plasma temperatures from 40 eV to 150 eV. By calibrating this detection element to x-ray energies up to 5,000 eV, the PST can provide pinch power as a function of height and time in a Z-pinch for temperatures ranging from ~40 eV to ~400 eV. With these system parameters, the PST can provide data for an experimental determination of the R-T mode number, amplitude, and growth rate during the late-time pinch implosion.
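For reference, the Planckian x-ray distribution referred to above is the standard blackbody spectrum expressed as a function of photon energy; the form below uses generic notation and is not the specific PST response model.

    % Planckian spectral radiance per unit photon energy E at radiation temperature T
    \[
      B_E(E, T) \;=\; \frac{2 E^{3}}{h^{3} c^{2}}\,
      \frac{1}{\exp\!\left(E / k_{B} T\right) - 1}
    \]
    % With T quoted in eV, as in the 40-150 eV range above, k_B T is numerically just T.
    % The energy-integrated power follows the Stefan-Boltzmann T^4 scaling, so a detector
    % that responds linearly to Planckian spectra over this range tracks the pinch power.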
A performance evaluation of several computers was necessary, so an evaluation program, or benchmark, was run on each computer to determine its maximum possible performance. The program was used to test the Computer Aided Drafting (CAD) ability of each computer by monitoring the speed with which several functions were executed. The main objective of the benchmarking program was to record assembly loading times and image regeneration times and then compile a composite score that could be compared with the same tests on other computers. The three computers tested were the Compaq AP550, the SGI 230, and the Hewlett-Packard P750C. The Compaq and SGI computers each had a Pentium III 733 MHz processor, while the Hewlett-Packard had a Pentium III 750 MHz processor. The size and speed of Random Access Memory (RAM) in each computer varied, as did the type of graphics card. Each computer tested ran Windows NT 4.0 and the Pro/ENGINEER™ 2000i CAD benchmark software provided by the Standard Performance Evaluation Corporation (SPEC). The benchmarking program came with its own assembly, automatically loaded the assembly and ran tests on it, then compiled the time each test took to complete. Because the tests were automated, user error affecting test scores was virtually eliminated. After all the tests were completed, the scores were compiled and compared. The SGI 230 was by far the overall winner with a composite score of 8.57. The Compaq AP550 was next with a score of 5.19, while the Hewlett-Packard P750C performed poorly, achieving a score of 3.34. Several factors, including motherboard chipset, graphics card, and the size and speed of RAM, contributed to the differing scores of the three machines. Surprisingly, the Hewlett-Packard, which had the fastest processor, returned the lowest score; the factors above most likely contributed to its poor performance. Based on the results of the benchmark test, the SGI 230 appears to be the best CAD workstation of the three. The Hewlett-Packard most likely performed poorly because it was running only a 100 MHz Front Side Bus (FSB), while the SGI machine was running a 133 MHz FSB. The Compaq was using a new type of RAM called RDRAM. While this RAM was at first perceived to be a great performer, various benchmarks, including this one, have found that computers using RDRAM achieve only average performance.
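The abstract does not state how the composite score is computed. As a hedged illustration only, the sketch below assumes a SPEC-style composite formed from the geometric mean of reference-to-measured time ratios; the function name, test names, and reference times are hypothetical.

    import math

    def composite_score(measured_times, reference_times):
        """Geometric mean of reference/measured time ratios (higher is better).

        Both arguments map test names to elapsed seconds; a ratio above 1 means
        the machine under test beat the reference machine on that test.
        """
        ratios = [reference_times[name] / measured_times[name] for name in measured_times]
        return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

    # Hypothetical example: three timed tests on one machine (seconds).
    measured = {"assembly_load": 42.0, "shaded_regen": 18.5, "wireframe_regen": 9.2}
    reference = {"assembly_load": 60.0, "shaded_regen": 30.0, "wireframe_regen": 12.0}
    print(round(composite_score(measured, reference), 2))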
Event catalogs for seismic data can become very large. Furthermore, as researchers collect multiple catalogs and reconcile them into a single catalog that is stored in a relational database, the reconciled set becomes even larger. The sheer number of these events makes searching for relevant events to compare with events of interest problematic. Information overload in this form can lead to the data sets being under-utilized and/or used incorrectly or inconsistently. Thus, efforts have been initiated to research techniques and strategies for helping researchers to make better use of large data sets. In this paper, the authors present their efforts to do so in two ways: (1) the Event Search Engine, which is a waveform correlation tool, and (2) some content analysis tools, which are a combination of custom-built and commercial off-the-shelf tools for accessing, managing, and querying seismic data stored in a relational database. The current Event Search Engine is based on a hierarchical clustering tool known as the dendrogram tool, which is written as a MatSeis graphical user interface. The dendrogram tool allows the user to build dendrogram diagrams for a set of waveforms by controlling phase windowing, down-sampling, filtering, enveloping, and the clustering method (e.g., single linkage, complete linkage, flexible method). It also allows the clustering to be based on two or more stations simultaneously, which is important to bridge gaps in the sparsely recorded event sets anticipated in such a large reconciled event set. Current efforts are focusing on tools to help the researcher winnow the clusters defined using the dendrogram tool down to the minimum optimal identification set. This will become critical as the number of reference events in the reconciled event set continually grows. The dendrogram tool is part of the MatSeis analysis package, which is available on the Nuclear Explosion Monitoring Research and Engineering Program Web Site. As part of the research into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially aid in providing a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics can be identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the reference events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize, and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia.
All of these software tools would give researchers unprecedented power without requiring them to learn the intricacies and complexities of relational database systems.
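The dendrogram tool itself is a MatSeis graphical interface and is not reproduced here. Purely as an illustration of the underlying technique, the sketch below converts pairwise waveform correlation into a distance matrix and applies complete-linkage hierarchical clustering using SciPy; the waveform array is synthetic, a zero-lag correlation stands in for the full lag search, and all names are hypothetical.

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    def correlation_distances(waveforms):
        """Distance = 1 - normalized cross-correlation (zero lag, for brevity).

        waveforms: (n_events, n_samples) array of windowed, filtered traces.
        """
        unit = waveforms / np.linalg.norm(waveforms, axis=1, keepdims=True)
        dist = 1.0 - unit @ unit.T
        np.fill_diagonal(dist, 0.0)
        return dist

    # Synthetic stand-in for a set of windowed event waveforms.
    rng = np.random.default_rng(0)
    waves = rng.standard_normal((5, 200))

    condensed = squareform(correlation_distances(waves), checks=False)
    tree = linkage(condensed, method="complete")      # complete linkage, as in the text
    clusters = fcluster(tree, t=0.6, criterion="distance")
    print(clusters)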
The Federal Aviation Administration Airworthiness Assurance NDI Validation Center currently assesses the capability of various non-destructive inspection (NDI) methods used for analyzing aircraft components. The focus of one such exercise is to evaluate the sensitivity of fluorescent liquid penetrant inspection. A baseline procedure using the water-washable fluorescent penetrant method provides a foundation for comparing the brightness of low-cycle fatigue cracks in titanium test panels. The analysis of deviations from the baseline procedure will determine an acceptable range of operation for each step in the inspection process. The data also give insight into the depth of each crack and into which step(s) of the inspection process most affect penetrant sensitivity. A set of six low-cycle fatigue cracks produced in 6.35-mm-thick Ti-6Al-4V specimens was used to conduct the experiments that produced the sensitivity data. The results will document the consistency of the crack readings and will be compared with previous experiments to find the best parameters for the water-washable penetrant.
This report provides (1) an overview of all tracer testing conducted in the Culebra Dolomite Member of the Rustler Formation at the Waste Isolation Pilot Plant (WIPP) site, (2) a detailed description of the important information about the 1995-96 tracer tests and the current interpretations of the data, and (3) a summary of the knowledge gained to date through tracer testing in the Culebra. Tracer tests have been used to identify transport processes occurring within the Culebra and quantify relevant parameters for use in performance assessment of the WIPP. The data, especially those from the tests performed in 1995-96, provide valuable insight into transport processes within the Culebra. Interpretations of the tracer tests in combination with geologic information, hydraulic-test information, and laboratory studies have resulted in a greatly improved conceptual model of transport processes within the Culebra. At locations where the transmissivity of the Culebra is low (< 4 × 10⁻⁶ m²/s), we conceptualize the Culebra as a single-porosity medium in which advection occurs largely through the primary porosity of the dolomite matrix. At locations where the transmissivity of the Culebra is high (> 4 × 10⁻⁶ m²/s), we conceptualize the Culebra as a heterogeneous, layered, fractured medium in which advection occurs largely through fractures and solutes diffuse between fractures and matrix at multiple rates. The variations in diffusion rate can be attributed to both variations in fracture spacing (or the spacing of advective pathways) and matrix heterogeneity. Flow and transport appear to be concentrated in the lower Culebra. At all locations, diffusion is the dominant transport process in the portions of the matrix that tracer does not access by flow.
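As a compact statement of the fracture-flow-with-matrix-diffusion concept described above (generic textbook notation, not the WIPP performance-assessment formulation), the standard double-porosity transport equations take the form:

    % Advection-dispersion along a fracture with diffusive exchange into the matrix:
    %   C_f = fracture concentration, C_m = matrix concentration, v = advective velocity,
    %   D_L = longitudinal dispersion, D_m = matrix pore diffusion coefficient,
    %   phi = matrix porosity, b = fracture half-aperture,
    %   x = distance along the fracture, z = distance into the matrix.
    \[
      \frac{\partial C_f}{\partial t}
        = -\,v \frac{\partial C_f}{\partial x}
        + D_L \frac{\partial^{2} C_f}{\partial x^{2}}
        + \frac{\phi D_m}{b}\left.\frac{\partial C_m}{\partial z}\right|_{z=0},
      \qquad
      \frac{\partial C_m}{\partial t} = D_m \frac{\partial^{2} C_m}{\partial z^{2}}
    \]
    % The multirate behavior noted above generalizes the single matrix block to a
    % distribution of block sizes and hence a distribution of diffusion rates.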
The Transportation Surety Center, 6300, has been conducting ongoing research into and development of information systems for the Configurable Transportation Security and Information Management System (CTSS) project. CTSS takes an object-oriented framework approach that uses component-based software development to facilitate rapid deployment of new systems while improving software cost containment, development reliability, compatibility, and extensibility. The direction has been to develop a Fleet Management System (FMS) framework using object-oriented technology. The goal of the current development is to provide a software and hardware environment that will demonstrate and support object-oriented development in the FMS Central Command Center and Vehicle domains.
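The abstract describes the approach only at a high level. As an illustrative sketch of the general component-based framework pattern, not the actual CTSS or FMS design, the following registers interchangeable components behind a common interface; every class and method name is hypothetical.

    from abc import ABC, abstractmethod

    class Component(ABC):
        """Common interface that every pluggable component implements."""

        @abstractmethod
        def handle(self, message: dict) -> dict: ...

    class Framework:
        """Minimal framework core: registers components and routes messages to them."""

        def __init__(self):
            self._components = {}

        def register(self, name: str, component: Component) -> None:
            self._components[name] = component

        def dispatch(self, name: str, message: dict) -> dict:
            return self._components[name].handle(message)

    class VehicleTracker(Component):
        """Hypothetical vehicle-domain component."""

        def handle(self, message: dict) -> dict:
            return {"vehicle_id": message["vehicle_id"], "status": "tracked"}

    fw = Framework()
    fw.register("tracker", VehicleTracker())
    print(fw.dispatch("tracker", {"vehicle_id": 42}))

In this pattern, swapping one component implementation for another requires no change to the framework core, which is the extensibility argument made above.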
High-quality ultrathin poly(diacetylene) (PDA) films were produced using a horizontal Langmuir deposition technique. The resultant films exhibit strong friction anisotropy that is correlated with the direction of the polymer backbone. Shear forces applied by atomic force microscope (AFM) or near-field scanning optical microscope (NSOM) tips locally induced the blue-to-red chromatic transition in the PDA films.
Interfacial Force Microscopy (IFM) is a scanning probe technique that employs a force-feedback sensor concept. This article discusses a few examples of IFM applications to polymer surfaces. Through these examples, the ability of IFM to obtain quantitative information on interfacial forces in a controllable manner is demonstrated.
This report focuses on Sandia National Laboratories' effort to create high-temperature logging tools for geothermal applications without the need for heat shielding. One of the principal failure mechanisms in conventional downhole tools is temperature: such tools can survive only a limited number of hours in high-temperature environments. For the first time since the evolution of integrated circuits, components are now commercially available that are qualified to 225 °C, with many continuing to work up to 300 °C. These components are primarily based on Silicon-On-Insulator (SOI) technology. Sandia has developed and tested a simple data logger based on this technology that operates up to 300 °C without thermal protection, with a few limiting components operating only to 250 °C. An actual well log to 240 °C without shielding is discussed. The first prototype high-temperature tool measures pressure and temperature using a wire-line for power and communication. The tool is based around the HT83C51 microcontroller. A brief discussion of the background and status of the High Temperature Instrumentation program at Sandia, its objectives, the data logger development, and future project plans is given.
Combinatorial chemistry is a powerful new technology in drug design and molecular recognition. It is a wet-laboratory methodology aimed at "massively parallel" screening of chemical compounds for the discovery of compounds that have a certain biological activity. The power of the method comes from the interaction between experimental design and computational modeling. Principles of "rational" drug design are used in the construction of combinatorial libraries to speed up the discovery of lead compounds with the desired biological activity. This paper presents algorithms, software development, and computational complexity analysis for problems arising in the design of combinatorial libraries for drug discovery. The authors provide exact polynomial-time algorithms and intractability results for several inverse problems, formulated as (chemical) graph reconstruction problems, related to the design of combinatorial libraries. These are the first rigorous algorithmic results in the literature for this class of problems. The authors also present results obtained with the combinatorial chemistry software package OCOTILLO for combinatorial peptide design using real data libraries. The package provides exact solutions for general inverse problems based on shortest-path topological indices. The results are superior both in accuracy and computing time to the best software results reported in the literature. For 5-peptoid design, the computation is rigorously reduced to an exhaustive search of about 2% of the search space; the exact solutions are found in a few minutes.
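The OCOTILLO algorithms themselves are not reproduced here. Purely to illustrate what a shortest-path topological index is, the sketch below computes the Wiener index, the sum of shortest-path distances over all atom pairs, for a small molecular graph; the example graph and function name are hypothetical.

    from collections import deque

    def wiener_index(adjacency):
        """Sum of shortest-path distances over all unordered vertex pairs.

        adjacency: dict mapping each atom (vertex) to a list of bonded neighbors.
        """
        total = 0
        for source in adjacency:
            # Breadth-first search gives shortest path lengths in an unweighted graph.
            dist = {source: 0}
            queue = deque([source])
            while queue:
                u = queue.popleft()
                for v in adjacency[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            total += sum(dist.values())
        return total // 2  # each unordered pair was counted twice

    # Hypothetical example: the carbon skeleton of n-butane, C1-C2-C3-C4.
    butane = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
    print(wiener_index(butane))  # 10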
The design of field emission displays is severely constrained by the universally poor cathodoluminescence (CL) efficiency of most phosphors at low excitation energies. As part of the effort to understand this phenomenon, the authors have measured the time decay of spectrally resolved, pulsed CL and photoluminescence (PL) in several phosphors activated by rare earth and transition metal impurities, including Y₂O₃:Eu, Y₂SiO₅:Tb, and Zn₂SiO₄:Mn. Activator concentrations ranged from ~0.25 to 10%. The CL decay curves are always non-linear on a log(CL) versus linear(time) plot; that is, they deviate from first-order decay kinetics. These deviations are always more pronounced at short times and larger activator concentrations and are largest at low beam energies, where the decay rates are noticeably faster. PL decay is always slower than that seen for CL, but these differences disappear after most of the excited species have decayed. The authors have also measured the dependence of steady-state CL efficiency on beam energy. They find that larger activator concentrations accelerate the drop in CL efficiency seen at low beam energies. These effects are largest for the activators that interact more strongly with the host lattice. While activator-activator interactions are known to limit PL and CL efficiency in most phosphors, the present data suggest that a more insidious version of this mechanism is partly responsible for poor CL efficiency at low beam energies. This enhanced concentration quenching is due to the interaction of nearby excited activators. These interactions can lead to non-radiative activator decay and hence lower steady-state CL efficiency. Excited-state clustering, which may be caused by the large energy loss rate of low-energy primary electrons, appears to enhance these interactions. In support of this idea, the authors find that PL decays obtained at high laser pulse energies resemble the non-linear decays seen in the CL data.
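For context on the first-order baseline against which the decay curves are compared (standard textbook form, not a fit to the reported data):

    % First-order (single-exponential) decay of luminescence intensity:
    \[
      I(t) = I_{0}\,e^{-t/\tau}
      \quad\Longrightarrow\quad
      \ln I(t) = \ln I_{0} - \frac{t}{\tau}
    \]
    % i.e. a straight line on a log(intensity) versus linear(time) plot. Curvature on
    % such a plot therefore signals a departure from first-order kinetics, as reported
    % for the CL decay curves above.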
Studies of the influences of temperature, hydrostatic pressure, dc biasing field, and frequency on the dielectric constant (ε′) and loss (tan δ) of single-crystal [Pb(Zn₁/₃Nb₂/₃)O₃]₀.₉₀₅(PbTiO₃)₀.₀₉₅, or PZN-9.5PT for short, have provided a detailed view of the ferroelectric (FE) response and phase transitions of this technologically important material. While at 1 bar the crystal exhibits on cooling a cubic-to-tetragonal FE transition followed by a second transition to a rhombohedral phase, pressure induces an FE-to-relaxor crossover, with the relaxor phase becoming the ground state at pressures ≥ 5 kbar. Analogy with earlier results suggests that this crossover is a common feature of compositionally disordered soft-mode ferroelectrics and can be understood in terms of a decrease in the correlation length among polar domains with increasing pressure. Application of a dc biasing electric field at 1 bar strengthens FE correlations and can, at high pressure, re-stabilize the FE response. The pressure-temperature-electric field phase diagram was established. In the absence of dc bias the tetragonal phase vanishes at high pressure, and the crystal exhibits classic relaxor behavior. The dynamics of dipolar motion and the strong deviation from Curie-Weiss behavior of the susceptibility in the high-temperature cubic phase are discussed.
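For reference, the Curie-Weiss behavior from which the susceptibility is reported to deviate has the standard form below (generic notation, not a fit to the PZN-9.5PT data):

    % Curie-Weiss law for the dielectric constant in the paraelectric (cubic) phase:
    %   C = Curie constant, T_0 = Curie-Weiss temperature.
    \[
      \varepsilon'(T) \;=\; \varepsilon_{\infty} + \frac{C}{T - T_{0}}, \qquad T > T_{0}
    \]
    % Relaxor ferroelectrics instead show a broad, frequency-dependent maximum in
    % epsilon' and deviate from this 1/(T - T_0) dependence well above the transition.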