This paper introduces approaches that combine micro/nanomolding, or nanoimprinting, techniques with proximity optical phase mask lithographic methods to form three dimensional (3D) nanostructures in thick, transparent layers of photopolymers. The results demonstrate three strategies of this type, where molded relief structures in these photopolymers represent (i) fine (<1 μm) features that serve as the phase masks for their own exposure, (ii) coarse features (>1 μm) that are used with phase masks to provide access to large structure dimensions, and (iii) fine structures that are used together with phase masks to achieve large, multilevel phase modulations. Several examples are provided, together with optical modeling of the fabrication process and the transmission properties of some of the fabricated structures. Finally, these approaches provide capabilities in 3D fabrication that complement those of other techniques, with potential applications in photonics, microfluidics, drug delivery and other areas.
Since sensitivity to contamination is one of the verities of solid state joining, there is a need to assess contamination of the part(s) to be joined, preferably nondestructively and while it can still be remedied. Because the surfaces that are joined in pinch welds are inaccessible and thus present a greater challenge, most of the discussion concerns the search for the origin and effect of contamination on pinch welding and ways to detect and mitigate it. An example of contamination, and the investigation and remediation of such a system, is presented. Suggestions are made for techniques for nondestructive evaluation of contamination of surfaces for other solid state welds as well as for pinch welds. Surfaces that have good visual access are amenable to inspection by diffuse reflection infrared Fourier transform (DRIFT) spectroscopy. Although other techniques are useful for specific classes of contaminants (such as hydrocarbons), DRIFT can be used for most classes of contaminants. Surfaces such as the interior of open tubes or stems that are to be pinch welded can be inspected using infrared reflection spectroscopy. It remains to be demonstrated whether this tool can detect graphite-based contamination, which has been seen in stems. For tubes with one closed end, the technique that should be investigated is emission infrared spectroscopy.
This paper considers the fundamentals of what happens in a solid when it is impacted by a medium-energy gallium ion. The study of the ion/sample interaction at the nanometer scale is applicable to most focused ion beam (FIB)–based work even if the FIB/sample interaction is only a step in the process, for example, micromachining or microelectronics device processing. Whereas the objective in other articles in this issue is to use the FIB tool to characterize a material or to machine a device or transmission electron microscopy sample, the goal in this article is for the FIB/sample interaction itself to become the product. To that end, the FIB/sample interaction is considered in three categories according to geometry: below, at, and above the surface. First, the FIB ions can penetrate the top atom layer(s) and interact below the surface. Ion implantation and ion damage on flat surfaces have been comprehensively examined; however, FIB applications require the further investigation of high doses in three-dimensional profiles. Second, the ions can interact at the surface, where a morphological instability can lead to ripples and surface self-organization, which can depend on boundary conditions for site-specific and compound FIB processing. Third, the FIB may interact above the surface (and/or produce secondary particles that interact above the surface). Such ion beam–assisted deposition, known as FIB–CVD (chemical vapor deposition), enables elaborate three-dimensional structures when the FIB is used with a gas injection system. Finally, at the nanometer scale, these three regimes—below, at, and above the surface—can require an interdependent understanding to be judiciously controlled by the FIB.
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
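As a concrete illustration of the kind of descriptive statistics on interval data that the report describes, the short Python sketch below bounds the sample mean and the median of a small, hypothetical set of interval measurements. The data values, variable names, and the simple endpoint argument are illustrative assumptions, not material taken from the report; bounding the variance and other statistics generally requires the more careful algorithms the report summarizes.

import numpy as np

# Hypothetical interval data: each measurement is a [lower, upper] pair.
data = np.array([
    [1.9, 2.3],
    [2.4, 2.6],
    [1.7, 2.9],
    [2.1, 2.2],
    [2.0, 2.8],
])
lo, hi = data[:, 0], data[:, 1]

# The sample mean of interval data is itself an interval: its endpoints
# are the means of the lower and upper endpoints, respectively.
mean_interval = (lo.mean(), hi.mean())

# The median (or any percentile) can be bounded the same way, because
# the order statistics are monotone in each individual value.
median_interval = (np.median(lo), np.median(hi))

print("mean lies in  ", mean_interval)
print("median lies in", median_interval)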
This paper analyzes three simulation architectures in the context of modeling scalability to address System of System (SoS) and Complex System problems. The paper first provides an overview of the SoS problem domain and reviews past work in analyzing model and general system complexity issues. It then identifies and explores the issues of vertical and horizontal integration, as well as coupling and hierarchical decomposition, as the system characteristics and metrics against which the tools are evaluated. In addition, it applies Nam Suh's Axiomatic Design theory as a construct for understanding coupling and its relationship to system feasibility. Next, it describes the application of MATLAB, Swarm, and Umbra (three modeling and simulation approaches) to modeling swarms of Unmanned Aerial Vehicle (UAV) agents in relation to the chosen characteristics and metrics. Finally, it draws general conclusions for analyzing model architectures beyond those examined here. In particular, it identifies decomposition along phenomena of interaction and modular system composition as enabling features for modeling large heterogeneous complex systems.
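For readers unfamiliar with the Axiomatic Design construct mentioned above, the brief Python sketch below classifies a hypothetical design matrix (mapping design parameters to functional requirements) as uncoupled, decoupled, or coupled. The matrix, the function name classify_design, and the simplified triangularity test are illustrative assumptions; they are not drawn from the paper's UAV swarm models.

import numpy as np

# Hypothetical design matrix A in FR = A @ DP, relating functional
# requirements (rows) to design parameters (columns).
A = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [0, 1, 1],
], dtype=float)

def classify_design(A, tol=1e-12):
    # Uncoupled: only diagonal entries are nonzero.
    if np.allclose(A - np.diag(np.diag(A)), 0, atol=tol):
        return "uncoupled"
    # Decoupled: triangular as given (a fuller test would also search
    # over row/column permutations).
    if np.allclose(np.triu(A, 1), 0, atol=tol) or np.allclose(np.tril(A, -1), 0, atol=tol):
        return "decoupled"
    return "coupled"

print(classify_design(A))   # -> "decoupled" for the hypothetical matrix above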
Weak link (WL)/strong link (SL) systems constitute important parts of the overall operational design of high consequence systems, with the SL system designed to permit operation of the system only under intended conditions and the WL system designed to prevent the unintended operation of the system under accident conditions. If, under accident conditions, the system degrades into a state in which the WLs have not deactivated the system and the SLs have failed, in the sense that they are in a configuration that could permit operation of the system, the result is referred to as loss of assured safety. The probability of such degradation conditional on a specific set of accident conditions is referred to as probability of loss of assured safety (PLOAS). Previous work has developed computational procedures for the calculation of PLOAS under fire conditions for a system involving multiple WLs and SLs, under the assumption that a link fails instantly when it reaches its failure temperature. Extensions of these procedures are obtained for systems in which there is a temperature-dependent delay between the time at which a link reaches its failure temperature and the time at which that link actually fails.
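A minimal Monte Carlo sketch of a PLOAS-style calculation is given below, assuming a hypothetical exponential-rise fire temperature trajectory, notional normal failure-temperature distributions, a notional temperature-dependent delay, and one illustrative ordering (loss of assured safety when the first SL failure precedes the first WL failure). The actual computational procedures, link models, and distributions in the work summarized above differ; every number and function here is a placeholder.

import numpy as np

rng = np.random.default_rng(0)

def crossing_time(T_fail, ambient=25.0, peak=800.0, tau=600.0):
    # Time (s) at which an assumed exponential-rise fire trajectory
    # T(t) = ambient + (peak - ambient) * (1 - exp(-t / tau))
    # first reaches T_fail; infinite if it never does.
    frac = np.clip((np.asarray(T_fail, dtype=float) - ambient) / (peak - ambient), 0.0, None)
    with np.errstate(divide="ignore", invalid="ignore"):
        t = -tau * np.log1p(-frac)
    return np.where(frac >= 1.0, np.inf, t)

def failure_delay(T_fail):
    # Hypothetical temperature-dependent delay (s) between reaching the
    # failure temperature and actual link failure; shorter at higher T.
    return 30.0 * np.exp(-(np.asarray(T_fail, dtype=float) - 25.0) / 300.0)

def estimate_ploas(n_wl=2, n_sl=2, n_samples=100_000):
    # Notional failure-temperature distributions (deg C) for the links.
    wl_T = rng.normal(250.0, 20.0, size=(n_samples, n_wl))
    sl_T = rng.normal(450.0, 40.0, size=(n_samples, n_sl))

    wl_fail = crossing_time(wl_T) + failure_delay(wl_T)
    sl_fail = crossing_time(sl_T) + failure_delay(sl_T)

    # Illustrative ordering: loss of assured safety when the first SL
    # failure precedes the first WL failure.
    loss = sl_fail.min(axis=1) < wl_fail.min(axis=1)
    return loss.mean()

print(f"Estimated PLOAS: {estimate_ploas():.4f}")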
This report summarizes the deliberations and conclusions of the Workshop on Programming Languages for High Performance Computing (HPCWPL) held at the Sandia CSRI facility in Albuquerque, NM on December 12-13, 2006.
As engineering challenges grow in the ever-shrinking world of nano-design, methods of making dynamic measurements of nano-materials and systems become more important. The Doppler electron velocimeter (DEV) is a new measurement concept motivated by the increasing importance of nano-dynamics. Nano-dynamics is defined in this context as any phenomenon that causes a dynamically changing phase in an electron beam, including traditional mechanical motion as well as other phenomena such as changing magnetic and electric fields. The DEV is, at this point, only a theoretical device. Finally, this article highlights the importance of pursuing nano-dynamics and presents a case that the electron microscope and its associated optics are a viable test bed for developing this new measurement tool.
Several groups of plastic molded CD4011s were electrically tested as part of an Army dormant storage program. These parts had been in storage in missile containers for 4.5 years, and were electrically tested annually. Eight of the parts (out of 1200) failed the electrical tests and were subsequently analyzed to determine the cause of the failures. The root cause was found to be corrosion of the unpassivated Al bondpads. No significant attack of the passivated Al traces was found. Seven of the eight failures occurred in parts stored on a pre-position ship (the Jeb Stuart), suggesting a link between the external environment and observed corrosion.
As the capabilities of numerical simulations increase, decision makers are increasingly relying upon simulations rather than experiments to assess risks across a wide variety of accident scenarios, including fires. There are still, however, many aspects of fires that are either not well understood or are difficult to treat from first principles because of the computational expense. For a simulation to be truly predictive and to provide decision makers with information that can be reliably used for risk assessment, the remaining physical processes must be studied and suitable models developed for their effects. The model for the fuel evaporation rate in a liquid fuel pool fire is significant because, in well-ventilated fires, the evaporation rate largely controls the total heat release rate from the fire. A set of experiments is outlined in this report that will provide data for the development and validation of models for the fuel regression rates in liquid hydrocarbon fuel fires. The experiments will be performed on fires in the fully turbulent scale range (> 1 m diameter) and with a number of hydrocarbon fuels ranging from lightly sooting to heavily sooting. The importance of spectral absorption in the liquid fuels and the vapor dome above the pool will be investigated, and the total heat flux to the pool surface will be measured. The importance of convection within the liquid fuel will be assessed by restricting large-scale liquid motion in some tests. These data sets will provide a sound, experimentally proven basis for assessing how much of the liquid fuel needs to be modeled to enable a predictive simulation of a fuel fire, given the couplings between evaporation of fuel from the pool and the heat release from the fire, which drives the evaporation.
Narasimhan Consulting Services, Inc. (NCS), under a contract with the Sandia National Laboratories (SNL), designed and operated pilot-scale evaluations of the adsorption and coagulation/filtration treatment technologies aimed at meeting the recently revised arsenic maximum contaminant level (MCL) for drinking water. The standard of 10 µg/L (10 ppb) is effective as of January 2006. The pilot demonstration is a project of the Arsenic Water Technology Partnership program, a partnership between the American Water Works Association Research Foundation (AwwaRF), SNL and WERC (A Consortium for Environmental Education and Technology Development). The pilot evaluation was conducted at Well No. 30 of the City of Weatherford, OK, which supplies drinking water to a population of more than 10,400. Well water contained arsenic in the range of 16 to 29 ppb during the study. Four commercially available adsorption media were evaluated side by side for a period of three months. Both adsorption and coagulation/filtration effectively reduced arsenic from Well No. 30. A preliminary economic analysis indicated that adsorption using iron oxide media was more cost-effective than the coagulation/filtration technology.
This tutorial is aimed at guiding a user through the process of performing a cable SGEMP simulation. The tutorial starts with processing a differential photon spectrum obtained from a Monte Carlo code such as ITS into a discrete (multi-group) spectrum used in CEPXS and CEPTRE. Guidance is given in the creation of a finite element mesh of the cable geometry. The set-up of a CEPTRE simulation is detailed. Users are instructed in evaluating the quality of the CEPTRE radiation transport results. The post-processing of CEPTRE results using Exostrip is detailed. Finally, an EMPHASIS/CABANA simulation is detailed, including the interpretation of the output.
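To illustrate the first step of that workflow, collapsing a differential photon spectrum into a multi-group spectrum, the Python sketch below integrates a hypothetical differential spectrum over assumed energy-group boundaries. The spectrum shape, group boundaries, and units are placeholders; they do not represent actual ITS output or the input formats expected by CEPXS and CEPTRE.

import numpy as np

# Hypothetical differential photon spectrum, photons / (cm^2 * MeV).
energies = np.linspace(0.01, 2.0, 400)        # MeV
dphi_dE = np.exp(-energies / 0.3) / 0.3

# Assumed multi-group energy boundaries (MeV), lowest to highest.
group_bounds = np.array([0.01, 0.1, 0.3, 0.6, 1.0, 2.0])

group_flux = np.empty(len(group_bounds) - 1)
for g in range(len(group_bounds) - 1):
    lo, hi = group_bounds[g], group_bounds[g + 1]
    mask = (energies >= lo) & (energies <= hi)
    # Integrate the differential spectrum over the group to obtain the
    # group-integrated flux (photons / cm^2).
    group_flux[g] = np.trapz(dphi_dE[mask], energies[mask])

for g, phi in enumerate(group_flux):
    print(f"group {g}: {group_bounds[g]:.2f}-{group_bounds[g + 1]:.2f} MeV  flux = {phi:.4e}")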
A standard approach to cross-language information retrieval (CLIR) uses Latent Semantic Analysis (LSA) in conjunction with a multilingual parallel aligned corpus. This approach has been shown to be successful in identifying similar documents across languages - or more precisely, retrieving the most similar document in one language to a query in another language. However, the approach has severe drawbacks when applied to a related task, that of clustering documents 'language-independently', so that documents about similar topics end up closest to one another in the semantic space regardless of their language. The problem is that documents are generally more similar to other documents in the same language than they are to documents on the same topic in a different language. As a result, when using multilingual LSA, documents will in practice cluster by language, not by topic. We propose a novel application of PARAFAC2 (which is a variant of PARAFAC, a multi-way generalization of the singular value decomposition [SVD]) to overcome this problem. Instead of forming a single multilingual term-by-document matrix which, under LSA, is subjected to SVD, we form an irregular three-way array, each slice of which is a separate term-by-document matrix for a single language in the parallel corpus. The goal is to compute an SVD for each language such that V (the matrix of right singular vectors) is the same across all languages. Effectively, PARAFAC2 imposes the constraint, not present in standard LSA, that the 'concepts' in all documents in the parallel corpus are the same regardless of language. Intuitively, this constraint makes sense, since the whole purpose of using a parallel corpus is that exactly the same concepts are expressed in the translations. We tested this approach by comparing the performance of PARAFAC2 with standard LSA in solving a particular CLIR problem. From our results, we conclude that PARAFAC2 offers a very promising alternative to LSA not only for multilingual document clustering, but also for solving other problems in cross-language information retrieval.
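The sketch below illustrates the standard multilingual LSA baseline described above on a tiny, hypothetical parallel corpus, with comments noting where PARAFAC2 differs. The corpus, function names, and the choice of k are illustrative assumptions; the experiments reported in the paper used different data and software.

import numpy as np

# Tiny hypothetical parallel corpus: the same three documents in English
# and (loosely) in Spanish.
corpus_en = ["water arsenic treatment", "ion beam deposition", "arsenic removal media"]
corpus_es = ["tratamiento de agua con arsenico", "deposicion por haz de iones", "medios de remocion de arsenico"]

def term_doc_matrix(corpus):
    vocab = sorted({w for doc in corpus for w in doc.split()})
    index = {w: i for i, w in enumerate(vocab)}
    X = np.zeros((len(vocab), len(corpus)))
    for j, doc in enumerate(corpus):
        for w in doc.split():
            X[index[w], j] += 1.0
    return X, vocab

X_en, vocab_en = term_doc_matrix(corpus_en)
X_es, vocab_es = term_doc_matrix(corpus_es)

# Standard multilingual LSA: stack the per-language matrices into one
# term-by-document matrix and take a single truncated SVD.
X = np.vstack([X_en, X_es])
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
Uk, sk, docs = U[:, :k], s[:k], Vt[:k, :].T   # docs: document coordinates

def fold_in(query, vocab, offset):
    # Project a query (in one language) into the shared semantic space.
    v = np.zeros(X.shape[0])
    for w in query.split():
        if w in vocab:
            v[offset + vocab.index(w)] += 1.0
    return (v @ Uk) / sk

q = fold_in("arsenic water", vocab_en, offset=0)   # English query
sims = docs @ q / (np.linalg.norm(docs, axis=1) * np.linalg.norm(q) + 1e-12)
print("closest document:", int(np.argmax(sims)))

# PARAFAC2, by contrast, factors each language slice separately while
# constraining the document-mode factors to be identical across slices,
# which is what keeps translated documents close together.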