Current technology cuts solar Si wafers by a wire saw process, resulting in 50% 'kerf' loss when machining silicon from a boule or brick into a wafer. We want to develop a kerf-free laser wafering technology that promises to eliminate such wasteful wire saw processes and achieve up to a ten-fold decrease in polysilicon usage, measured in g/Wp (grams per peak watt), from the starting polysilicon material. Compared to today's technology, this will also reduce costs (~20%), embodied energy, and greenhouse gas (GHG) emissions (~50%). We will use short-pulse laser illumination sharply focused by a solid immersion lens to produce subsurface damage in silicon such that wafers can be mechanically cleaved from a boule or brick. For this concept to succeed, we will need to develop optics, lasers, cleaving, and high-throughput processing technologies capable of producing wafers with thicknesses < 50 µm at high throughput (< 10 sec/wafer). Wafer thickness scaling is the 'Moore's Law' of silicon solar. Our concept will allow solar manufacturers to skip entire generations of scaling and achieve grid parity with commercial electricity rates. Yet this idea is largely untested, and a simple demonstration is needed to provide credibility for a larger-scale research and development program. The purpose of this project is to lay the groundwork to demonstrate the feasibility of laser wafering: first, to design and procure an optical train suitable for producing subsurface damage in silicon with the required damage and stress profile to promote lateral cleavage of silicon; second, to use an existing laser to produce subsurface damage in silicon; and third, to characterize the damage using scanning electron microscopy and confocal Raman spectroscopy mapping.
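As a rough illustration of the polysilicon-usage arithmetic behind the g/Wp metric (the wafer thickness, kerf width, and cell efficiency in this sketch are generic assumptions, not project results):

```python
# Back-of-the-envelope g/Wp comparison for wire-sawed vs. kerf-free wafers.
# All numbers below are illustrative assumptions, not measured project data.

RHO_SI = 2.33                # g/cm^3, density of crystalline silicon
CELL_POWER_PER_CM2 = 0.018   # Wp/cm^2 (~18% efficient cell at 1 sun), assumed

def grams_per_watt(wafer_um, kerf_um):
    """Silicon consumed per peak watt for a given wafer and kerf thickness."""
    consumed_cm = (wafer_um + kerf_um) * 1e-4   # silicon used per wafer, cm
    grams_per_cm2 = RHO_SI * consumed_cm        # g of Si per cm^2 of wafer
    return grams_per_cm2 / CELL_POWER_PER_CM2   # g/Wp

# Today: ~160 um wafer plus ~160 um kerf (about 50% loss in the saw).
print("wire saw  :", round(grams_per_watt(160, 160), 2), "g/Wp")
# Kerf-free laser wafering of ~50 um wafers, no sawed-away material.
print("kerf-free :", round(grams_per_watt(50, 0), 2), "g/Wp")
```

Under these assumed numbers the kerf-free case uses roughly one sixth of the silicon per peak watt; thinner wafers and reduced handling losses would push the ratio further toward the ten-fold target.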
The annual program report provides detailed information about all aspects of the SNL/California Environmental Monitoring Program. It functions as supporting documentation to the SNL/California Environmental Management System Program Manual. The 2010 program report describes the activities undertaken during the previous year and the activities planned in future years to implement the Environmental Monitoring Program, one of six programs that support environmental management at SNL/California.
The Lightweight Integrating Multi-physics Environment (LIME) is a software package for creating multi-physics simulation codes. Its primary application space is coupling computer codes that already exist to solve different parts of a multi-physics problem. In this report we define a common domain language for discussing multi-physics coupling and describe the basic theory associated with the multi-physics coupling algorithms to be supported in LIME. We provide an assessment of coupling techniques for both steady-state and time-dependent coupled systems, and we demonstrate example couplings.
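The report itself develops the coupling theory; as a generic, hedged illustration of the kind of fixed-point (Picard) iteration used to couple two existing single-physics solvers at steady state (the stand-in solver functions and tolerance below are hypothetical, not LIME interfaces):

```python
# Minimal sketch of a Picard (fixed-point) coupling loop between two
# single-physics solvers. solve_physics_a/b are stand-ins for existing codes.
import numpy as np

def solve_physics_a(u_b):
    """Stand-in for physics code A; returns its field given B's field."""
    return 0.5 * u_b + 1.0

def solve_physics_b(u_a):
    """Stand-in for physics code B; returns its field given A's field."""
    return 0.25 * u_a

def picard_coupling(tol=1e-8, max_iter=100):
    u_a, u_b = np.zeros(1), np.zeros(1)
    for k in range(max_iter):
        u_a_new = solve_physics_a(u_b)
        u_b_new = solve_physics_b(u_a_new)
        resid = max(np.abs(u_a_new - u_a).max(), np.abs(u_b_new - u_b).max())
        u_a, u_b = u_a_new, u_b_new
        if resid < tol:
            return u_a, u_b, k + 1
    raise RuntimeError("coupled iteration did not converge")

print(picard_coupling())   # converges to u_a = 8/7, u_b = 2/7
```

Tighter (Newton-like) couplings converge faster but require sensitivity information from each code; the fixed-point form above needs only the ability to call each solver repeatedly, which is the situation the abstract describes.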
The development and operational sustainment of renewable (geothermal) and non-renewable (fossil fuel) energy resources will be accompanied by increasingly high cost factors: exploration and site preparation, operational maintenance, and repair. Increased government oversight in the wake of the Gulf oil spill will only add to the cost burden. It is important to understand that downhole conditions are not just about elevated temperatures. It is often assumed that military electronics are exposed to the upper limit of extreme service environments; in fact, probably the harshest of all service conditions for electronics and electrical equipment are those in oil, gas, and geothermal wells. From the technology perspective, advanced materials, sensors, and microelectronic devices are beneficial to the exploration and sustainment of energy resources, especially in terms of lower costs. Besides the need for the science that creates these breakthroughs, there is also a need for sustained engineering development and testing. Downhole oil, gas, and geothermal well applications can have a wide range of environments and reliability requirements: temperature, pressure, vibration, corrosion, and service duration. All too frequently, these conditions are not well-defined because the application is simply labeled 'high temperature'. This ambiguity is problematic when the investigation turns to new approaches for electronic packaging solutions. The objective is to develop harsh-environment electronic packaging that meets customer requirements of cost, performance, and reliability. There are a number of challenges: (1) materials sets - solder alloys, substrate materials; (2) manufacturing processes - low to middle volumes, low defect counts, new equipment technologies; and (3) reliability testing - requirements documents, test methods and modeling, relevant standards documents. The cost to develop and sustain renewable and non-renewable energy resources will continue to escalate within the industry. Downhole electronics can provide a very cost-effective approach for well exploration and sustainment (data logging). However, the harsh environments are a 'game-changer' in terms of defining the materials, assembly processes, and long-term reliability of downhole electronic systems. The system-level approach will enable the integration of each of these contributors - materials, processes, and reliability - in order to deliver cost-effective electronics that meet customer requirements.
Sandia National Laboratories (SNL) developed the Integrated Verification System Evaluation Model (IVSEM) to estimate the performance of the International Monitoring System (IMS) operated by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). IVSEM was developed in several phases between 1995 and 2000. The model was written in FORTRAN with an IDL-based user interface and was compiled for Windows and UNIX operating systems. Continuing interest in this analysis capability, coupled with numerous advances in desktop computer hardware and software since IVSEM was written, enabled significant improvements to IVSEM run-time performance and data analysis capabilities. These improvements were implemented externally, without modifying the previously verified FORTRAN executables. This paper describes the parallelization approach developed to significantly reduce IVSEM run-times and the new test setup and analysis tools developed to facilitate better IVSEM operation.
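The paper documents the actual approach; purely as a hedged sketch of what external parallelization of an unmodified, verified executable can look like (the executable name, input-file naming, and worker count below are hypothetical placeholders, not IVSEM specifics):

```python
# Sketch of external parallelization: launch many independent runs of an
# unmodified, previously verified executable concurrently. The executable
# name, input-file naming, and worker count are hypothetical placeholders.
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_case(case_id):
    """Run one model case in its own process and return its exit code."""
    result = subprocess.run(
        ["./ivsem.exe", f"case_{case_id:03d}.inp"],  # hypothetical invocation
        capture_output=True, text=True)
    return case_id, result.returncode

if __name__ == "__main__":
    cases = range(1, 17)                            # 16 independent scenarios
    with ThreadPoolExecutor(max_workers=4) as pool:  # 4 concurrent runs
        for case_id, rc in pool.map(run_case, cases):
            print(f"case {case_id:03d} exited with code {rc}")
```

Because the parallelism lives entirely in the wrapper, the verified executable itself never changes, which is the key constraint stated in the abstract.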
This document summarizes the work done in our three-year LDRD project titled 'Physics of Intense, High Energy Radiation Effects.' The LDRD focused on the electrical effects of ionizing radiation at high dose rates. One major thrust throughout the project has been the radiation-induced conductivity (RIC) produced by the ionizing radiation. Another important consideration has been the electrical effect of dose-enhanced radiation; this transient effect can produce an electromagnetic pulse (EMP). The unifying theme of the project has been the dielectric function, a quantity that contains much of the physics covered in this project. For example, the work on transient electrical effects in RIC has been a key focus for the work on EMP effects. This physics is contained in the dielectric function, which can also be expressed as a conductivity. The transient defects created during a radiation event are, in principle, also contained within it. The energy loss that creates hot electrons and holes is given by the stopping power of the ionizing radiation, which in turn is given by the inverse dielectric function. Finally, the short-time atomistic phenomena caused by ionizing radiation can also be considered to be contained within the dielectric function. During the LDRD, meetings about the work were held every week. These discussions involved theorists, experimentalists, and engineers, and they branched out into the work done in other projects. For example, the work on EMP effects had influence on another project focused on such phenomena in gases. Furthermore, the physics of radiation detectors and radiation dosimeters was often discussed, and these discussions had impact on related projects. Some LDRD-related documents are now stored on a SharePoint site (https://sharepoint.sandia.gov/sites/LDRD-REMS/default.aspx). In the remainder of this document the work is described in categories, but there is much overlap between the atomistic calculations, the continuum calculations, and the experiments.
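For reference, the standard textbook relations behind these statements can be written compactly; the following block is a generic illustration in SI units, not a result of this project:

```latex
% Relative dielectric function with the conductivity folded in, and the
% energy-loss function that governs the stopping power of fast charged
% particles; both are standard relations of dielectric response theory.
\begin{align}
  \varepsilon_r(\omega) &= \varepsilon_b(\omega)
      + \frac{i\,\sigma(\omega)}{\varepsilon_0\,\omega}
      && \text{(conductivity as part of the dielectric response)} \\[4pt]
  -\frac{dE}{dx} &\propto \int \frac{dq}{q}
      \int d\omega\, \omega\,
      \operatorname{Im}\!\left[\frac{-1}{\varepsilon(q,\omega)}\right]
      && \text{(loss function sets the stopping power)}
\end{align}
```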
This report describes our evaluation of the T-Plan Integrator software application as it was used to transfer a real data set from the Teamcenter for Systems Engineering (TcSE) software application to the DOORS software application. The T-Plan Integrator was evaluated to determine if it would meet the needs of Sandia National Laboratories to migrate our existing data sets from TcSE to DOORS. This report presents the challenges encountered in migrating the data and focuses on how the Integrator can be used to map a data set and its data architecture from TcSE to DOORS. Finally, this report describes how the bulk of the migration can take place using the Integrator; however, about 20-30% of the data would need to be transferred from TcSE to DOORS manually. This report does not evaluate the transfer of data from DOORS to TcSE.
'Process for Selecting Engineering Tools' outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature, and users can apply it to select most engineering tools and software applications.
An important part of velocimetry analysis is the recovery of a known velocity history from simulated data signals. The fringen program synthesizes VISAR (Velocity Interferometer System for Any Reflector) and PDV (Photonic Doppler Velocimetry, also known as heterodyne velocimetry) signals, given a specified velocity history, using exact formulations for the optical signal. Time-dependent light conditions, non-ideal measurement conditions, and various diagnostic limitations (noise, etc.) may be incorporated into the simulated signals. This report describes the fringen program, which performs forward VISAR and PDV analysis. Nearly all effects that might occur in a VISAR/PDV measurement of a single velocity can be modeled by fringen. The program operates in MATLAB, either within a graphical interface or as a user-callable function. The current stable version of fringen is 0.3, released in October 2010. The following sections describe the operation and use of fringen: Section 2 gives a brief overview of VISAR and PDV synthesis, Section 3 illustrates the graphical and console interfaces of fringen, Section 4 presents several example uses of the program, and Section 5 summarizes program capabilities and discusses potential future work.
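fringen itself is a MATLAB code with far broader capability; the following fragment is only a minimal, assumed illustration of the forward-synthesis idea for an ideal PDV signal (the wavelength, record length, and velocity history are made up for the example):

```python
# Illustrative (not fringen itself) synthesis of an ideal homodyne PDV beat
# signal from a prescribed velocity history: the optical phase advances by
# 4*pi/lambda times the accumulated displacement, so the instantaneous beat
# frequency is 2*v(t)/lambda. All numbers below are assumptions.
import numpy as np

LAMBDA = 1550e-9                       # m, typical PDV laser wavelength
t = np.linspace(0.0, 100e-9, 200_001)  # 100 ns record, 0.5 ps sampling

# Assumed velocity history: at rest, then a smooth jump to 500 m/s at 30 ns.
velocity = 500.0 * 0.5 * (1.0 + np.tanh((t - 30e-9) / 2e-9))

displacement = np.cumsum(velocity) * (t[1] - t[0])   # integrate v(t)
phase = 4.0 * np.pi * displacement / LAMBDA          # homodyne PDV phase
signal = 1.0 + np.cos(phase)                         # ideal detector output

# The beat frequency 2*v/lambda reaches about 0.65 GHz at 500 m/s.
print(f"final beat frequency: {2 * velocity[-1] / LAMBDA / 1e9:.2f} GHz")
```

Non-ideal effects of the kind fringen handles (time-dependent light level, noise, bandwidth limits) would be applied on top of this ideal signal.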
A multifunctional reactor is a chemical engineering device that exploits enhanced heat and mass transfer to promote production of a desired chemical, combining more than one unit operation in a single system. The main component of the reactor system under study here is a vertical column containing packing material through which liquid(s) and gas flow cocurrently downward. Under certain conditions, a range of hydrodynamic regimes can be achieved within the column that can either enhance or inhibit a desired chemical reaction. To study such reactors in a controlled laboratory environment, two experimental facilities were constructed at Sandia National Laboratories. One experiment, referred to as the Two-Phase Experiment, operates with two phases (air and water). The second experiment, referred to as the Three-Phase Experiment, operates with three phases (an immiscible organic liquid, an aqueous liquid, and nitrogen). This report describes the motivation, design, construction, operational hazards, and operation of both of these experiments. Data and conclusions are included.
The convergence of micro-/nano-electromechanical systems (MEMS/NEMS) and biomedical industries is creating a need for innovation and discovery around materials, particularly in miniaturized systems that use polymers as the primary substrate. Polymers are ubiquitous in the microelectronics industry and are used as sensing materials, lithography tools, replication molds, microfluidics, nanofluidics, and biomedical devices. This diverse set of operational requirements dictates that the materials employed must possess different properties in order to reduce the cost of production, decrease the scale of devices to the appropriate degree, and generate engineered devices with new functional properties at cost-competitive levels of production. Nanoscale control of polymer deformation at a massive scale would enable breakthroughs in all of the aforementioned applications, but is beyond the current capabilities of mass manufacturing. This project focused on developing a fundamental understanding of how polymers behave under different loads and environments at the nanoscale, in terms of performance and fidelity, in order to fill the most critical gaps in our current knowledge base on this topic.