PyTrilinos is a set of Python interfaces to compiled Trilinos packages. This collection supports serial and parallel dense linear algebra, serial and parallel sparse linear algebra, direct and iterative linear solution techniques, algebraic and multilevel preconditioners, nonlinear solvers and continuation algorithms, eigensolvers and partitioning algorithms. Also included are a variety of related utility functions and classes, including distributed I/O, coloring algorithms and matrix generation. PyTrilinos vector objects are compatible with the popular NumPy Python package. As a Python front end to compiled libraries, PyTrilinos takes advantage of the flexibility and ease of use of Python, and the efficiency of the underlying C++, C and Fortran numerical kernels. This paper covers recent, previously unpublished advances in the PyTrilinos package.
ASME 2012 6th International Conference on Energy Sustainability, ES 2012, Collocated with the ASME 2012 10th International Conference on Fuel Cell Science, Engineering and Technology
Coupled thermal-mechanical, three-dimensional, finite-element analyses were used to evaluate generic design concepts for a repository in salt, for spent nuclear fuel and high-level waste. This work used heat generation by spent nuclear fuel (SNF) typical of that presently stored at reactor sites in the U.S. For waste packages containing 4-PWR SNF assemblies, the results show peak temperatures within previously identified ranges acceptable for salt media. Peak temperatures and maximum backfill consolidation occur at the package-salt interface. Significant consolidation of the backfill, and closure of the mined opening, is projected to continue after peak temperatures are realized. For larger 21-PWR SNF packages, the peak temperature could locally approach 450°C, or lower depending on the aging history of the fuel. This ongoing study suggests the feasibility of an SNF management strategy using decay storage and larger (e.g., 21-PWR) waste packages.
Hydrocarbon polymers, foams and nanocomposites are increasingly being subjected to extreme environments. Molecular scale modeling of these materials offers insight into failure mechanisms and complex response. Classical molecular dynamics (MD) simulations of the principal shock Hugoniot were conducted for two hydrocarbon polymers, polyethylene (PE) and poly(4-methyl-1-pentene) (PMP). We compare these results with recent density functional theory (DFT) calculations and experiments conducted at Sandia National Laboratories. Here, we extend these results to include low-density polymer foams using nonequilibrium MD techniques. We find good quantitative agreement with experiment. Further, we have measured local temperatures to investigate the formation of hot spots and polymer dissociation near foam voids.
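The Hugoniot states referred to above follow from the Rankine-Hugoniot jump conditions once a linear shock-velocity fit Us = c0 + s·up is in hand. The sketch below shows that reduction; the c0, s, and rho0 values are illustrative placeholders, not the fitted parameters from this study.

```python
# Rankine-Hugoniot jump conditions for a single steady shock.
# Given a linear fit Us = c0 + s*up, each particle velocity up
# maps to one pressure/density state on the principal Hugoniot.
# c0, s, rho0 below are illustrative, not values from the paper.

def hugoniot_state(up, c0=2.9, s=1.5, rho0=0.95):
    """Return (Us, P, rho) in km/s, GPa, g/cm^3 for particle velocity up (km/s)."""
    us = c0 + s * up                 # linear Us-up relation
    p = rho0 * us * up               # momentum jump: P = rho0 * Us * up
    rho = rho0 * us / (us - up)      # mass jump: rho0 * Us = rho * (Us - up)
    return us, p, rho

if __name__ == "__main__":
    for up in (0.5, 1.0, 2.0):
        us, p, rho = hugoniot_state(up)
        print(f"up={up:4.1f} km/s  Us={us:5.2f} km/s  P={p:6.2f} GPa  rho={rho:5.2f} g/cc")
```

The same relations are what both the MD piston simulations and the plate-impact experiments are ultimately compared against.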
Unidirectional carbon fiber reinforced epoxy composite samples were tested to determine the response to one-dimensional shock loading. The material tested had high fiber content (68% by volume) and low porosity. Wave speeds for shocks traveling along the carbon fibers are significantly higher than for those traveling transverse to the fibers or through the bulk epoxy. As a result, the dynamic material response is dependent on the relative shock-fiber orientation. Shocks traveling along the fiber direction in uniaxial samples travel faster and exhibit both elastic and plastic characteristics over the stress range tested, up to 15 GPa. Results detail the anisotropic material response, which is governed by different mechanisms along each of the two principal directions in the composite.
The longitudinal merging of wave packets and turbulent spots in a hypersonic boundary layer was studied on the nozzle wall of the Boeing/AFOSR Mach-6 Quiet Tunnel. Two pulsed glow perturbations were created in rapid succession to generate two closely spaced disturbances. The time between the perturbations was varied from run to run to simulate longitudinal merging. Preliminary results suggest that the growth of the trailing disturbance is suppressed by the presence of the leading disturbance. Conversely, the core of the leading disturbance appears unaffected by the presence of the trailing disturbance and behaves as if isolated. This result is consistent with low-speed studies as well as DNS computations of longitudinal merging. However, the present results may be influenced by the perturber performance, and therefore further studies of longitudinal merging are necessary to confirm the effect on the internal pressure structure of the interacting disturbances.
Impact is a phenomenon that is ubiquitous in mechanical design; however, the modeling of impacts in complex systems is often a simplified, imprecise process. In high fidelity finite element simulations, the number of elements required to accurately model the constitutive properties of an impact event is impractical. Consequently, rigid body dynamics with approximate representations of the impact dynamics are commonly used. These approximations can include a constant coefficient of restitution, penalty stiffness, or single degree of freedom constitutive model for the impact dynamics that is specific to the type of materials involved (elastic-plastic, viscoelastic, etc.). In order to understand the effect of the impact model on the system's dynamics, simulations investigate single degree of freedom and two degrees of freedom systems with rigid stops limiting the amplitude of vibration. Five contact models are considered: a coefficient of restitution, penalty stiffness, two similar elastic-plastic constitutive models, and a dissimilar elastic-plastic constitutive model. Frequency sweeps show that simplified contact models can lead to incorrect assessments of the system's dynamics and stability, which can significantly affect the prediction of wear and damage in the system.
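The simplest of the five contact models above, the coefficient of restitution, can be sketched for a single degree of freedom oscillator with a rigid stop. All parameter values here are illustrative, not the paper's, and the integrator is a bare semi-implicit Euler scheme rather than whatever the authors used.

```python
import math

def sim_impact_oscillator(e=0.7, wn=1.0, zeta=0.01, gap=0.5,
                          forcing=1.0, w=1.2, dt=1e-4, t_end=50.0):
    """SDOF oscillator x'' + 2*zeta*wn*x' + wn^2*x = forcing*sin(w*t)
    with a rigid stop at x = gap. Contact is modeled by a coefficient
    of restitution e: on impact the velocity is reversed and scaled."""
    x, v, t = 0.0, 0.0, 0.0
    impacts = 0
    while t < t_end:
        a = forcing * math.sin(w * t) - 2 * zeta * wn * v - wn**2 * x
        v += a * dt                  # semi-implicit Euler step
        x += v * dt
        if x >= gap and v > 0:       # impact with the stop
            x = gap
            v = -e * v               # restitution: kinetic energy lost
            impacts += 1
        t += dt
    return impacts

if __name__ == "__main__":
    print("impacts in sweep window:", sim_impact_oscillator())
```

Swapping the `if x >= gap` branch for a penalty spring or an elastic-plastic force law is exactly the kind of model substitution whose consequences the frequency sweeps in the paper examine.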
Christopher J. Orendorff shares his views on the role of separators in lithium-ion cell safety. One of the most critically important cell components to ensure cell safety is the separator, which is a thin porous membrane that physically separates the anode and cathode. The main function of the separator is to prevent physical contact between the anode and cathode, while facilitating ion transport in the cell. The challenge with designing safe battery separators is the trade-off between mechanical robustness and porosity/transport properties. Most commercially available nonaqueous lithium-ion separators designed for small batteries are single-layer membranes made of polyolefins. Many of the multilayer separators are designed with a shutdown feature where two of the layers have different phase transition temperatures. As the temperature of a cell increases, the lower melting component melts and fills the pores of the other solid layer, stopping ion transport and current flow in the cell.
This paper provides an analysis of high-pressure phenomena and their potential effects on the fundamental physics of fuel injection in Diesel engines. We focus on conditions when cylinder pressures exceed the thermodynamic critical pressure of the injected fuel and describe the major differences that occur in the jet dynamics compared to those described by classical spray theory. To facilitate the analysis, we present a detailed model framework based on the Large Eddy Simulation (LES) technique that is designed to account for key high-pressure phenomena. Using this framework, we perform a detailed analysis using the experimental data posted as part of the Engine Combustion Network (see www.sandia.gov/ECN): namely the "Baseline n-heptane" and "Spray-A (n-dodecane)" cases, which are designed to emulate conditions typically observed in Diesel engines. Calculations are performed by rigorously treating the experimental geometry, operating conditions and relevant thermo-physical gas-liquid mixture properties. Results are further processed using linear gradient theory, which facilitates calculations of detailed vapor-liquid interfacial structures, and compared with the high-speed imaging data. Analysis of the data reveals that fuel enters the chamber as a compressed liquid and is heated at supercritical pressure. Further analysis suggests that, at certain conditions studied here, the classical view of spray atomization as an appropriate model is questionable. Instead, nonideal real-fluid behavior must be taken into account using a multicomponent formulation that applies to arbitrary hydrocarbon mixtures at high-pressure supercritical conditions.
While DNA sequencing technology is advancing at an unprecedented rate, sample preparation technology still relies primarily on manual bench-top processes, which often can be slow, labor-intensive, inefficient, or inconsistent. To address these disadvantages, we developed an integrated microfluidic platform for automated preparation of DNA libraries for next generation sequencing. This sample-to-answer system has great potential for rapid characterization of novel and emerging pathogens from clinical samples.
Advances in Cognitive Engineering and Neuroergonomics
Liao, Huafei; Bone, Alysia; Coyne, Kevin; Forester, John
Human reliability analysis (HRA) is used in the context of probabilistic risk assessment (PRA) to provide risk information regarding human performance to support risk-informed decision-making with respect to high-reliability industries. In the current state of the art of HRA, variability in HRA results is still a significant issue, which in turn contributes to uncertainty in PRA results. The existence and use of different HRA methods that rely on different assumptions, human performance frameworks, quantification algorithms, and data, as well as inconsistent implementation by analysts, appear to be the most common sources of this variability, and the issue has raised concerns over the robustness of HRA methods. In two large-scale empirical studies (Bye et al., 2012; Forester et al., 2012), the Accident Sequence Evaluation Program (ASEP) HRA Procedure, along with other HRA methods, was used to obtain HRA predictions for the human failure events (HFEs) in accident scenarios. The predictions were then compared with empirical crew performance data from nuclear power plant (NPP) simulators by independent assessors to examine the reasonableness of the predictions. This paper first provides a brief overview of the study methodology and results, and then discusses the study findings with respect to ASEP and their implications in the context of challenges to HRA in general.
International Defense and Homeland Security Simulation Workshop, DHSS 2012, Held at the International Multidisciplinary Modeling and Simulation Multiconference, I3M 2012
The Advanced Concepts Group at Sandia National Laboratories and the Consortium for Science, Policy and Outcomes at Arizona State University convened a workshop in May 2006 to explore the potential policy implications of technologies that might enhance human cognitive abilities. The group's deliberations sought to identify core values and concerns raised by the prospect of cognitive enhancement. The workshop focused on the policy implications of various prospective cognitive enhancements and on the technologies (nanotechnology, biotechnology, information technology, and cognitive science) that enable them. The prospect of rapidly emerging technological capabilities to enhance human cognition makes urgent a daunting array of questions, tensions, ambitions, and concerns. The workshop elicited dilemmas and concerns in ten overlapping areas: science and democracy; equity and justice; freedom and control; intergenerational issues; ethics and competition; individual and community rights; speed and deliberations; ethical uncertainty; humanness; and sociocultural risk. We identified four different perspectives to encompass the diverse issues related to the emergence of cognitive enhancement technologies: (1) Laissez-faire: emphasizes the freedom of individuals to seek and employ enhancement technologies based on their own judgment; (2) Managed technological optimism: believes that while these technologies promise great benefits, such benefits cannot emerge without an active government role; (3) Managed technological skepticism: views the quality of life as arising more out of society's institutions than its technologies; and (4) Human essentialism: starts with the notion of a human essence (whether God-given or evolutionary in origin) that should not be modified.
While the perspectives differ significantly about both human nature and the role of government, each encompasses a belief in the value of transparency and reliable information that can allow public discussion and decisions about cognitive enhancement. The practical question is how to foster productive discussions in a society whose attention is notably fragmented and priorities notably diverse. The question of what to talk about remains central, as each of the four perspectives is concerned about different things. Perhaps the key issue for initial clarification as a condition for productive democratic discussion has to do with the intended goals of cognitive enhancement, and the mechanisms for allowing productive deliberation about these goals.
Over the last several years, there has been considerable growth in camera based observation systems for a variety of safety, scientific, and recreational applications. In order to improve the effectiveness of these systems, we frequently desire the ability to increase the number of observed objects, but solving this problem is not as simple as adding more cameras. Quite often, there are economic or physical restrictions that prevent us from adding additional cameras to the system. As a result, we require methods that coordinate the tracking of objects between multiple cameras in an optimal way. In order to accomplish this goal, we present a new cooperative control algorithm for a camera based observational system. Specifically, we present a receding horizon control where we model the underlying optimal control problem as a mixed integer linear program. The benefit of this design is that we can coordinate the actions between each camera while simultaneously respecting its kinematics. In addition, we further improve the quality of our solution by coupling our algorithm with a Kalman filter. Through this integration, we not only add a predictive component to our control, but we use the uncertainty estimates provided by the filter to encourage the system to periodically observe any outliers in the observed area. This combined approach allows us to intelligently observe the entire region of interest in an effective and thorough manner.
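The Kalman filter's role in the scheme above is to supply per-object uncertainty estimates that steer cameras back toward neglected targets. A minimal constant-velocity filter for one tracked object is sketched below; the state model, noise levels, and dimensions are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# Constant-velocity Kalman filter for one tracked object in 1-D.
# The growth of the covariance trace while the object goes
# unobserved is the signal a camera scheduler can use to revisit
# it. All matrices and noise levels here are illustrative.

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
H = np.array([[1.0, 0.0]])              # only position is measured
Q = 0.01 * np.eye(2)                    # process noise covariance
R = np.array([[0.5]])                   # measurement noise covariance

def predict(x, P):
    """Time update: propagate state and grow the covariance."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Measurement update: fuse an observation z, shrinking P."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

if __name__ == "__main__":
    x, P = np.zeros((2, 1)), np.eye(2)
    for _ in range(5):                  # unobserved: uncertainty grows
        x, P = predict(x, P)
    print("trace before observation:", np.trace(P))
    x, P = update(x, P, np.array([[0.3]]))
    print("trace after observation: ", np.trace(P))
```

In the paper's framing, a trace (or similar scalar) of each object's covariance would enter the MILP objective so that long-unobserved outliers are periodically re-acquired.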
Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia's capabilities to support engineering sciences. This capability is based on amending experimental data with information gained from computational investigations, in parts of the phase space where experimental data are hard, dangerous, or expensive to obtain. A prominent materials area where such computational investigations are hard to perform today because of limited accuracy is actinide and lanthanide materials. The Science of Extreme Environment Lab Directed Research and Development project described in this report has had the aim of curing this accuracy problem. We have focused on the two major factors which would allow for accurate computational investigations of actinide and lanthanide materials: (1) the fully relativistic treatment needed for materials containing heavy atoms, and (2) the needed improved performance of DFT exchange-correlation functionals. We have implemented a fully relativistic treatment based on the Dirac equation into the LANL code RSPt, and we have shown that such a treatment is imperative when calculating properties of materials containing actinides and/or lanthanides. The present standard treatment, which only includes some of the relativistic terms, is not accurate enough and can even give misleading results. Compared to calculations previously considered state of the art, the Dirac treatment gives a substantial change in equilibrium volume predictions for materials with large spin-orbit coupling. For actinide and lanthanide materials, a Dirac treatment is thus a fundamental requirement in any computational investigation, including those for DFT-based EOS construction. For a full capability, a DFT functional capable of describing strongly correlated systems such as actinide materials needs to be developed. Using the previously successful subsystem functional scheme developed by Mattsson et al., we have created such a functional.
In this functional, the harmonic oscillator gas provides the necessary reference system for the strong correlation and localization occurring in actinides. Preliminary testing shows that the new Hao-Armiento-Mattsson (HAM) functional gives a trend toward improved results for the crystalline copper oxide test system we have chosen. This test system exhibits the same exchange-correlation physics as the actinide systems do, but without the relativistic effects, giving access to a pure testing ground for functionals. Important insights have been gained during this work. An example is that currently available functionals, contrary to common belief, make large errors in so-called hybridization regions where electrons from different ions interact and form new states. Together with the new understanding of functional issues, the Dirac implementation in the RSPt code will permit us to gain more fundamental understanding, both quantitatively and qualitatively, of materials of importance for Sandia and the rest of the Nuclear Weapons complex.
This report documents the current progress in the design, implementation, and validation of the signal conditioning circuitry used in a measurement instrumentation system. The input of the signal conditioning circuitry comes from a piezoresistive transducer, and the output will be fed to a 250 ksps, 12-bit analog-to-digital converter (ADC) with an input range of 0-5 V. It is assumed that the maximum differential voltage amplitude input from the sensor is 20 mV with an unknown, but presumably high, sensor bandwidth. This text focuses on a specific design; however, the theory is presented in such a way that this text can be used as a basis for future designs.
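The numbers in the abstract fix the required gain and the achievable resolution of the chain. The back-of-envelope check below assumes a mid-scale-offset topology (the bipolar ±20 mV signal is shifted to 2.5 V), which is an illustrative assumption rather than the report's actual circuit.

```python
# Gain/resolution check for a +/-20 mV differential sensor signal
# amplified into a 0-5 V, 12-bit, 250 ksps ADC. The mid-scale
# offset topology is an assumption for illustration only.

V_FS = 5.0          # ADC full-scale input range (V)
N_BITS = 12         # ADC resolution
V_SENSOR = 20e-3    # max differential amplitude from the transducer (V)

# With the signal offset to mid-scale (2.5 V), +/-20 mV must span
# +/-2.5 V at the ADC input:
gain = (V_FS / 2) / V_SENSOR     # required closed-loop gain (V/V)
lsb = V_FS / 2**N_BITS           # ADC code width at the ADC input (V)
lsb_in = lsb / gain              # same LSB referred back to the sensor (V)

print(f"gain = {gain:.0f} V/V")
print(f"ADC LSB = {lsb * 1e3:.3f} mV, input-referred = {lsb_in * 1e6:.2f} uV")
```

A gain of 125 V/V and a roughly 10 µV input-referred code width set the noise and offset budget the amplifier stage must meet.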
This Lab-Directed Research and Development (LDRD) sought to develop technology that enhances scenario construction speed, entity behavior robustness, and scalability in Live-Virtual-Constructive (LVC) simulation. We investigated issues in both simulation architecture and behavior modeling. We developed path-planning technology that improves the ability to express intent in the planning task while still permitting an efficient search algorithm. An LVC simulation demonstrated how this enables 'one-click' layout of squad tactical paths, as well as dynamic re-planning for simulated squads and for real and simulated mobile robots. We identified human response latencies that can be exploited in parallel/distributed architectures. We did an experimental study to determine where parallelization would be productive in Umbra-based force-on-force (FOF) simulations. We developed and implemented a data-driven simulation composition approach that solves entity class hierarchy issues and supports assurance of simulation fairness. Finally, we proposed a flexible framework to enable integration of multiple behavior modeling components that model working memory phenomena with different degrees of sophistication.
This document reports on the research of Kenneth Letendre, the recipient of a Sandia Graduate Research Fellowship at the University of New Mexico. Warfare is an extreme form of intergroup competition in which individuals make extreme sacrifices for the benefit of their nation or other group to which they belong. Among animals, limited, non-lethal competition is the norm. It is not fully understood what factors lead to warfare. We studied the global variation in the frequency of civil conflict among countries of the world, and its positive association with variation in the intensity of infectious disease. We demonstrated that the burden of human infectious disease importantly predicts the frequency of civil conflict and tested a causal model for this association based on the parasite-stress theory of sociality. We also investigated the organization of social foraging by colonies of harvester ants in the genus Pogonomyrmex, using both field studies and computer models.
A lightning flash consists of multiple, high-amplitude but short duration return strokes. Between the return strokes is a lower amplitude, continuing current which flows for longer duration. If the walls of a Faraday cage are made of thin enough metal, the continuing current can melt a hole through the metal in a process called burnthrough. A subsequent return stroke can couple energy through this newly-formed hole. This LDRD is a study of the protection provided by a Faraday cage when it has been compromised by burnthrough. We initially repeated some previous experiments and expanded on them in terms of scope and diagnostics to form a knowledge baseline of the coupling phenomena. We then used a combination of experiment, analysis and numerical modeling to study four coupling mechanisms: indirect electric field coupling, indirect magnetic field coupling, conduction through plasma and breakdown through the hole. We discovered voltages higher than those encountered in the previous set of experiments (on the order of several hundreds of volts).
Since May 2010, we have been recording continuous seismic data at Sandia's FACT site. The collected signals provide us with a realistic archive for testing algorithms under development for local monitoring of explosive testing. Numerous small explosive tests are routinely conducted around Kirtland AFB by different organizations. Our goal is to identify effective methods for distinguishing these events from normal daily activity on and near the base, such as vehicles, aircraft, and storms. In this report, we describe the recording system, and present some observations of the varying ambient noise conditions at FACT. We present examples of various common, non-explosive, sources. Next we show signals from several small explosions, and discuss their characteristic features.
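One standard first-pass screen for picking candidate events out of a continuous archive like the FACT recordings is an STA/LTA trigger; the abstract does not say this is the method used, so the sketch below is illustrative, with assumed window lengths and threshold.

```python
# Classic short-term-average / long-term-average (STA/LTA) trigger,
# a common first-pass screen for impulsive events in continuous
# seismic data. Window lengths and threshold are illustrative.

def sta_lta(signal, n_sta=5, n_lta=50):
    """Return the STA/LTA ratio of |signal| at each sample."""
    ratios = []
    for i in range(len(signal)):
        sta_win = signal[max(0, i - n_sta + 1): i + 1]
        lta_win = signal[max(0, i - n_lta + 1): i + 1]
        sta = sum(abs(s) for s in sta_win) / len(sta_win)
        lta = sum(abs(s) for s in lta_win) / len(lta_win)
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

if __name__ == "__main__":
    # quiet background with an impulsive "event" at sample 200
    trace = [0.1] * 400
    for i in range(200, 210):
        trace[i] = 5.0
    r = sta_lta(trace)
    triggered = [i for i, v in enumerate(r) if v > 3.0]
    print("first trigger at sample", triggered[0])
```

Distinguishing small explosions from vehicles, aircraft, and storms then requires features beyond the trigger itself (spectral content, duration, moveout across the array), which is the discrimination problem the report addresses.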
Fuller, Thomas F.; Bandhauer, Todd; Garimella, Srinivas
A fully coupled electrochemical and thermal model for lithium-ion batteries is developed to investigate the impact of different thermal management strategies on battery performance. In contrast to previous modeling efforts focused either exclusively on particle electrochemistry on the one hand or overall vehicle simulations on the other, the present work predicts local electrochemical reaction rates using temperature-dependent data on commercially available batteries designed for high rates (C/LiFePO4) in a computationally efficient manner. Simulation results show that conventional external cooling systems for these batteries, which have a low composite thermal conductivity (~1 W/m-K), cause either large temperature rises or internal temperature gradients. Thus, a novel, passive internal cooling system that uses heat removal through liquid-vapor phase change is developed. Although there have been prior investigations of phase change at the microscale, fluid flow at the conditions expected here is not well understood. A first-principles based cooling system performance model is developed and validated experimentally, and is integrated into the coupled electrochemical-thermal model for assessment of performance improvement relative to conventional thermal management strategies. The proposed cooling system passively removes heat almost isothermally with negligible thermal resistances between the heat source and cooling fluid. Thus, the minimization of peak temperatures and gradients within batteries allows increased power and energy densities unencumbered by thermal limitations.
We report the development of new experimental capabilities and ab initio modeling for real-time studies of Li-ion battery electrochemical reactions. We developed three capabilities for in-situ transmission electron microscopy (TEM) studies: a capability that uses a nanomanipulator inside the TEM to assemble electrochemical cells with ionic liquid or solid state electrolytes, a capability that uses on-chip assembly of battery components on to TEM-compatible multi-electrode arrays, and a capability that uses a TEM-compatible sealed electrochemical cell that we developed for performing in-situ TEM using volatile battery electrolytes. These capabilities were used to understand lithiation mechanisms in nanoscale battery materials, including SnO2, Si, Ge, Al, ZnO, and MnO2. The modeling approaches used ab initio molecular dynamics to understand early stages of ethylene carbonate reduction on lithiated-graphite and lithium surfaces and constrained density functional theory to understand ethylene carbonate reduction on passivated electrode surfaces.
This document is a final report for the polyvinyl toluene (PVT) neutron-gamma (PVT-NG) project, which was sponsored by the Domestic Nuclear Detection Office (DNDO). The PVT-NG sensor uses PVT detectors for both gamma and neutron detection. The sensor exhibits excellent spectral resolution and gain stabilization, which are features that are beneficial for detection of both gamma-ray and neutron sources. In fact, the ability to perform isotope identification based on spectra that were measured by the PVT-NG sensor was demonstrated. As described in a previous report, the neutron sensitivity of the first version of the prototype was about 25% less than the DNDO requirement of 2.5 cps/ng for bare 252Cf. This document describes design modifications that were expected to improve the neutron sensitivity by about 50% relative to the PVT-NG prototype. However, the project was terminated before execution of the design modifications after portal vendors demonstrated other technologies that enable neutron detection without the use of 3He. Nevertheless, the PVT-NG sensor development demonstrated several performance goals that may be useful in future portal designs.
Uncertainty quantification (UQ) methods bring rigorous statistical connections to the analysis of computational and experiment data, and provide a basis for probabilistically assessing margins associated with safety and reliability. The DAKOTA toolkit developed at Sandia National Laboratories implements a number of UQ methods, which are being increasingly adopted by modeling and simulation teams to facilitate these analyses. This report disseminates results as to the performance of DAKOTA's stochastic expansion methods for UQ on a representative application. Our results provide a number of insights that may be of interest to future users of these methods, including the behavior of the methods in estimating responses at varying probability levels, and the expansion levels for the methodologies that may be needed to achieve convergence.
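The stochastic-expansion idea behind DAKOTA's polynomial chaos methods can be illustrated in one variable: evaluate the model at Gauss-Hermite quadrature points of a standard-normal input and recover response statistics from the weighted samples. The toy "model" below is a stand-in for a real simulation, and the sketch is not DAKOTA's implementation.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

# Non-intrusive stochastic expansion in one random variable:
# sample the model at Gauss-Hermite nodes of X ~ N(0,1) and
# recover moments. The lambda below is a toy stand-in model.

def pce_stats(f, order=8):
    """Mean and variance of f(X), X ~ N(0,1), via Gauss-Hermite quadrature."""
    x, w = hermegauss(order)     # nodes/weights for weight exp(-x^2 / 2)
    w = w / w.sum()              # normalize to a probability measure
    fx = f(x)
    mean = np.dot(w, fx)
    var = np.dot(w, (fx - mean) ** 2)
    return mean, var

if __name__ == "__main__":
    mean, var = pce_stats(lambda x: x**2)
    print(mean, var)             # exact answers for X^2 are 1 and 2
```

The "expansion levels needed to achieve convergence" question the report studies corresponds here to choosing `order` large enough that the recovered statistics stop changing.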
The Department of Defense (DoD) provides its standard for information assurance in its Instruction 8500.2, dated February 6, 2003. This Instruction lists 157 'IA Controls' for nine 'baseline IA levels.' Aside from distinguishing IA Controls that call for elevated levels of 'robustness' and grouping the IA Controls into eight 'subject areas,' 8500.2 does not examine the nature of this set of controls, determining, for example, which controls do not vary in robustness, how this set of controls compares with other such sets, or even which controls are required for all nine baseline IA levels. This report analyzes (1) the IA Controls, (2) the subject areas, and (3) the baseline IA levels. For example, this report notes that there are only 109 core IA Controls (which this report refers to as 'ICGs'), that 43 of these core IA Controls apply without variation to all nine baseline IA levels, and that an additional 31 apply with variations. This report maps the IA Controls of 8500.2 to the controls in NIST 800-53 and ITGI's CoBIT. The result of this analysis and mapping, as shown in this report, serves as a companion to 8500.2. (An electronic spreadsheet accompanies this report.)
Micro-Gas-Analyzers have many applications in detecting chemical compounds present in the air. MEMS valves are used to perform sampling of gases, as they enable control of fluid flow at the micro level. Current generation electrostatically actuated MEMS valves were tested to determine their ability to hold off a given gauge pressure with an applied voltage. Current valve designs were able to hold off 98 psi with only 82 V applied to the valves. The valves were determined to be 1.83 times more efficient than older valve designs, due to increasing the electrostatic area of the valve and trapping oxide between polysilicon layers. Newer valve designs were also proposed and modeled using ANSYS Multiphysics, and should be able to hold off 100 psi with only 29 V needed. This performance would be 2.82 times more efficient than current designs, or 5.17 times more efficient than older valve designs. This will be accomplished by further increasing the valve radius and decreasing the gap between the valve boss and electrode.
In this work, we demonstrated engineered modification of propagation of thermal phonons, i.e. at THz frequencies, using phononic crystals. This work combined theoretical work at Sandia National Laboratories, the University of New Mexico, the University of Colorado Boulder, and Carnegie Mellon University; the MESA fabrication facilities at Sandia; and the microfabrication facilities at UNM to produce world-leading control of phonon propagation in silicon at frequencies up to 3 THz. These efforts culminated in a dramatic reduction in the thermal conductivity of silicon using phononic crystals by a factor of almost 30 as compared with the bulk value, and about 6 as compared with an unpatterned slab of the same thickness.
The Ground-Based Monitoring R&E Component Evaluation project performs testing on the hardware components that make up seismic and infrasound monitoring systems. The majority of the testing is focused on the Digital Waveform Recorder (DWR), seismic sensor, and infrasound sensor. The software tool used to capture and analyze the data collected from testing is called TALENT: Test and Analysis Evaluation Tool. This document is the manual for using TALENT. Other reports document the testing procedures that are in place (Kromer, 2007) and the algorithms employed in the test analysis (Merchant, 2011).
Ceragenins were used to create biofouling-resistant water-treatment membranes. Ceragenins are synthetically produced antimicrobial peptide mimics that display broad-spectrum bactericidal activity. While ceragenins have been used on biomedical devices, use of ceragenins on water-treatment membranes is novel. Biofouling impacts membrane separation processes for many industrial applications such as desalination, wastewater treatment, oil and gas extraction, and power generation. Biofouling results in a loss of permeate flux and an increase in energy use. Creation of biofouling-resistant membranes will assist in the creation of clean water with lower energy usage, and energy with lower water usage. Five methods of attaching three different ceragenin molecules were conducted and tested. Biofouling reduction was observed in the majority of the tests, indicating the ceragenins are a viable solution to biofouling on water treatment membranes. Silane direct attachment appears to be the most promising attachment method if a high concentration of CSA-121a is used. Additional refinement of the attachment methods is needed in order to achieve our goal of a several-log reduction in biofilm cell density without impacting the membrane flux. Concurrently, biofilm-forming bacteria were isolated from source waters relevant for water treatment: wastewater, agricultural drainage, river water, seawater, and brackish groundwater. These isolates can be used for future testing of methods to control biofouling. Once isolated, the ability of the isolates to grow biofilms was tested with high-throughput multiwell methods. Based on these tests, the following species were selected for further testing in tube reactors and CDC reactors: Pseudomonas spp. (wastewater, agricultural drainage, and Colorado River water), Nocardia coeliaca or Rhodococcus spp.
(wastewater), Pseudomonas fluorescens and Hydrogenophaga palleronii (agricultural drainage), Sulfitobacter donghicola, Rhodococcus fascians, Rhodobacter katedanii, and Paracoccus marcusii (seawater), and Sphingopyxis spp. (groundwater). The testing demonstrated the ability of these isolates to be used for biofouling control testing under laboratory conditions. Biofilm forming bacteria were obtained from all the source water samples.
Significant deformation of thin films occurs when measuring thickness by mechanical means. This source of measurement error can lead to underestimating film thickness if proper corrections are not made. Analytical solutions exist for Hertzian contact deformation, but these solutions assume relatively large geometries. If the film being measured is thin, the analytical Hertzian assumptions are not appropriate. ANSYS is used to model the contact deformation of a 48 gauge Mylar film under bearing load, supported by a stiffer material. Simulation results are presented and compared to other correction estimates. Ideal, semi-infinite, and constrained properties of the film and the measurement tools are considered.
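The semi-infinite analytical baseline that the abstract says becomes inappropriate for thin films is the classical Hertz solution for a sphere pressed on a flat. A sketch of that baseline is below; the probe geometry, load, and material constants are illustrative assumptions, not the study's values.

```python
# Classical semi-infinite Hertz solution for a sphere on a flat:
# the analytical baseline that breaks down when the indented film
# is thin compared to the contact radius. Inputs are illustrative.

def hertz_indentation(force, radius, e1, nu1, e2, nu2):
    """Indentation depth (m) for a sphere of given radius (m) pressed
    with `force` (N) on a flat, from the Hertz contact solution:
    delta = (9 F^2 / (16 R E*^2))^(1/3)."""
    e_star = 1.0 / ((1 - nu1**2) / e1 + (1 - nu2**2) / e2)  # effective modulus
    return (9.0 * force**2 / (16.0 * radius * e_star**2)) ** (1.0 / 3.0)

if __name__ == "__main__":
    # e.g. a ~3 mm steel anvil on a Mylar-like film (E ~ 4 GPa), 1 N load
    d = hertz_indentation(1.0, 3e-3, 200e9, 0.3, 4e9, 0.38)
    print(f"predicted indentation ~ {d * 1e6:.2f} um")
```

An indentation of a micron or two against a film only ~12 µm thick illustrates why the semi-infinite assumption fails and finite-element corrections like the ANSYS model are needed.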