Publications

Results 94576–94600 of 99,299

FIDAP capabilities for solving problems with stiff chemistry

Torczynski, John R.

In support of the Motorola CRADA, the capabilities of the computational fluid dynamics code FIDAP (Fluid Dynamics International) for simulating problems involving fluid flow, heat transport, and chemical reactions have been assessed and enhanced as needed for semiconductor-processing applications (e.g. chemical vapor deposition). A novel method of treating surface chemical species that uses only pre-existing FIDAP commands is described and illustrated with test problems. A full-Jacobian treatment of the chemical reaction rate expressions during formation of the stiffness matrix has been implemented in FIDAP for both the Arrhenius-parameter and user-subroutine methods of specifying chemical reactions, where the Jacobian terms can be calculated analytically or numerically. This formulation is needed to obtain convergence when reaction rates become large compared to transport rates (stiff chemistry). Several test problems are analyzed, and in all cases this approach yields good convergence behavior, even for extremely stiff fluid-phase and surface reactions. A stiff segregated algorithm has been developed and implemented in FIDAP. Analysis of test problems indicates that this algorithm yields improved convergence behavior compared with the original segregated algorithm. This improved behavior enables segregated techniques to be applied to problems with stiff chemistry, as required for large three-dimensional multi-species problems.

Current limiters

Loescher, Douglas H.

The current that flows between the electrical test equipment and the nuclear explosive must be limited to safe levels during electrical tests conducted on nuclear explosives at the DOE Pantex facility. The safest way to limit the current is to use batteries that can provide only acceptably low current into a short circuit; unfortunately, this is not always possible. When it is not, current limiters, along with other design features, are used to limit the current. Three types of current limiters, the fuse blower, the resistor limiter, and the MOSFET-pass-transistor limiter, are used extensively in Pantex test equipment. Detailed failure mode and effects analyses were conducted on these limiters. Two other types of limiters were also analyzed. It was found that there is no single best type of limiter for all applications. The fuse blower has advantages when many circuits must be monitored, a low insertion voltage drop is important, and size and weight must be kept low. However, this limiter has many failure modes that can lead to the loss of overcurrent protection. The resistor limiter is simple and inexpensive but is normally usable only on circuits whose nominal current is less than a few tens of milliamperes. The MOSFET limiter can be used on high-current circuits, but it has a number of single-point failure modes that can lead to a loss of protective action. Because bad component placement or poor wire routing can defeat any limiter, placement and routing must be designed carefully and documented thoroughly.

Transient dynamics simulations: Parallel algorithms for contact detection and smoothed particle hydrodynamics

Hendrickson, Bruce A.

Transient dynamics simulations are commonly used to model phenomena such as car crashes, underwater explosions, and the response of shipping containers to high-speed impacts. Physical objects in such a simulation are typically represented by Lagrangian meshes because the meshes can move and deform with the objects as they undergo stress. Fluids (gasoline, water) or fluid-like materials (earth) in the simulation can be modeled using the techniques of smoothed particle hydrodynamics. Implementing a hybrid mesh/particle model on a massively parallel computer poses several difficult challenges. One challenge is to simultaneously parallelize and load-balance both the mesh and particle portions of the computation. A second challenge is to efficiently detect the contacts that occur within the deforming mesh and between mesh elements and particles as the simulation proceeds. These contacts impart forces to the mesh elements and particles which must be computed at each timestep to accurately capture the physics of interest. In this paper we describe new parallel algorithms for smoothed particle hydrodynamics and contact detection which turn out to have several key features in common. Additionally, we describe how to join the new algorithms with traditional parallel finite element techniques to create an integrated particle/mesh transient dynamics simulation. Our approach to this problem differs from previous work in that we use three different parallel decompositions, a static one for the finite element analysis and dynamic ones for particles and for contact detection. We have implemented our ideas in a parallel version of the transient dynamics code PRONTO-3D and present results for the code running on a large Intel Paragon.
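
The broad-phase idea behind contact detection can be sketched serially: bin points (mesh nodes or particles) into a uniform grid and compare only neighboring cells. This is a generic illustration, not PRONTO-3D's parallel algorithm, which adds a dynamic decomposition and load balancing; the function name and parameters are hypothetical.

```python
# Illustrative broad-phase contact search: a uniform-grid spatial hash.
# The serial core idea only; the paper's contribution is parallelizing this.
from collections import defaultdict
from itertools import product

def candidate_pairs(points, cell_size):
    """Return index pairs of points within roughly cell_size of each other."""
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(points):
        grid[(int(x // cell_size), int(y // cell_size), int(z // cell_size))].append(i)
    pairs = set()
    for (cx, cy, cz), members in grid.items():
        # Compare against this cell and its 26 neighbors.
        for dx, dy, dz in product((-1, 0, 1), repeat=3):
            for j in grid.get((cx + dx, cy + dy, cz + dz), ()):
                for i in members:
                    if i < j:
                        pairs.add((i, j))
    return pairs

points = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (5.0, 5.0, 5.0)]
print(candidate_pairs(points, cell_size=0.5))  # only the two nearby points pair up
```

A narrow phase would then test each candidate pair exactly; the parallel challenge the abstract describes is that these cells must be redistributed across processors as the mesh deforms.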

Information integrity and privacy for computerized medical patient records

Gallegos, Joselyne O.

Sandia National Laboratories and Oceania, Inc. entered into a Cooperative Research and Development Agreement (CRADA) in November 1993 to provide "Information Integrity and Privacy for Computerized Medical Patient Records" (CRADA No. SC93/01183). The main objective of the project was to develop information protection methods that are appropriate for databases of patient records in health information systems. This document describes the findings and alternative solutions that resulted from this CRADA.

PHASER 2.10 methodology for dependence, importance, and sensitivity: The role of scale factors, confidence factors, and extremes

Cooper, James A.

PHASER (Probabilistic Hybrid Analytical System Evaluation Routine) is a software tool that can incorporate subjective expert judgment into probabilistic safety analysis (PSA) along with conventional data inputs. An earlier report described the PHASER methodology but gave only a cursory explanation of how dependence was incorporated in Version 1.10 and of how "Importance" and "Sensitivity" measures were to be incorporated in Version 2.00. A more detailed description is given in this report. The basic concepts involve scale factors and confidence factors associated with stochastic variability and subjective uncertainty (common adjuncts used in PSA), and the safety risk extremes that are crucial to safety assessment. These are all used to illustrate methodology for incorporating dependence among analysis variables when generating PSA results, and for Importance and Sensitivity measures associated with the results that help point out where any major sources of safety concern arise and where any major sources of uncertainty reside, respectively.

A progress report on the LDRD project entitled "Microelectronic silicon-based chemical sensors: Ultradetection of high value molecules"

Hughes, Robert C.

This work addresses a new kind of silicon-based chemical sensor that combines the reliability and stability of silicon microelectronic field-effect devices with the highly selective and sensitive immunoassay. The sensor works on the principle that thin SiN layers on lightly doped Si can detect pH changes rapidly and reversibly. The pH changes affect the surface potential, which can be quickly determined by pulsed photovoltage measurements. To detect other species, chemically sensitive films were deposited on the SiN such that the presence of the chosen analyte results in pH changes through chemical reactions. An invention of a cell-sorting device based on these principles is also described. A new method of immobilizing enzymes using Sandia's sol-gel glasses is documented, and biosensors based on the silicon wafer and an amperometric technique are detailed.

The final LDRD report for the project entitled "Enhanced analysis of complex gas mixtures by pattern recognition of microsensor array signals"

Hughes, Robert C.

Microsensors do not have the selectivity to chemical species available in large laboratory instruments. This project employed arrays of catalytically gated silicon microsensors with different catalysts to create data streams that can be analyzed by pattern recognition programs. One of the most significant accomplishments of the program was the demonstration that mixtures of H₂ with the oxidants NOₓ and O₂ could be distinguished from one another by the use of different catalytic metals on the Sandia Robust Hydrogen (SRH) sensors and the newly developed pattern recognition algorithm. This sensor system could be used to identify explosive gas mixtures and analyze exhaust streams for pollution control.

MPSalsa: a finite element computer program for reacting flow problems. Part 2 - user's guide

Salinger, Andrew G.

This manual describes the use of MPSalsa, an unstructured finite element (FE) code for solving chemically reacting flow problems on massively parallel computers. MPSalsa has been written to enable the rigorous modeling of the complex geometry and physics found in engineering systems that exhibit coupled fluid flow, heat transfer, mass transfer, and detailed reactions. In addition, considerable effort has been made to ensure that the code makes efficient use of the computational resources of massively parallel (MP), distributed memory architectures in a way that is nearly transparent to the user. The result is the ability to simultaneously model both three-dimensional geometries and flow as well as detailed reaction chemistry in a timely manner on MP computers, an ability we believe to be unique. MPSalsa has been designed to allow the experienced researcher considerable flexibility in modeling a system. Any combination of the momentum equations, energy balance, and an arbitrary number of species mass balances can be solved. The physical and transport properties can be specified as constants, as functions, or taken from the Chemkin library and associated database. Any of the standard set of boundary conditions and source terms can be adapted by writing user functions, for which templates and examples exist.

Geology of the USW SD-7 drill hole Yucca Mountain, Nevada

Rautman, Christopher A.

The USW SD-7 drill hole is one of several holes drilled under Site Characterization Plan Study 8.3.1.4.3.1, also known as the Systematic Drilling Program, as part of the U.S. Department of Energy characterization program at Yucca Mountain, Nevada. The Yucca Mountain site has been proposed as the potential location of a repository for high-level nuclear waste. The SD-7 drill hole is located near the southern end of the potential repository area and immediately to the west of the Main Test Level drift of the Exploratory Studies Facility. The hole is not far from the junction of the Main Test Level drift and the proposed South Ramp decline. Drill hole USW SD-7 is 2675.1 ft (815.3 m) deep, and the core recovered nearly complete sections of ash-flow tuffs belonging to the lower half of the Tiva Canyon Tuff, the Pah Canyon Tuff, and the Topopah Spring Tuff, all of which are part of the Miocene Paintbrush Group. Core was recovered from much of the underlying Calico Hills Formation, and core was virtually continuous in the Prow Pass Tuff and the Bullfrog Tuff. The SD-7 drill hole penetrated several tens of feet into the top of the Tram Tuff, which underlies the Prow Pass and Bullfrog Tuffs. These latter three units are all formations of the Crater Flat Group. The drill hole was collared in welded materials assigned to the crystal-poor middle nonlithophysal zone of the Tiva Canyon Tuff; approximately 280 ft (85 m) of this ash-flow sheet was penetrated by the hole. The Yucca Mountain Tuff appears to be missing from the section at the USW SD-7 location, and the Pah Canyon Tuff is only 14.5 ft thick. The Pah Canyon Tuff was not recovered in core because of drilling difficulties, suggesting that the unit is entirely nonwelded. The presence of this unit is inferred through interpretation of down-hole geophysical logs.

Monolithically integrated active waveguides and lasers using rare-earth doped spin-on glass

Ashby, C.I.H.; Sullivan, C.T.; Vawter, G.A.

This LDRD program No. 3505.230 explored a new approach to monolithic integration of active waveguides and rare-earth solid state lasers directly onto III-V substrates. It involved selectively incorporating rare-earth ions into spin-on glasses (SOGs) that could be solvent cast and then patterned with conventional microelectronic processing. The patterned, rare-earth spin-on glasses (RESOGs) were to be photopumped by laser diodes prefabricated on the wafer and would serve as directly integrated active waveguides and/or rare-earth solid state lasers.

Final project report: High energy rotor development, test and evaluation

Sutherland, Herbert J.

Under the auspices of the "Government/Industry Wind Technology Applications Project" ["Letter of Interest" (LOI) Number RC-1-11101], Flo Wind Corp. has successfully developed, tested, and delivered a high-energy rotor upgrade candidate for their 19-meter Vertical Axis Wind Turbine. The project included the demonstration of the innovative extended height-to-diameter ratio concept, the development of a continuous-span single-piece composite blade, the demonstration of a continuous blade manufacturing technique, the utilization of the Sandia National Laboratories-developed SNLA 2150 natural laminar flow airfoil, and the reuse of existing wind turbine and wind power plant infrastructure.

PHASER user's manual version 2.10

Roginski, R.J.

PHASER (Probabilistic Hybrid Analytical System Evaluation Routine) is a computer code for solving the top event probability of a system fault tree. It has the capability for easy migration of individual basic event probabilities from a zero-"scale"-factor (completely subjective) state to one in which the analyst has total knowledge (completely stochastic) about each basic event. The code implements a fuzzy algebra solution for subjective data, a probabilistic solution for stochastic data, and a hybrid mathematics solution for data that are partly subjective and partly stochastic. Events that are not completely subjective or completely stochastic are hybrid events and are internally handled as such. The stochastic and fuzzy ranges of uncertainty in the top event probability are also computed for the analyst. These are provided in the form of a fuzzy function for the subjective uncertainty, a probability density function (PDF) for the stochastic variability, and the overall "confidence" factors for the two constituents of uncertainty, giving a complete hybrid result. PHASER interfaces with other Sandia codes (SABLE, LHS, and LHSPOST) to assist the user in determining cutsets and to compute probability density functions.
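
As a hedged illustration of how uncertainty in basic-event probabilities can be propagated through a fault tree, the sketch below applies interval arithmetic (one alpha-cut of a fuzzy number) to independent AND/OR gates. This is a textbook construction, not PHASER's actual fuzzy-algebra or hybrid implementation, and the event values are made up for the example.

```python
# Interval propagation through fault-tree gates, assuming independent events.
# Each basic event is a (low, high) probability interval at one membership level.
def and_gate(a, b):
    """Both events must occur: interval product (monotone in each endpoint)."""
    return (a[0] * b[0], a[1] * b[1])

def or_gate(a, b):
    """Either event occurs: 1 - (1-a)(1-b), also monotone in each endpoint."""
    return (1 - (1 - a[0]) * (1 - b[0]), 1 - (1 - a[1]) * (1 - b[1]))

# Hypothetical basic events: two that must coincide, plus one direct path.
e1 = (1e-4, 1e-3)
e2 = (2e-4, 5e-4)
e3 = (1e-2, 3e-2)
top = or_gate(and_gate(e1, e2), e3)
print(top)  # a (low, high) interval for the top event probability
```

Repeating this over many membership levels would trace out a fuzzy top-event function; PHASER additionally carries a stochastic PDF alongside, which this sketch omits.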

Flow calculations for Yucca Mountain groundwater travel time (GWTT-95)

Altman, Susan J.

In 1983, high-level radioactive waste repository performance requirements related to groundwater travel time were defined by NRC subsystem regulation 10 CFR 60.113. Although DOE is not presently attempting to demonstrate compliance with that regulation, understanding the prevalence of fast paths in the groundwater flow system remains a critical element of any safety analysis for a potential repository system at Yucca Mountain, Nevada. Therefore, this analysis was performed to allow comparison of fast-path flow against the criteria set forth in the regulation. Models developed to describe the conditions for initiation, propagation, and sustainability of rapid groundwater movement in both the unsaturated and saturated zones will form part of the technical basis for total-system analyses to assess site viability and site licensability. One of the most significant findings is that the fastest travel times in both the unsaturated and saturated zones are in the southern portion of the potential repository, so it is recommended that site characterization studies concentrate on this area. Results support the assumptions regarding the importance of an appropriate conceptual model of groundwater flow and the incorporation of heterogeneous material properties into the analyses. Groundwater travel times are sensitive to variation and uncertainty in hydrologic parameters and in infiltration flux at the upper boundary of the problem domain. Simulated travel times are also sensitive to poorly constrained parameters describing the interaction between flow in fractures and flow in the matrix.

Parallel processing ITS

Fan, Wesley C.

This report provides a users' guide for parallel processing ITS on a UNIX workstation network, a shared-memory multiprocessor, or a massively parallel processor. The parallelized version of ITS is based on a master/slave model with message passing. Parallel issues such as random number generation, load balancing, and communication software are briefly discussed. Timing results for example problems are presented for demonstration purposes.
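
The master/slave pattern described above can be sketched in miniature. This illustration substitutes Python threads and queues for ITS's message passing and a trivial tally for the Monte Carlo transport kernel; what it does preserve is the work-list structure and the per-batch seeding that keeps each slave's random stream independent, one of the parallel issues the report discusses.

```python
# Miniature master/slave Monte Carlo: the master fills a work queue,
# slaves pull batches until they receive a stop sentinel, and each batch
# carries its own seed so random streams stay independent and reproducible.
import random
import threading
import queue

def slave(task_q, result_q):
    """Process batches from the master until a None sentinel arrives."""
    while True:
        task = task_q.get()
        if task is None:
            break
        batch_id, n, seed = task
        rng = random.Random(seed)  # independent stream for this batch
        # Stand-in for a transport calculation: tally n uniform samples.
        result_q.put((batch_id, sum(rng.random() for _ in range(n))))

task_q, result_q = queue.Queue(), queue.Queue()
workers = [threading.Thread(target=slave, args=(task_q, result_q)) for _ in range(4)]
for w in workers:
    w.start()
n_batches, n_hist = 8, 10_000
for b in range(n_batches):          # master distributes the work list
    task_q.put((b, n_hist, 12345 + b))
for _ in workers:                   # one sentinel per slave
    task_q.put(None)
for w in workers:
    w.join()
results = [result_q.get() for _ in range(n_batches)]
estimate = sum(t for _, t in results) / (n_batches * n_hist)
print(round(estimate, 2))           # close to 0.5 for uniform samples
```

Because slaves pull work on demand, faster workers naturally take more batches, a simple form of the load balancing the report mentions.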

Frictional sliding in layered rock: laboratory-scale experiments

Buescher, B.J.; Perry Jr., K.E.; Epstein, J.S.

The work is part of the rock mechanics effort for the Yucca Mountain Site Characterization Program. The laboratory-scale experiments are intended to provide high-quality data on the mechanical behavior of jointed structures that can be used to validate complex numerical models of rock-mass behavior. Frictional sliding between simulated rock joints was studied using phase-shifting moiré interferometry. A model, constructed from stacks of machined and sandblasted granite plates, contained a central hole bored normal to the plates so that frictional slip would be induced between the plates near the hole under compressive loading. Results show a clear evolution of slip with increasing load. Since the rock was not cycled through loading-unloading, the quantitative differences between the three data sets are probably due to a "wearing-in" effect. The highly variable spatial frequency of the data is probably due to the large grain size of the granite and stochastic frictional processes. An unusual feature of the evolution of slip with increasing load is that, as the load gets larger, some plates seem to return to a null position.

Near-surface velocity modeling at Yucca Mountain using borehole and surface records from underground nuclear explosions

Walck, Marianne C.

The Department of Energy is investigating Yucca Mountain, Nevada as a potential site for commercial radioactive waste disposal in a mined geologic repository. One critical aspect of site suitability is the tectonic stability of the repository site. The levels of risk from both actual fault displacements in the repository block and ground shaking from nearby earthquakes are being examined. In particular, it is necessary to determine the expected level of ground shaking at the repository depth for large seismic sources such as nearby large earthquakes or underground nuclear explosions (UNEs). Earthquakes are expected to cause the largest ground motions at the site; however, only underground nuclear explosion data have been obtained at the repository depth level (about 350 m below ground level) to date. In this study we investigate ground motion from Nevada Test Site underground nuclear explosions recorded at Yucca Mountain to establish a compressional velocity model for the uppermost 350 m of the mountain. This model is useful for predicting repository-level ground motions for potential large nearby earthquakes.

Do-It-Now building maintenance reengineering project

Kadlec, J.

The Do-It-Now (DIN) building maintenance system is proposed to reduce the cost of routine building maintenance and repairs and to improve customer satisfaction with maintenance services. DIN uses a team approach to periodically inspect buildings and provide maintenance services on the spot. It emphasizes communications between the customers and the craftspeople performing the work. The system was designed using a reengineering approach that characterized the existing maintenance work control system, analyzed comparable systems in other DOE laboratories, envisioned an ideal system, and proposed a workable, testable system for initial implementation. At each stage, input was solicited from customer representatives and Facilities management to ensure meeting customer requirements with an implementable system.

Estimation of the carbon monoxide emissions due to Sandia National Laboratories commuter and on-base traffic for conformity determination

Mcclellan, Yvonne

This report describes the analysis and conclusions of an investigation of the carbon monoxide emissions resulting from Sandia National Laboratories and Department of Energy (DOE) commuter and on-base traffic for the Clean Air Act (CAA) Conformity Determination. Albuquerque/Bernalillo County was classified as a nonattainment area by the Environmental Protection Agency. A nonattainment area is one shown by monitored data, or calculated by air quality modeling, to exceed a National Ambient Air Quality Standard (NAAQS) for a pollutant. Albuquerque/Bernalillo County exceeds the NAAQS for carbon monoxide and ozone. The Conformity Determination was needed to complete the CAA Title V permitting process for SNL and the DOE. The analysis used the EPA-approved MOBILE5a carbon monoxide (CO) emissions modeling program. This analysis provides a baseline for mobile sources, allowing Sandia to estimate how any future activity will impact CO emissions. The General Conformity Rule (AQCR 43) requires that operations that will increase CO emissions in nonattainment or maintenance areas, such as Bernalillo County, undergo conformity analyses to determine whether or not they will impact ambient air quality in the area.

Use of depleted uranium metal as cask shielding in high-level waste storage, transport, and disposal systems

Yoshimura, Richard H.

The US DOE has amassed over 555,000 metric tons of depleted uranium from its uranium enrichment operations. Rather than dispose of this depleted uranium as waste, this study explores a beneficial use of depleted uranium as metal shielding in casks designed to contain canisters of vitrified high-level waste. Two high-level waste storage, transport, and disposal shielded cask systems are analyzed. The first system employs a shielded storage and disposal cask having a separate reusable transportation overpack. The second system employs a shielded combined storage, transport, and disposal cask. Conceptual cask designs that hold 1, 3, 4 and 7 high-level waste canisters are described for both systems. In all cases, cask design feasibility was established and analyses indicate that these casks meet applicable thermal, structural, shielding, and contact-handled requirements. Depleted uranium metal casting, fabrication, environmental, and radiation compatibility considerations are discussed and found to pose no serious implementation problems. About one-fourth of the depleted uranium inventory would be used to produce the casks required to store and dispose of the nearly 15,400 high-level waste canisters that would be produced. This study estimates the total-system cost for the preferred 7-canister storage and disposal configuration having a separate transportation overpack would be $6.3 billion. When credits are taken for depleted uranium disposal cost, a cost that would be avoided if depleted uranium were used as cask shielding material rather than disposed of as waste, total system net costs are between $3.8 billion and $5.5 billion.

Bulk and mechanical properties of the Paintbrush tuff recovered from boreholes UE25 NRG-2, 2A, 2B, and 3: Data report

Price, Ronald H.

An integral part of the licensing procedure for the potential nuclear waste repository at Yucca Mountain, Nevada, involves characterization of the in situ rheology for the design and construction of the facility and the emplacement of canisters containing radioactive waste. The data used to model the thermal and mechanical behavior of the repository and surrounding lithologies include dry and saturated bulk densities, average grain density, porosity, compressional and shear wave velocities, elastic moduli, and compressional and tensional fracture strengths. In this study, a suite of experiments was performed on cores recovered from boreholes UE25 NRG-2, 2A, 2B, and 3 drilled in support of the Exploratory Studies Facility (ESF) at Yucca Mountain. The holes penetrated the Timber Mountain tuff and two thermal/mechanical units of the Paintbrush tuff. The thermal/mechanical stratigraphy was defined by Ortiz to group rock horizons of similar properties for the purpose of simplifying modeling efforts. The relationship between the geologic stratigraphy and the thermal/mechanical stratigraphy for each borehole is presented. The tuff samples in this study have a wide range of welding characteristics (usually reflected in sample porosity), and a smaller range of mineralogy and petrology characteristics. Generally, the samples are silicic, ash-fall tuffs that exhibit large variability in their elastic and strength properties.

Bulk and mechanical properties of the Paintbrush tuff recovered from boreholes UE25 NRG-4 and -5: Data report

Price, Ronald H.

Experimental results are presented for bulk and mechanical properties measurements on specimens of the Paintbrush tuff recovered from boreholes UE25 NRG-4 and -5 at Yucca Mountain, Nevada. Measurements have been performed on three thermal/mechanical units: PTn, TSw1, and TSw2. For each specimen the following bulk properties are reported: dry bulk density, saturated bulk density, average grain density, and porosity. Unconfined compression to failure, confined compression to failure, and indirect tensile strength tests were performed on selected specimens recovered from the boreholes. In addition, compressional and shear wave velocities were measured on specimens designated for unconfined compression and confined compression experiments. Measurements were conducted at room temperature on nominally water-saturated specimens. The nominal strain rate for the fracture experiments was 10⁻⁵ s⁻¹.

Environmental assessment for Sandia National Laboratories/New Mexico offsite transportation of low-level radioactive waste

Lucas, M.

Sandia National Laboratories, New Mexico (SNL/NM) is managed and operated by Sandia Corporation, a Lockheed Martin Company. SNL/NM is located on land owned by the U.S. Department of Energy (DOE) within the boundaries of the Kirtland Air Force Base (KAFB) in Albuquerque, New Mexico. The major responsibilities of SNL/NM are the support of national security and energy projects. Low-level radioactive waste (LLW) is generated by some of the activities performed at SNL/NM in support of the DOE. This report describes potential environmental effects of the shipments of low-level radioactive wastes to other sites.

East Mountain Area 1995 air sampling results

Deola, Regina A.

Ambient air samples were taken at two locations in the East Mountain Area in conjunction with thermal testing at the Lurance Canyon Burn Site (LCBS). The samples were taken to provide measurements of particulate matter with a diameter less than or equal to 10 micrometers (PM₁₀) and volatile organic compounds (VOCs). This report summarizes the results of the sampling performed in 1995. Results from small-scale testing performed to determine the air pollutants potentially produced in the thermal tests are also included. Analytical results indicate few samples produced measurable concentrations of pollutants believed to be produced by thermal testing. Recommendations for future air sampling in the East Mountain Area are also noted.

Hot oiling spreadsheet

Mansure, Arthur J.

One of the most common oil-field treatments is hot oiling to remove paraffin from wells. Even though the practice is common, the thermal effectiveness of the process is not commonly understood. For producers to easily understand the thermodynamics of hot oiling, a simple tool is needed for estimating downhole temperatures. Such a tool has been developed and distributed as a compiled, public-domain-software spreadsheet. That spreadsheet has evolved into an interactive form on the World Wide Web and has been adapted into a Windows™ program by Petrolite, St. Louis, MO. The development of such a tool was facilitated by expressing downhole temperatures in terms of analytic formulas. Considerable algebraic work is required to develop such formulas. Also, the data describing hot oiling are customarily a mixture of practical units that must be converted to a consistent set of units. To facilitate the algebraic manipulations and to ensure the unit conversions are correct, parallel calculations were made during development using the spreadsheet and a symbolic mathematics program. Derivation of the formulas considered falling-film flow in the annulus and started from the transient differential equations so that the effects of the heat capacity of the tubing and casing could be included. While this approach to developing a software product does not have the power and sophistication of a finite element or finite difference code, it produces a user-friendly product that implements the equations with a minimum potential for bugs. This allows emphasis in development to be placed on the physics.
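
As a hedged sketch of the kind of analytic downhole-temperature formula such a spreadsheet encapsulates, the classic Ramey-style steady solution is shown below. This is not the report's actual derivation (which includes falling-film flow and tubing/casing heat capacity), and every parameter value here is hypothetical; the relaxation depth A, which in practice combines flow rate, heat capacity, and wellbore heat loss, is simply taken as an input.

```python
# Ramey-style steady wellbore temperature for hot oil pumped down a well:
# the fluid starts at the injection temperature and relaxes toward the
# geothermal profile (offset by grad * a_ft) over a relaxation depth a_ft.
import math

def downhole_temp(z_ft, t_surface=70.0, t_inject=300.0, grad=0.015, a_ft=2000.0):
    """Fluid temperature (deg F) at depth z_ft.

    grad -- geothermal gradient, deg F per ft (illustrative value)
    a_ft -- relaxation depth, ft (hypothetical lumped parameter)
    """
    geothermal = t_surface + grad * z_ft
    return (geothermal - grad * a_ft
            + (t_inject - t_surface + grad * a_ft) * math.exp(-z_ft / a_ft))

for z in (0, 1000, 3000, 6000):
    print(z, round(downhole_temp(z), 1))  # hot oil cools with depth
```

The formula shows why hot oiling can be thermally ineffective: with a short relaxation depth, the injected heat is lost to the formation well above the depths where paraffin deposits may sit.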

Lattice and off-lattice side chain models of protein folding: Linear time structure prediction better than 86% of optimal

Hart, William E.

This paper considers the protein structure prediction problem for lattice and off-lattice protein folding models that explicitly represent side chains. Lattice models of proteins have proven extremely useful tools for reasoning by analogy about protein folding in unrestricted continuous space. This paper provides the first illustration of how rigorous algorithmic analyses of lattice models can lead to rigorous algorithmic analyses of off-lattice models. The authors consider two side chain models: a lattice model that generalizes the HP model (Dill, 1985) to explicitly represent side chains on the cubic lattice, and a new off-lattice model, the HP Tangent Spheres Side Chain model (HP-TSSC), that generalizes this model further by representing the backbone and side chains of proteins with tangent spheres. They describe algorithms for both of these models with mathematically guaranteed error bounds. In particular, the authors describe a linear time performance guaranteed approximation algorithm for the HP side chain model that constructs conformations whose energy is better than 86% of optimal in a face centered cubic lattice, and they demonstrate how this provides a 70% performance guarantee for the HP-TSSC model. This is the first algorithm in the literature for off-lattice protein structure prediction that has a rigorous performance guarantee. The analysis of the HP-TSSC model builds on the work of Dancik and Hannenhalli, who have developed a 16/30 approximation algorithm for the HP model on the hexagonal close packed lattice. Further, the analysis provides a mathematical methodology for transferring performance guarantees on lattices to off-lattice models. These results partially answer the open question of Karplus et al. concerning the complexity of protein folding models that include side chains.
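
For concreteness, the objective such algorithms approximate can be shown for the basic HP model: a conformation's energy is the negated count of lattice-adjacent, non-consecutive H-H pairs (the side-chain models of the paper score side-chain contacts analogously). This evaluator is a standard construction, not code from the paper.

```python
# HP-model energy on the cubic lattice: count hydrophobic (H) residue pairs
# that occupy adjacent lattice sites but are not consecutive along the chain.
def hp_energy(sequence, coords):
    """sequence: string over 'H'/'P'; coords: lattice point of each residue."""
    pos = {tuple(c): i for i, c in enumerate(coords)}
    contacts = 0
    for i, c in enumerate(coords):
        if sequence[i] != 'H':
            continue
        for axis in range(3):
            for step in (-1, 1):
                n = list(c)
                n[axis] += step
                j = pos.get(tuple(n))
                # j > i + 1 skips chain neighbors and avoids double counting.
                if j is not None and j > i + 1 and sequence[j] == 'H':
                    contacts += 1
    return -contacts

# A 4-residue chain folded into a unit square: the two end H residues
# become lattice neighbors, giving one H-H contact.
seq = "HPPH"
coords = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
print(hp_energy(seq, coords))  # -1
```

An approximation guarantee like the paper's "better than 86% of optimal" means the algorithm's conformation achieves at least that fraction of the best possible (most negative) value of a score of this kind.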
