Publications

U.S. Department of Energy NESHAP Annual Report for CY 2014 Sandia National Laboratories Tonopah Test Range

Evelo, Stacie; Miller, Mark L.

This National Emission Standards for Hazardous Air Pollutants (NESHAP) Annual Report has been prepared in a format to comply with the reporting requirements of 40 CFR 61.94 and the April 5, 1995 Memorandum of Agreement (MOA) between the Department of Energy (DOE) and the Environmental Protection Agency (EPA). According to the EPA approved NESHAP Monitoring Plan for the Tonopah Test Range (TTR), 40 CFR 61, subpart H, and the MOA, no additional monitoring or measurements are required at TTR in order to demonstrate compliance with the NESHAP regulation.

More Details

Supervised Gamma Process Poisson Factorization

Anderson, Dylan Z.

This thesis develops the supervised gamma process Poisson factorization (S-GPPF) framework, a novel supervised topic model for joint modeling of count matrices and document labels. S-GPPF is fully generative and nonparametric: document labels and count matrices are modeled under a unified probabilistic framework and the number of latent topics is controlled automatically via a gamma process prior. The framework provides for multi-class classification of documents using a generative max-margin classifier. Several recent data augmentation techniques are leveraged to provide for exact inference using a Gibbs sampling scheme. The first portion of this thesis reviews supervised topic modeling and several key mathematical devices used in the formulation of S-GPPF. The thesis then introduces the S-GPPF generative model and derives the conditional posterior distributions of the latent variables for posterior inference via Gibbs sampling. The S-GPPF is shown to exhibit state-of-the-art performance for joint topic modeling and document classification on a dataset of conference abstracts, beating out competing supervised topic models. The unique properties of S-GPPF along with its competitive performance make it a novel contribution to supervised topic modeling.
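
As a rough illustration of the unsupervised gamma-Poisson factorization that S-GPPF builds on, the sketch below fits a count matrix with a fixed number of topics via Gibbs sampling. The gamma-process prior, the max-margin classifier, and the thesis's exact augmentation scheme are not reproduced; the hyperparameters and problem sizes are illustrative only.

```python
# Minimal sketch: unsupervised gamma-Poisson factorization with Gibbs updates.
# Not the S-GPPF model of the thesis; K is fixed and hyperparameters are assumed.
import numpy as np

rng = np.random.default_rng(0)
D, V, K = 30, 100, 5          # documents, vocabulary size, (fixed) topic count
a = b = c = d = 1.0           # illustrative gamma shape/rate hyperparameters

# Synthetic counts drawn from the generative model y_dv ~ Poisson(theta_d . beta_v).
theta_true = rng.gamma(a, 1.0 / b, size=(D, K))
beta_true = rng.gamma(c, 1.0 / d, size=(K, V))
Y = rng.poisson(theta_true @ beta_true)

theta = rng.gamma(a, 1.0 / b, size=(D, K))
beta = rng.gamma(c, 1.0 / d, size=(K, V))
for _ in range(100):
    # Data augmentation: split each count y_dv across topics with probability
    # proportional to theta[d, k] * beta[k, v].
    S = np.zeros((D, K))      # allocated counts summed over words, per (d, k)
    T = np.zeros((K, V))      # allocated counts summed over documents, per (k, v)
    for dd in range(D):
        for vv in np.nonzero(Y[dd])[0]:
            p = theta[dd] * beta[:, vv]
            alloc = rng.multinomial(Y[dd, vv], p / p.sum())
            S[dd] += alloc
            T[:, vv] += alloc
    # Conditional gamma updates (numpy's second argument is the scale, 1/rate).
    theta = rng.gamma(a + S, 1.0 / (b + beta.sum(axis=1)))
    beta = rng.gamma(c + T, 1.0 / (d + theta.sum(axis=0)[:, None]))

print("mean absolute error of reconstructed Poisson rates:",
      np.abs(theta @ beta - theta_true @ beta_true).mean())
```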

More Details

The CPAT Domain Model - How CPAT "Thinks" from an Analyst Perspective

Melander, Darryl; Henry, Stephen M.; Hoffman, Matthew; Kao, Gio K.; Lawton, Craig; Muldoon, Frank M.; Shelton, Liliana

To help effectively plan the management and modernization of its large and diverse fleet of vehicles, the Program Executive Office Ground Combat Systems (PEO GCS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet - respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This report contains a description of the organizational fleet structure and a thorough explanation of the business rules that the CPAT formulation follows involving performance, scheduling, production, and budgets.
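
To make the flavor of such a portfolio model concrete, here is a toy budget-constrained modernization-scheduling program sketched in Python with PuLP. It is not the CPAT formulation: the vehicle families, costs, performance gains, budgets, and the early-modernization discount are all made up, and CPAT's industrial-base, research-and-testing, and fleet-structure rules are not modeled.

```python
# Toy portfolio-scheduling MILP in the spirit of (but far simpler than) CPAT.
import pulp

families = {"A": (40, 10), "B": (65, 25), "C": (30, 12)}  # (cost, performance gain)
periods = [2026, 2027, 2028]
budget = {2026: 60, 2027: 70, 2028: 50}                   # per-period budget

x = pulp.LpVariable.dicts(
    "modernize", [(f, t) for f in families for t in periods], cat="Binary")

model = pulp.LpProblem("fleet_portfolio", pulp.LpMaximize)
# Objective: total performance gain, with a small discount so earlier
# modernization is preferred when budgets allow.
model += pulp.lpSum(families[f][1] * (1.0 - 0.05 * i) * x[(f, t)]
                    for f in families for i, t in enumerate(periods))
# Each family is modernized at most once.
for f in families:
    model += pulp.lpSum(x[(f, t)] for t in periods) <= 1
# Spending in each period must respect that period's budget.
for t in periods:
    model += pulp.lpSum(families[f][0] * x[(f, t)] for f in families) <= budget[t]

model.solve(pulp.PULP_CBC_CMD(msg=False))
for (f, t), var in x.items():
    if var.value() == 1:
        print(f"modernize family {f} in {t}")
```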

More Details

United States Department of Energy National Nuclear Security Administration Sandia Field Office NESHAP Annual Report CY2014 for Sandia National Laboratories New Mexico

Evelo, Stacie; Miller, Mark L.

This report provides a summary of the radionuclide releases from the United States (U.S.) Department of Energy (DOE) National Nuclear Security Administration facilities at Sandia National Laboratories, New Mexico (SNL/NM) during Calendar Year (CY) 2014, including the data, calculations, and supporting documentation for demonstrating compliance with 40 Code of Federal Regulations (CFR) Part 61, Subpart H, "National Emission Standards for Emissions of Radionuclides Other Than Radon from Department of Energy Facilities." A description is given of the sources and their contributions to the overall dose assessment. In addition, the maximally exposed individual (MEI) radiological dose calculation and the population dose to local and regional residents are discussed.
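
A minimal sketch, not taken from the report, of the kind of compliance check such an annual report supports: sum the modeled dose contributions of each emission source to the MEI and compare the total against the 40 CFR 61 Subpart H standard of 10 mrem/yr. The source names and dose values below are purely illustrative.

```python
# Hypothetical per-source MEI doses (mrem/yr); the real report derives these
# from emissions data and dispersion/dose modeling.
DOSE_STANDARD_MREM_PER_YR = 10.0

mei_dose_by_source = {
    "Stack A": 1.2e-3,
    "Diffuse source B": 4.0e-4,
    "Stack C": 7.5e-5,
}

total = sum(mei_dose_by_source.values())
print(f"Total MEI dose: {total:.2e} mrem/yr")
print("Within 10 mrem/yr standard:", total <= DOSE_STANDARD_MREM_PER_YR)
```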

More Details

Next-generation Algorithms for Assessing Infrastructure Vulnerability and Optimizing System Resilience

Burchett, Deon L.; Chen, Richard L.Y.; Phillips, Cynthia A.; Richard, Jean-Philippe

This report summarizes the work performed under the project Next-Generation Algorithms for Assessing Infrastructure Vulnerability and Optimizing System Resilience. The goal of the project was to improve mathematical programming-based optimization technology for infrastructure protection. In general, the owner of a network wishes to design a network that can perform well when certain transportation channels are inhibited (e.g., destroyed) by an adversary. These are typically bi-level problems in which the owner designs a system, an adversary optimally attacks it, and then the owner recovers by optimally using the remaining network. This project funded three years of Deon Burchett's graduate research. Deon's graduate advisor, Professor Jean-Philippe Richard, and his Sandia advisors, Richard Chen and Cynthia Phillips, supported Deon on other funds or volunteer time. This report is, therefore, essentially a replication of the Ph.D. dissertation it funded, in a format required for project documentation. The thesis includes some general polyhedral research, that is, the study of the structure of the feasible region of mathematical programs such as integer programs. For example, an integer program optimizes a linear objective function subject to linear constraints and (nonlinear) integrality constraints on the variables; the feasible region without the integrality constraints is a convex polyhedron. Careful study of additional valid inequalities can significantly improve computational performance. Here is the abstract from the dissertation: We perform a polyhedral study of a multi-commodity generalization of variable upper bound flow models. In particular, we establish some relations between facets of single- and multi-commodity models. We then introduce a new family of inequalities, which generalizes traditional flow cover inequalities to the multi-commodity context. We present encouraging numerical results. We also consider the directed edge-failure resilient network design problem (DRNDP). This problem entails the design of a directed multi-commodity flow network that is capable of fulfilling a specified percentage of demands in the event that any G arcs are destroyed, where G is a constant parameter. We present a formulation of DRNDP and solve it in a branch-column-cut framework. We present computational results.
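
As a simplified illustration of the resilience requirement behind DRNDP (not the dissertation's formulation or algorithms), the sketch below takes a fixed directed, capacitated network and checks how much of a single commodity's s-t demand can still be routed after any one arc (G = 1) is destroyed. The network, capacities, and demand value are invented.

```python
# Enumerate single-arc failures and recompute the surviving s-t max flow.
import itertools
import networkx as nx

arcs = {("s", "a"): 8, ("s", "b"): 6, ("a", "t"): 7, ("b", "t"): 7, ("a", "b"): 3}
demand = 10.0                                     # hypothetical s-to-t demand

G = nx.DiGraph()
for (u, v), cap in arcs.items():
    G.add_edge(u, v, capacity=cap)

for failed in itertools.combinations(arcs, 1):    # all single-arc failures (G = 1)
    H = G.copy()
    H.remove_edges_from(failed)
    value, _ = nx.maximum_flow(H, "s", "t")
    print(f"lose {failed[0]}: serve {min(value, demand) / demand:.0%} of demand")
```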

More Details

Cyber Graph Queries for Geographically Distributed Data Centers

Berry, Jonathan; Collins, Michael; Kearns, Aaron; Phillips, Cynthia A.; Saia, Jared

We present new algorithms for a distributed model of graph computation motivated by limited information sharing, which we first discussed in earlier work. Two or more independent entities have collected large social graphs. They wish to compute the result of running graph algorithms on the entire set of relationships. Because the information is sensitive or economically valuable, they do not wish to simply combine the information in a single location. We consider two models for computing the solution to graph algorithms in this setting: 1) limited-sharing: the entities can share only a polylogarithmic-size subgraph; 2) low-trust: the entities must not reveal any information beyond the query answer, assuming they are all honest but curious. We believe this model captures realistic constraints on cooperating autonomous data centers. We give algorithms for s-t connectivity in both models. We also give an algorithm in the limited-sharing model for finding a planted clique. This is an anomaly-detection problem: finding a subgraph that is larger and denser than expected. For both limited-sharing algorithms, we exploit structural properties of social networks to prove performance bounds better than what is possible for general graphs. For s-t connectivity, we use known properties. For planted clique, we propose a new property: a bounded number of triangles per node. This property is based on evidence from the social science literature. We found that classic examples of social networks do not have the bounded-triangles property, because many social networks contain elements that are non-human, such as accounts for a business or other automated accounts. We describe some initial attempts to distinguish human nodes from automated nodes in social networks based only on topological properties.
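
As a small, self-contained illustration of the bounded-triangles idea (not the report's distributed algorithms), the sketch below plants a 6-clique in a sparse random "social" graph and flags nodes whose per-node triangle count exceeds an assumed bound; the graph, clique size, and threshold are all illustrative.

```python
# Planted-clique detection via per-node triangle counts on a synthetic graph.
import networkx as nx

# A sparse background graph plus a planted 6-clique attached to existing nodes.
G = nx.gnp_random_graph(200, 0.03, seed=1)
clique = range(200, 206)
G.add_edges_from((u, v) for u in clique for v in clique if u < v)
G.add_edges_from((u, u % 200) for u in clique)     # attach clique to the rest

tri = nx.triangles(G)                              # triangles through each node
threshold = 5                                      # assumed per-node triangle bound
suspicious = [n for n, t in tri.items() if t > threshold]
print("nodes exceeding the triangle bound:", sorted(suspicious))
```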

More Details