Publications

Results 72776–72800 of 99,299


Bayesian methods for discontinuity detection in climate model predictions

Safta, Cosmin; Debusschere, Bert; Najm, Habib N.; Sargsyan, Khachik

Discontinuity detection is an important component in many fields, including image recognition, digital signal processing, and climate change research. Current methods have several shortcomings: they are restricted to one- or two-dimensional settings, require uniformly spaced and/or dense input data, and give deterministic answers without quantifying the uncertainty. Spectral methods for uncertainty quantification with global, smooth bases are challenged by discontinuities in model simulation results. Domain decomposition reduces the impact of nonlinearities and discontinuities; however, while gaining smoothness in each subdomain, current domain refinement methods require prohibitively many simulations. Detecting discontinuities up front and refining accordingly therefore offers a substantial improvement over current methodologies.
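A toy one-dimensional version conveys the idea. The sketch below is a hypothetical illustration (not the paper's algorithm): it assumes a piecewise-constant signal with Gaussian noise of known scale, accepts non-uniformly spaced samples, and returns a posterior over candidate jump locations rather than a single deterministic answer:

```python
import math

def changepoint_posterior(x, y, sigma=0.1):
    """Posterior over the index of a jump discontinuity in scattered 1-D
    data, assuming a piecewise-constant signal with Gaussian noise of
    known standard deviation sigma and a uniform prior over split
    locations.  Non-uniform spacing is fine: only the sort order of x
    matters."""
    pts = sorted(zip(x, y))            # handle scattered, unordered input
    ys = [p[1] for p in pts]
    n = len(ys)
    log_like = []
    for k in range(1, n):              # split: left = ys[:k], right = ys[k:]
        left, right = ys[:k], ys[k:]
        mu_l = sum(left) / len(left)
        mu_r = sum(right) / len(right)
        sse = sum((v - mu_l) ** 2 for v in left) + \
              sum((v - mu_r) ** 2 for v in right)
        log_like.append(-sse / (2.0 * sigma ** 2))
    m = max(log_like)                  # normalize in log space for stability
    w = [math.exp(v - m) for v in log_like]
    z = sum(w)
    return [v / z for v in w]          # posterior probability per split index
```

The posterior concentrates at the true jump when the noise scale is small relative to the jump height; a flat posterior signals that no discontinuity is supported by the data.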

Data-free inference of uncertain model parameters

Debusschere, Bert; Najm, Habib N.; Berry, Robert D.; Adalsteinsson, Helgi

It is known that, in general, the correlation structure in the joint distribution of model parameters is critical to the uncertainty analysis of that model. Very often, however, studies in the literature report only nominal values for parameters inferred from data, along with confidence intervals for those parameters, but no details on the correlation or full joint distribution of the parameters. When neither the posterior nor the data are available, but only summary statistics such as nominal values and confidence intervals, a joint PDF must be chosen. Given the summary statistics, it may be neither reasonable nor necessary to assume the parameters are independent random variables. We demonstrate, using a Bayesian inference procedure, how to construct a posterior density for the parameters exhibiting self-consistent correlations, in the absence of data, given (1) the fit-model, (2) nominal parameter values, (3) bounds on the parameters, and (4) a postulated statistical model, around the fit-model, for the missing data. Our approach ensures consistent Bayesian updating while marginalizing over possible data realizations. We then address the matching of given parameter bounds through the choice of hyperparameters, which are introduced in postulating the statistical model but are not given nominal values. We discuss some possible approaches, including (1) inferring them in a separate Bayesian inference loop and (2) optimization. We also perform an empirical evaluation of the algorithm, showing that the posterior obtained with this data-free inference compares well with the true posterior obtained from inference against the full data set.
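As a rough illustration of marginalizing over possible data realizations, the sketch below is hypothetical: the function name, the straight-line fit-model, and the noise scale are all assumptions, not the paper's setup. It draws synthetic data around a nominal linear model, refits the parameters for each realization, and pools the estimates; the pooled cloud carries a self-consistent correlation between intercept and slope, even though no real data were used:

```python
import random

def pooled_parameter_samples(x, a0, b0, sigma=0.5, n_reps=2000, seed=1):
    """Data-free parameter ensemble for the fit-model y = a + b*x.
    Postulate Gaussian noise of scale sigma around the nominal model
    (a0, b0), draw synthetic data realizations at the design points x,
    refit (a, b) by least squares for each, and pool the estimates."""
    rng = random.Random(seed)
    xbar = sum(x) / len(x)
    sxx = sum((xi - xbar) ** 2 for xi in x)
    samples = []
    for _ in range(n_reps):
        # one possible data realization under the postulated model
        y = [a0 + b0 * xi + rng.gauss(0.0, sigma) for xi in x]
        ybar = sum(y) / len(y)
        b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
        a = ybar - b * xbar
        samples.append((a, b))
    return samples
```

With design points that are all positive, the pooled intercept and slope estimates come out strongly anti-correlated, which is exactly the kind of correlation structure a report of independent nominal values and confidence intervals would miss.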

An overview of the global threat reduction initiative's physical protection work in Tanzania

Itamura, Michael; Strosinski, Michael

The U.S. Department of Energy's (DOE) National Nuclear Security Administration (NNSA) established the Global Threat Reduction Initiative (GTRI) with the mission to reduce and protect nuclear and radiological materials located at civilian sites worldwide. Internationally, over 80 countries are cooperating with GTRI to enhance the security of facilities with these materials. In 2004, a GTRI delegation began working with the Tanzania Atomic Energy Commission (TAEC). The team conducted site assessments for the physical protection of radiological materials in Tanzania. Today, GTRI and the Government of Tanzania continue cooperative efforts to enhance physical security at several radiological sites, including a central sealed-source storage facility and sites in the cities of Arusha, Dar es Salaam, and Tanga. This paper describes the scope of the physical protection work, lessons learned, and plans for future cooperation between the GTRI program and the TAEC. Additionally, the paper reviews the cooperative efforts between the TAEC and the International Atomic Energy Agency (IAEA) with regard to a remote monitoring system at a storage facility and the repackaging of radioactive sources.

ParaText: scalable text modeling and analysis

Dunlavy, Daniel M.; Stanton, Eric T.

Automated processing, modeling, and analysis of unstructured text (news documents, web content, journal articles, etc.) is a key task in many data analysis and decision-making applications. As data sizes grow, scalability is essential for deep analysis. In many cases, documents are modeled as term or feature vectors, and latent semantic analysis (LSA) is used to model latent, or hidden, relationships between documents and the terms appearing in those documents. LSA supplies conceptual organization and analysis of document collections by modeling high-dimensional feature vectors in many fewer dimensions. While past work on the scalability of LSA modeling has focused on the SVD, the goal of our work is to investigate the use of distributed-memory architectures for the entire text analysis process, from data ingestion to semantic modeling and analysis. ParaText is a set of software components for distributed processing, modeling, and analysis of unstructured text. The ParaText source code is available under a BSD license as an integral part of the Titan toolkit. ParaText components are chained together into data-parallel pipelines that are replicated across processes on distributed-memory architectures. Individual components can be replaced or rewired to explore different computational strategies and implement new functionality. ParaText functionality can be embedded in applications on any platform using the native C++ API, Python, or Java. The ParaText MPI Process provides a 'generic' text analysis pipeline in a command-line executable that can be used for many serial and parallel analysis tasks. ParaText can also be deployed as a web service accessible via a RESTful (HTTP) API. In the web service configuration, any client can access the functionality provided by ParaText using commodity protocols, from standard web browsers to custom clients written in any language.
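A minimal serial sketch of the modeling step that ParaText distributes may help; this is hypothetical illustration code, not the ParaText API. It builds a term-document count matrix, takes a rank-k truncated SVD, and embeds each document in the k-dimensional latent space:

```python
import numpy as np

def lsa_embed(docs, k=2):
    """Toy latent semantic analysis: term-document counts -> rank-k
    truncated SVD -> document coordinates in the latent space."""
    vocab = sorted({w for d in docs for w in d.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    A = np.zeros((len(vocab), len(docs)))
    for j, d in enumerate(docs):
        for w in d.lower().split():
            A[index[w], j] += 1.0          # raw term counts
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Each row is one document's coordinates in the rank-k latent space.
    doc_coords = (np.diag(s[:k]) @ Vt[:k]).T
    return doc_coords, vocab
```

Documents that share vocabulary land near each other in the latent space even when k is far smaller than the vocabulary size; ParaText's contribution is running this whole pipeline, from ingestion through the SVD, on distributed-memory machines.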

Integrated Modeling, Mapping, and Simulation (IMMS) framework for planning exercises

Friedman-Hill, Ernest

The Integrated Modeling, Mapping, and Simulation (IMMS) program is designing and prototyping a simulation and collaboration environment for linking together existing and future modeling and simulation tools, enabling analysts, emergency planners, and incident managers to prepare for, analyze, train for, and respond to real or potential incidents more effectively, economically, and rapidly. When complete, the IMMS program will demonstrate an integrated modeling and simulation capability that supports emergency managers and responders in (1) conducting 'what-if' analyses and exercises to address preparedness, analysis, training, operations, and lessons learned, and (2) effectively, economically, and rapidly verifying response tactics, plans, and procedures.

Emulating a million machines to investigate botnets

Rudish, Donald W.

Researchers at Sandia National Laboratories in Livermore, California are creating what is in effect a vast digital petri dish able to hold one million operating systems at once, in an effort to study the behavior of rogue programs known as botnets. Botnets are used extensively by malicious computer hackers to steal computing power from Internet-connected computers. The hackers harness the stolen resources into a scattered but powerful computer that can be used to send spam, execute phishing scams, or steal digital information. These remote-controlled 'distributed computers' are difficult to observe and track. Botnets may take over parts of tens of thousands, or in some cases even millions, of computers, making them among the world's most powerful computers for some applications.

A highly reliable RAID system based on GPUs

Curry, Matthew L.

While RAID is the prevailing method of creating reliable secondary storage infrastructure, many users desire more flexibility than current implementations offer. To attain the needed performance, customers have often turned to hardware-based RAID solutions. This talk describes a RAID system that offloads erasure-correction coding calculations to GPUs, allowing increased reliability by supporting new RAID levels while maintaining high performance.
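The new RAID levels in the talk presumably rest on richer erasure codes than simple parity, but the simplest member of the family, single-parity (RAID-5-style) XOR coding, already shows the kind of byte-wise bulk arithmetic that maps naturally onto a GPU. A plain-Python sketch for clarity:

```python
def xor_parity(blocks):
    """RAID-5-style parity: byte-wise XOR across all data blocks.
    All blocks must have the same length."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return bytes(parity)

def recover(surviving_blocks, parity):
    """Rebuild the single missing block.  XOR is its own inverse, so
    XOR-ing the survivors with the parity yields the lost data."""
    return xor_parity(list(surviving_blocks) + [parity])
```

With k data blocks and one parity block, any single lost block can be rebuilt; codes that support multiple simultaneous device failures generalize this per-byte arithmetic, which is exactly the embarrassingly parallel workload that benefits from GPU offload.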

Temperature switchable polymer dielectrics

Dirk, Shawn M.

Materials with switchable states are desirable in many areas of science and technology. The ability to thermally transform a dielectric material into a conductive state should allow for the creation of electronics with built-in safety features. Specifically, the undesirable build-up and discharge of electricity in the event of a fire or overheating would be averted by using thermo-switchable dielectrics in the capacitors of electrical devices, preventing the capacitors from charging at elevated temperatures. We have designed a series of polymers that effectively switch from a non-conductive to a conductive state. The thermal transition is governed by the stability of the leaving group after it leaves as a free entity. Here, we present the synthesis and characterization of a series of precursor polymers that eliminate to form poly(p-phenylene vinylene)s (PPVs).
