Publications

Results 8776–8800 of 9,998

Parallel job scheduling policies to improve fairness: a case study

Leung, Vitus J.

Balancing fairness, user performance, and system performance is a critical concern when developing and installing parallel schedulers. Sandia uses a customized scheduler to manage many of its parallel machines. A primary function of the scheduler is to ensure that the machines have good utilization and that users are treated in a 'fair' manner. A separate compute process allocator (CPA) ensures that the jobs on the machines are not too fragmented, in order to maximize throughput. Until recently, there has been no established technique for measuring the fairness of parallel job schedulers. This paper introduces a 'hybrid' fairness metric that is similar to recently proposed metrics. The metric uses the Sandia version of a 'fairshare' queuing priority as the basis for fairness. The hybrid fairness metric is used to evaluate a Sandia workload. Using these results, multiple scheduling strategies are introduced to improve performance while satisfying user and system performance constraints.
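
The paper's hybrid metric is not reproduced here, but the general idea of scoring a workload trace against a fairshare-style priority can be sketched briefly. The toy below counts how often jobs started out of the order a simple fairshare priority (less accumulated usage means higher priority) would imply; the job record layout, the priority rule, and the resulting score are illustrative assumptions, not Sandia's scheduler or the metric from the paper.

```python
# Toy fairness score for a job trace: the fraction of competing job pairs that
# started in the opposite order from what a simple fairshare priority implies.
# The priority rule, job fields, and score are illustrative assumptions only.

def fairshare_priority(job, usage_by_user):
    """Higher priority (larger value) for users with less accumulated usage."""
    return -usage_by_user.get(job["user"], 0.0)

def unfairness(jobs, usage_by_user):
    started = sorted(jobs, key=lambda j: j["start_time"])
    inversions = pairs = 0
    for i, a in enumerate(started):
        for b in started[i + 1:]:
            if b["submit_time"] <= a["start_time"]:      # b was waiting when a started
                pairs += 1
                if fairshare_priority(b, usage_by_user) > fairshare_priority(a, usage_by_user):
                    inversions += 1                       # a "jumped ahead" of b
    return inversions / pairs if pairs else 0.0

jobs = [
    {"user": "alice", "submit_time": 0, "start_time": 5},
    {"user": "bob",   "submit_time": 1, "start_time": 2},
    {"user": "carol", "submit_time": 1, "start_time": 8},
]
usage = {"alice": 100.0, "bob": 900.0, "carol": 50.0}
print(f"unfairness = {unfairness(jobs, usage):.2f}")      # 1.00 for this tiny trace
```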

ALEGRA: An arbitrary Lagrangian-Eulerian multimaterial, multiphysics code

46th AIAA Aerospace Sciences Meeting and Exhibit

Robinson, Allen C.; Brunner, Thomas A.; Carroll, Susan; Drake, Richard; Garasi, Christopher J.; Gardiner, Thomas; Haill, Thomas; Hanshaw, Heath; Hensinger, David; Labreche, Duane; Lemke, Raymond; Love, Edward; Luchini, Christopher; Mosso, Stewart; Niederhaus, John; Ober, Curtis C.; Petney, Sharon; Rider, William J.; Scovazzi, Guglielmo; Strack, O.E.; Summers, Randall; Trucano, Timothy; Weirs, V.G.; Wong, Michael; Voth, Thomas

ALEGRA is an arbitrary Lagrangian-Eulerian (multiphysics) computer code developed at Sandia National Laboratories since 1990. The code contains a variety of physics options including magnetics, radiation, and multimaterial flow. The code has been developed for nearly two decades, but recent work has dramatically improved its accuracy and robustness. These improvements include techniques applied to the basic Lagrangian differencing, the artificial viscosity, and the remap step of the method, including an important improvement in the basic conservation of energy in the scheme. We discuss the various algorithmic improvements and their impact on the results for important applications, including magnetic implosions, ceramic fracture modeling, and electromagnetic launch. Copyright © 2008 by the American Institute of Aeronautics and Astronautics, Inc.
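
The Lagrangian differencing and artificial-viscosity ingredients mentioned above follow a long-standing staggered-grid pattern. As a rough, self-contained illustration of that pattern, and not of ALEGRA's actual discretization or energy improvement, the sketch below advances a 1D ideal-gas Lagrangian mesh with a simple von Neumann-Richtmyer quadratic viscosity.

```python
import numpy as np

# Minimal 1D staggered-grid Lagrangian hydrodynamics step with a simple
# von Neumann-Richtmyer quadratic artificial viscosity, for an ideal gas on a
# Sod-like initial state. A textbook sketch of the general technique only;
# none of this is ALEGRA's actual differencing, viscosity, or energy fix.

gamma, cq = 1.4, 2.0                  # ideal-gas index, viscosity coefficient
x = np.linspace(0.0, 1.0, 101)        # node positions
u = np.zeros_like(x)                  # node velocities
centers = 0.5 * (x[:-1] + x[1:])
rho = np.where(centers < 0.5, 1.0, 0.125)                      # zone densities
e = np.where(centers < 0.5, 1.0, 0.1) / ((gamma - 1.0) * rho)  # specific internal energy
m = rho * np.diff(x)                  # zone masses, fixed in the Lagrangian frame

def step(x, u, rho, e, dt):
    p = (gamma - 1.0) * rho * e
    du = np.diff(u)
    q = np.where(du < 0.0, cq * rho * du**2, 0.0)   # viscosity only in compression
    ptot = p + q
    a = np.zeros_like(u)              # node accelerations (free ends for brevity)
    a[1:-1] = -(ptot[1:] - ptot[:-1]) / (0.5 * (m[1:] + m[:-1]))
    u_new = u + dt * a
    x_new = x + dt * u_new
    rho_new = m / np.diff(x_new)
    # p dV work on the internal energy; total energy is conserved only
    # approximately by this simple update.
    e_new = e - dt * ptot * np.diff(u_new) / m
    return x_new, u_new, rho_new, e_new

for _ in range(200):
    x, u, rho, e = step(x, u, rho, e, dt=1e-3)
print("zone density range after 200 steps:", float(rho.min()), float(rho.max()))
```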

A mesh optimization algorithm to decrease the maximum error in finite element computations

Proceedings of the 17th International Meshing Roundtable, IMR 2008

Hetmaniuk, U.; Knupp, Patrick K.

We present a mesh optimization algorithm for adaptively improving the finite element interpolation of a function of interest. The algorithm minimizes an objective function by swapping edges and moving nodes. Numerical experiments are performed on model problems. The results illustrate that the mesh optimization algorithm can reduce the W^{1,∞} semi-norm of the interpolation error. For these examples, the L^2, L^∞, and H^1 norms also decreased.
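
The edge-swapping part of such an algorithm is hard to show compactly, but the node-movement part can be illustrated in one dimension. The sketch below moves interior nodes by coordinate search to shrink the maximum piecewise-linear interpolation error of a known function; the function, mesh, and search strategy are assumptions for illustration and are not the algorithm in the paper.

```python
import numpy as np

# 1D toy version of the node-movement idea: reposition interior nodes by
# coordinate search so the maximum piecewise-linear interpolation error of a
# known function f shrinks. The function, mesh, and search strategy are
# illustrative assumptions, not the paper's algorithm (which also swaps edges).

f = np.exp

def max_interp_error(x):
    """L-infinity interpolation error of the piecewise-linear interpolant of f
    on nodes x, estimated by dense sampling inside each element."""
    err = 0.0
    for a, b in zip(x[:-1], x[1:]):
        s = np.linspace(a, b, 20)
        lin = f(a) + (f(b) - f(a)) * (s - a) / (b - a)
        err = max(err, float(np.max(np.abs(f(s) - lin))))
    return err

x = np.linspace(0.0, 3.0, 12)           # initial uniform mesh; endpoints fixed
print("initial max error:", max_interp_error(x))
for sweep in range(50):
    for i in range(1, len(x) - 1):      # move each interior node in turn
        best, best_xi = max_interp_error(x), x[i]
        for trial in np.linspace(x[i - 1], x[i + 1], 21)[1:-1]:
            x[i] = trial
            if max_interp_error(x) < best:
                best, best_xi = max_interp_error(x), trial
        x[i] = best_xi
print("optimized max error:", max_interp_error(x))
```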

A unified architecture for cognition and motor control based on neuroanatomy, psychophysical experiments, and cognitive behaviors

AAAI Fall Symposium - Technical Report

Rohrer, Brandon R.

A Brain-Emulating Cognition and Control Architecture (BECCA) is presented. It is consistent with the hypothesized functions of pervasive intra-cortical and cortico-subcortical neural circuits. It is able to reproduce many salient aspects of human voluntary movement and motor learning. It also provides plausible mechanisms for many phenomena described in cognitive psychology, including perception and mental modeling. Both "inputs" (afferent channels) and "outputs" (efferent channels) are treated as neural signals; they are all binary (either on or off), and there is no meaning, information, or tag associated with any of them. Although BECCA initially has no internal models, it learns complex interrelations between outputs and inputs, through which it bootstraps a model of the system it is controlling and of the outside world. BECCA uses two key algorithms to accomplish this: S-Learning and Context-Based Similarity (CBS).
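
The abstract does not give enough detail to reproduce S-Learning or CBS, but the bootstrapping idea it describes, learning a forward model purely from anonymous binary input and output channels, can be illustrated generically. The sketch below records which input pattern tends to follow each (input, output) pair in a toy environment; it is not the paper's S-Learning or Context-Based Similarity algorithm, only a hedged illustration of model learning without labeled channels.

```python
import random
from collections import defaultdict

# Generic sketch of bootstrapping a forward model from anonymous binary
# channels: record which input pattern tends to follow each (input, output)
# pair in a toy environment. This is NOT S-Learning or Context-Based
# Similarity; it only illustrates model learning without labeled channels.

class BinaryWorld:
    """Toy plant: four binary inputs whose evolution depends on two outputs."""
    def __init__(self):
        self.state = (0, 1, 0, 1)
    def step(self, action):
        a, b = action
        s = list(self.state)
        if a:
            s[0], s[1] = s[1], s[0]       # output a swaps the first two inputs
        if b:
            s[2] = 1 - s[2]               # output b toggles the third input
        self.state = tuple(s)
        return self.state

world = BinaryWorld()
counts = defaultdict(lambda: defaultdict(int))    # (inputs, outputs) -> next inputs -> count

obs = world.state
for _ in range(2000):                             # explore with random outputs
    action = (random.randint(0, 1), random.randint(0, 1))
    nxt = world.step(action)
    counts[(obs, action)][nxt] += 1
    obs = nxt

def predict(inputs, outputs):
    """Most frequently observed next input pattern for this context, if any."""
    seen = counts.get((inputs, outputs))
    return max(seen, key=seen.get) if seen else None

print(predict((0, 1, 0, 1), (1, 1)))              # learned: (1, 0, 1, 1)
```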

Individual and group electronic brainstorming in an industrial setting

Proceedings of the Human Factors and Ergonomics Society

Dornburg, Courtney C.; Hendrickson, Stacey M.; Davidson, George S.

An experiment was conducted comparing the effectiveness of individual versus group electronic brainstorming in addressing real-world "wickedly difficult" challenges. Previous laboratory research has engaged small groups of students in answering questions irrelevant to an industrial setting. The current experiment extended this research to larger, real-world employee groups engaged in addressing organization-relevant challenges. Within the present experiment, the data demonstrated that individuals performed at least as well as groups in terms of the number of ideas produced and significantly (p<.02) outperformed groups in terms of the quality of those ideas (as measured along the dimensions of originality, feasibility, and effectiveness).
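
For context on the kind of two-sample comparison behind the reported p<.02, the sketch below runs a generic test on placeholder quality ratings. The numbers and the specific test are illustrative assumptions only; they are not the study's data or its analysis.

```python
from scipy import stats

# Generic two-sample comparison of the sort behind the reported p < .02.
# The ratings below are made-up placeholders, and the study's actual
# statistical test is not specified here; this is illustration only.
individual_quality = [6.1, 5.8, 6.4, 5.9, 6.2, 6.0, 6.3, 5.7]   # hypothetical scores
group_quality      = [5.2, 5.5, 5.0, 5.4, 5.1, 5.3, 4.9, 5.6]   # hypothetical scores

t_stat, p_value = stats.ttest_ind(individual_quality, group_quality, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```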

Understanding virulence mechanisms in M. tuberculosis infection via a circuit-based simulation framework

Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS'08 - "Personalized Healthcare through Technology"

May, Elebeoba; Leitao, Andrei; Faulon, Jean-Loup M.; Joo, Jaewook J.; Misra, Milind; Oprea, Tudor I.

Tuberculosis (TB), caused by the bacterium Mycobacterium tuberculosis (Mtb), is a growing international health crisis. Mtb is able to persist in host tissues in a nonreplicating persistent (NRP) or latent state, which presents a challenge in the treatment of TB. Latent TB can reactivate in 10% of individuals with normal immune systems, and at a higher rate in those with compromised immune systems. A quantitative understanding of latency-associated virulence mechanisms may help researchers develop more effective methods to combat the spread of TB and reduce TB-associated fatalities. Leveraging BioXyce's ability to simulate whole-cell and multi-cellular systems, we are developing a circuit-based framework to investigate the impact of pathogenicity-associated pathways on the latency/reactivation phase of tuberculosis infection. We discuss efforts to simulate metabolic pathways that potentially impact the ability of Mtb to persist within host immune cells. We demonstrate how simulation studies can provide insight regarding the efficacy of potential anti-TB agents on biological networks critical to Mtb pathogenicity, using a systems chemical biology approach. © 2008 IEEE.
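
BioXyce maps biological networks onto a circuit simulator, which is beyond a short example, but the kind of question posed above, namely how inhibiting one step of a pathway changes its output, can be illustrated with a small ODE model. Everything in the sketch below, including the rate laws and constants, is a made-up placeholder rather than a BioXyce simulation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy ODE model of a two-enzyme pathway with an inhibitor acting on the second
# enzyme. Illustrates the "does inhibiting this step choke the pathway"
# question only; it is not a BioXyce circuit model, and all rate laws and
# constants are made-up placeholders.

def pathway(t, y, inhibition):
    s, inter, prod = y                               # substrate, intermediate, product
    v1 = 1.0 * s / (0.5 + s)                         # enzyme 1 (Michaelis-Menten)
    v2 = (1.0 - inhibition) * 0.8 * inter / (0.3 + inter)   # enzyme 2, inhibited
    return [0.2 - v1,                                # constant substrate supply
            v1 - v2,
            v2 - 0.1 * prod]                         # product consumption/dilution

for inhibition in (0.0, 0.5, 0.9):
    sol = solve_ivp(pathway, (0.0, 200.0), [1.0, 0.0, 0.0],
                    args=(inhibition,), rtol=1e-8)
    print(f"inhibition={inhibition:.1f}  product at t=200: {sol.y[2, -1]:.2f}")
```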

Model calibration under uncertainty: Matching distribution information

12th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, MAO

Swiler, Laura P.; Adams, Brian M.; Eldred, Michael

We develop an approach for estimating model parameters that result in the "best distribution fit" between experimental and simulation data. Best distribution fit means matching the moments of the experimental data to those of a simulation (and possibly matching the full probability distribution). This approach extends typical nonlinear least squares methods, which identify parameters maximizing agreement between experimental points and computational simulation results. Several analytic formulations of the distribution-matching problem are provided, along with results for solving test problems and comparisons of this parameter estimation technique with a deterministic least squares approach. Copyright © 2008 by the American Institute of Aeronautics and Astronautics, Inc.
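
As a minimal illustration of the distribution-matching idea, calibrating parameters so that simulation moments match experimental moments rather than matching individual data points, the sketch below fits a toy stochastic model's mean and variance to synthetic "experimental" data. The model, data, and optimizer choice are assumptions for illustration, not the formulations in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Calibrate a toy stochastic model so its output mean and variance match those
# of "experimental" data, instead of matching individual points as ordinary
# least squares would. Model, data, and optimizer are placeholder assumptions,
# not the paper's formulations.

rng = np.random.default_rng(0)
experiment = rng.normal(loc=3.0, scale=0.7, size=200)     # stand-in measurements
target = np.array([experiment.mean(), experiment.var()])  # moments to match

noise = rng.standard_normal(500)    # common random numbers keep the objective smooth

def simulate(theta):
    """Toy stochastic simulation: output depends on parameters (mu, sigma)."""
    mu, sigma = theta
    return mu + sigma * noise

def moment_mismatch(theta):
    out = simulate(theta)
    return np.sum((np.array([out.mean(), out.var()]) - target) ** 2)

result = minimize(moment_mismatch, x0=[1.0, 1.0], method="Nelder-Mead")
print("calibrated (mu, sigma):", result.x)    # lands near (3.0, 0.7)
```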
