Publications


Generalized hypergraph matching via iterated packing and local ratio

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Parekh, Ojas D.; Pritchard, David

In k-hypergraph matching, we are given a collection of sets of size at most k, each with an associated weight, and we seek a maximum-weight subcollection whose sets are pairwise disjoint. More generally, in k-hypergraph b-matching, instead of disjointness we require that every element appears in at most b sets of the subcollection. Our main result is a linear-programming-based (k - 1 + 1/k)-approximation algorithm for k-hypergraph b-matching. This settles the integrality gap when k is one more than a prime power, since it matches a previously known lower bound. When the hypergraph is bipartite, we are able to improve the approximation ratio to k - 1, which is also best possible relative to the natural LP. These results are obtained using a more careful application of the iterated packing method. Using the bipartite algorithmic integrality gap upper bound, we show that for the family of combinatorial auctions in which anyone can win at most t items, there is a truthful-in-expectation polynomial-time auction that t-approximately maximizes social welfare. We also show that our results directly imply new approximations for a generalization of the recently introduced bounded-color matching problem. We also consider the generalization of b-matching to demand matching, where edges have nonuniform demand values. The best known approximation algorithm for this problem has ratio 2k on k-hypergraphs. We give a new algorithm, based on local ratio, that obtains the same approximation ratio in a much simpler way.
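For context, the natural LP relaxation referred to above can be written in a standard form (a sketch; the exact formulation used in the paper may differ in details):

    \max \sum_{S \in \mathcal{S}} w_S\, x_S
    \quad \text{s.t.} \quad \sum_{S \ni v} x_S \le b \ \ \forall v \in V, \qquad 0 \le x_S \le 1 \ \ \forall S \in \mathcal{S},

where \mathcal{S} is the given collection of sets (each of size at most k), V is the ground set, and k-hypergraph matching is the special case b = 1.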


Geometric hitting set for segments of few orientations

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Fekete, Sándor P.; Huang, Kan; Mitchell, Joseph S.B.; Parekh, Ojas D.; Phillips, Cynthia A.

We study several natural instances of the geometric hitting set problem for input consisting of sets of line segments (and rays, lines) having a small number of distinct slopes. These problems model path monitoring (e.g., on road networks) using the fewest sensors (the “hitting points”). We give approximation algorithms for cases including (i) lines of 3 slopes in the plane, (ii) vertical lines and horizontal segments, (iii) pairs of horizontal/vertical segments. We give hardness and hardness of approximation results for these problems. We prove that the hitting set problem for vertical lines and horizontal rays is polynomially solvable.
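As a point of reference, geometric hitting set has a standard integer-programming formulation (a sketch in generic notation, not the paper's): given candidate hitting points P and input segments \mathcal{S},

    \min \sum_{p \in P} x_p \quad \text{s.t.} \quad \sum_{p \in s} x_p \ge 1 \ \ \forall s \in \mathcal{S}, \qquad x_p \in \{0, 1\},

where x_p = 1 means point p is chosen; the algorithms above exploit the restricted slope structure of the input rather than solving this ILP directly.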


A computational framework for ontologically storing and analyzing very large overhead image sets

Proceedings of the 3rd ACM SIGSPATIAL International Workshop on Analytics for Big Geospatial Data, BigSpatial 2014

Brost, Randolph B.; Rintoul, Mark D.; McLendon, William C.; Strip, David R.; Parekh, Ojas D.; Woodbridge, Diane W.

We describe a computational approach to remote sensing image analysis that addresses many of the classic problems associated with storage, search, and query. This process starts by automatically annotating the fundamental objects in the image data set that will be used as a basis for an ontology, including both the objects (such as building, road, water, etc.) and their spatial and temporal relationships (is within 100 m of, is surrounded by, has changed in the past year, etc.). Data sets that can include multiple time slices of the same area are then processed using automated tools that reduce the images to the objects and relationships defined in an ontology based on the primitive objects, and this representation is stored in a geospatial-temporal semantic graph. Image searches are then defined in terms of the ontology (e.g., find a building greater than 10³ m² that borders a body of water), and the graph is searched for such relationships. This approach also enables the incorporation of non-image data that is related to the ontology. We demonstrate through an initial implementation of the entire system on large data sets (10⁹–10¹¹ pixels) that this system is robust against variations in different image collection parameters, provides a way for analysts to query data sets in a more natural way, and can greatly reduce the memory footprint of the search.
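To make the kind of query described above concrete, the following minimal sketch (using the networkx library; the node names and attribute schema are illustrative assumptions, not the system's actual representation) builds a toy geospatial semantic graph and finds buildings larger than 10³ m² that border water:

    import networkx as nx

    # Toy geospatial semantic graph: nodes are annotated objects,
    # edges are spatial relationships (here just "borders").
    G = nx.Graph()
    G.add_node("building_1", kind="building", area_m2=1500.0)
    G.add_node("building_2", kind="building", area_m2=400.0)
    G.add_node("lake_1", kind="water")
    G.add_edge("building_1", "lake_1", relation="borders")

    # Query: buildings with area greater than 10^3 m^2 that border a body of water.
    matches = [
        n for n, attrs in G.nodes(data=True)
        if attrs.get("kind") == "building"
        and attrs.get("area_m2", 0.0) > 1e3
        and any(G.nodes[m].get("kind") == "water"
                and G.edges[n, m].get("relation") == "borders"
                for m in G.neighbors(n))
    ]
    print(matches)  # ['building_1']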


Encoding and analyzing aerial imagery using geospatial semantic graphs

Rintoul, Mark D.; Watson, Jean-Paul W.; McLendon, William C.; Parekh, Ojas D.

While collection capabilities have yielded an ever-increasing volume of aerial imagery, analytic techniques for identifying patterns in and extracting relevant information from this data have seriously lagged. The vast majority of imagery is never examined, due to a combination of the limited bandwidth of human analysts and the limitations of existing analysis tools. In this report, we describe an alternative, novel approach to both encoding and analyzing aerial imagery, using the concept of a geospatial semantic graph. The advantages of our approach are twofold. First, intuitive templates can be easily specified in terms of the domain language in which an analyst converses. These templates can be used to automatically and efficiently search large graph databases for specific patterns of interest. Second, unsupervised machine learning techniques can be applied to automatically identify patterns in the graph databases, exposing recurring motifs in imagery. We illustrate our approach using real-world data for Anne Arundel County, Maryland, and compare the performance of our approach to that of an expert human analyst.
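A template search of the kind described above can be viewed as attribute-labeled subgraph matching. The following minimal sketch (again using networkx; the template structure and attribute names are illustrative assumptions, not the report's actual domain language) looks for every occurrence of a simple analyst-style template in a semantic graph:

    import networkx as nx
    from networkx.algorithms import isomorphism

    # Template: a building adjacent to a parking lot that is adjacent to a road.
    template = nx.Graph()
    template.add_node("b", kind="building")
    template.add_node("p", kind="parking_lot")
    template.add_node("r", kind="road")
    template.add_edges_from([("b", "p"), ("p", "r")])

    # Semantic graph extracted from imagery (toy example).
    G = nx.Graph()
    G.add_node(1, kind="building")
    G.add_node(2, kind="parking_lot")
    G.add_node(3, kind="road")
    G.add_node(4, kind="water")
    G.add_edges_from([(1, 2), (2, 3), (3, 4)])

    # Find all subgraphs of G matching the template's structure and labels.
    matcher = isomorphism.GraphMatcher(
        G, template, node_match=isomorphism.categorical_node_match("kind", None)
    )
    for mapping in matcher.subgraph_isomorphisms_iter():
        print(mapping)  # {1: 'b', 2: 'p', 3: 'r'}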


Evaluating Near-Term Adiabatic Quantum Computing

Parekh, Ojas D.; Aidun, John B.; Dubicka, Irene D.; Landahl, Andrew J.; Shulenburger, Luke N.; Tigges, Chris P.; Wendt, Jeremy D.

This report summarizes the first year’s effort on the Enceladus project, under which Sandia was asked to evaluate the potential advantages of adiabatic quantum computing for analyzing large data sets in the near future, 5 to 10 years from now. We were not specifically evaluating the machine being sold by D-Wave Systems, Inc.; we were asked to anticipate what future adiabatic quantum computers might be able to achieve. While the greatest potential anticipated from quantum computation is still far into the future, a special-purpose quantum computing capability, Adiabatic Quantum Optimization (AQO), is under active development and is maturing relatively rapidly; indeed, D-Wave Systems Inc. already offers an AQO device based on superconducting flux qubits. The AQO architecture solves a particular class of problem, namely unconstrained quadratic Boolean optimization. Problems in this class include many interesting and important instances. Because of this, further investigation is warranted into the range of applicability of this class of problem for addressing challenges of analyzing big data sets and the effectiveness of AQO devices in performing specific analyses on big data. Further, it is of interest to also consider the potential effectiveness of anticipated special-purpose adiabatic quantum computers (AQCs), in general, for accelerating the analysis of big data sets. The objective of the present investigation is an evaluation of the potential of AQC to benefit analysis of big data problems in the next five to ten years, with our main focus being on AQO because of its relative maturity. We are not specifically assessing the efficacy of the D-Wave computing systems, though we do hope to perform some experimental calculations on that device in the sequel to this project, at least to provide some data to compare with our theoretical estimates.
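For reference, unconstrained quadratic Boolean optimization (commonly called QUBO) is the problem of optimizing a quadratic form over Boolean variables; a standard statement (a sketch, not a claim about how this report formalizes it) is

    \min_{x \in \{0,1\}^n} \; x^{\mathsf{T}} Q x \;=\; \sum_{i} Q_{ii}\, x_i \;+\; \sum_{i \ne j} Q_{ij}\, x_i x_j,

where Q is a real n × n matrix encoding the problem instance (and x_i² = x_i since the variables are Boolean); many combinatorial problems, such as max-cut, can be cast in this form.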


Optimization of large-scale heterogeneous system-of-systems models

Gray, Genetha A.; Hart, William E.; Hough, Patricia D.; Parekh, Ojas D.; Phillips, Cynthia A.; Siirola, John D.; Swiler, Laura P.; Watson, Jean-Paul W.

Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.


Iterative packing for demand matching and sparse packing

Parekh, Ojas D.

The main result we will present is a 2k-approximation algorithm for the following 'k-hypergraph demand matching' problem: given a set system with sets of size at most k, where sets have profits and demands and vertices have capacities, find a max-profit subsystem whose demands do not exceed the capacities. The main tool is an iterative way to explicitly build a decomposition of the fractional optimum as 2k times a convex combination of integral solutions. If time permits, we'll also show how the approach can be extended to a 3-approximation for 2-column sparse packing. The second result is tight with respect to the integrality gap, and the first is near-tight, as a gap lower bound of 2(k - 1 + 1/k) is known.
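As a sketch of the underlying LP (standard notation assumed here, not taken from the talk): with profit p_S and demand d_S for each set S and capacity c_v for each vertex v, the relaxation is

    \max \sum_{S} p_S\, x_S \quad \text{s.t.} \quad \sum_{S \ni v} d_S\, x_S \le c_v \ \ \forall v, \qquad 0 \le x_S \le 1 \ \ \forall S,

and the iterative packing argument produces integral feasible solutions z_1, \dots, z_t and weights \lambda_i \ge 0 with \sum_i \lambda_i = 1 such that (1/2k)\, x^* \le \sum_i \lambda_i z_i componentwise for an optimal fractional x^*; the most profitable z_i then earns at least \mathrm{OPT}_{LP}/2k, which also bounds the integrality gap.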
