We propose the average spectrum norm to study the minimum number of measurements required to approximate a multidimensional array (i.e., the sample complexity) via low-rank tensor recovery. Our focus is on the tensor completion problem, where the aim is to estimate a multiway array from a subset of tensor entries corrupted by noise. Our average spectrum norm-based analysis yields near-optimal sample complexities, with dependence on the ambient dimensions and rank that does not suffer from exponential scaling as the tensor order increases.
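The average spectrum norm analysis itself is beyond a short snippet, but the tensor completion setup can be illustrated with a minimal NumPy sketch (not the estimator analyzed above): a low-rank CP tensor is observed on a random subset of noisy entries and the factors are recovered by gradient descent on the masked squared error. The dimensions, rank, step size, and iteration count below are arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 10, 2  # ambient dimension and CP rank (illustrative values)

# ground-truth order-3 tensor with rank-r CP structure
A, B, C = (rng.standard_normal((n, r)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# observe roughly half the entries, corrupted by additive Gaussian noise
mask = rng.random(T.shape) < 0.5
Y = mask * (T + 0.01 * rng.standard_normal(T.shape))

# recover CP factors by gradient descent on the masked squared error
U, V, W = (0.1 * rng.standard_normal((n, r)) for _ in range(3))
lr = 2e-3
for _ in range(20000):
    E = mask * (np.einsum('ir,jr,kr->ijk', U, V, W) - Y)
    gU = np.einsum('ijk,jr,kr->ir', E, V, W)
    gV = np.einsum('ijk,ir,kr->jr', E, U, W)
    gW = np.einsum('ijk,ir,jr->kr', E, U, V)
    U, V, W = U - lr * gU, V - lr * gV, W - lr * gW

# relative error of the reconstruction against the full ground truth
rel_err = (np.linalg.norm(np.einsum('ir,jr,kr->ijk', U, V, W) - T)
           / np.linalg.norm(T))
```

The sketch only demonstrates that a fraction of noisy entries can suffice when the tensor is low rank; the sample-complexity guarantees are the subject of the analysis above.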
Large-scale non-intrusive inspection (NII) of commercial vehicles is being adopted in the U.S. at a pace and scale that will produce a commensurate growth in adjudication burdens at land ports of entry. The use of computer vision and machine learning models to augment human operator capabilities is critical in this sector to ensure the flow of commerce and to maintain efficient and reliable security operations. Developing models at this scale and speed requires novel approaches to object detection and novel adjudication pipelines. Here we propose a notional combination of existing object detection tools within a novel ensembling framework to demonstrate the potential for hierarchical and recursive operations. Further, we explore combining object detection with image similarity as an adjacent capability that provides post-hoc oversight of the detection framework. The experiments described herein, while notional and intended for illustrative purposes, demonstrate that the judicious combination of diverse algorithms can yield a resilient workflow for the NII environment.
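As a toy illustration of combining detector outputs (not the notional framework itself), the sketch below pools candidate boxes from two hypothetical detectors and merges them with greedy IoU-based non-maximum suppression; the box coordinates, scores, and IoU threshold are all made-up values.

```python
def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def ensemble_nms(detections, thresh=0.5):
    """Pool (box, score) pairs from several detectors, then greedily keep
    the highest-scoring box and drop pooled boxes that overlap it."""
    pooled = sorted((d for dets in detections for d in dets),
                    key=lambda d: d[1], reverse=True)
    kept = []
    for box, score in pooled:
        if all(iou(box, k[0]) < thresh for k in kept):
            kept.append((box, score))
    return kept

# two hypothetical detectors reporting overlapping boxes for one object,
# plus a second object seen by only one detector
det_a = [([10, 10, 50, 50], 0.90)]
det_b = [([12, 11, 52, 49], 0.80), ([200, 200, 240, 240], 0.60)]
merged = ensemble_nms([det_a, det_b])  # two boxes survive
```

Greedy NMS is the simplest merge rule; a hierarchical or recursive framework could apply such a merge repeatedly across detector families and image regions.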
The growing X-ray inspection burden for vehicles at Ports of Entry in the U.S. requires the development of efficient and reliable algorithms to assist human operators in detecting contraband. Developing algorithms for large-scale non-intrusive inspection (NII) that both meet operational performance requirements and remain extensible in an evolving environment requires large volumes and varieties of training data, yet collecting and labeling data for these environments is prohibitively costly and time consuming. Given these constraints, generating synthetic data to augment algorithm training has been a focus of recent research. Here we discuss the use of synthetic imagery in an object detection framework and describe a simulation-based approach to determining domain-informed threat image projection (TIP) augmentation.
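TIP inserts a threat signature into a benign scan. For transmission-style X-ray imagery, a common simplifying assumption (Beer-Lambert attenuation) is that overlapping objects multiply transmission values; the sketch below shows only that simplified compositing, with made-up array sizes and values, and does not reproduce the simulation-driven, domain-informed projection discussed above.

```python
import numpy as np

rng = np.random.default_rng(0)

# benign scan: transmission values in (0, 1], where 1.0 means empty air
scan = rng.uniform(0.7, 1.0, size=(64, 64))

# hypothetical threat patch with its own transmission profile
threat = np.full((16, 16), 0.4)

# Beer-Lambert compositing: transmissions multiply where objects overlap
r, c = 24, 30  # insertion location (arbitrary)
tip = scan.copy()
tip[r:r + 16, c:c + 16] *= threat

# the composited region is darker (lower transmission) than the original;
# pixels outside the insertion window are untouched
```

A domain-informed approach would additionally choose the insertion location, scale, and attenuation profile to be physically plausible for the cargo being imaged.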
In this position paper, we discuss exciting recent advancements in sketching algorithms applied to distributed systems. That is, we examine randomized algorithms that reduce data dimensionality and offer potential privacy benefits while maintaining verifiably high levels of accuracy and performance in multi-node computational setups. We then outline next steps and discuss applicability to real systems.
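A minimal example of the kind of sketch referred to above, chosen here for simplicity rather than drawn from the paper: a Gaussian Johnson-Lindenstrauss-style projection compresses a high-dimensional vector while approximately preserving its norm, and its linearity lets per-node sketches be summed into a sketch of the global data, which is what makes this family attractive in multi-node setups. The dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
d, k = 10_000, 256  # ambient and sketch dimensions (illustrative)

# shared random sketching matrix with i.i.d. N(0, 1/k) entries
S = rng.standard_normal((k, d)) / np.sqrt(k)

# data held by two different nodes
x1 = rng.standard_normal(d)
x2 = rng.standard_normal(d)

# each node sketches locally; by linearity, the sum of the sketches
# equals the sketch of the combined data
combined = S @ x1 + S @ x2

# the sketch approximately preserves Euclidean norms (small distortion)
ratio = np.linalg.norm(S @ x1) / np.linalg.norm(x1)
```

Only the k-dimensional sketches need to cross the network, which is the source of both the communication savings and the potential privacy benefit.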
Graphs are a widely used abstraction for representing a variety of important real-world problems, including emulating cyber networks for situational awareness and studying social networks to understand human interactions or pandemic spread. Communication data is often converted into graphs to help reveal social and technical patterns in the underlying data. However, prior to this project, little work had analyzed how best to construct graphs from such data, so many critical national security analyses were being performed against graph representations of questionable quality. Herein, we describe the analyses that were precursors to our final, statistically grounded technique for creating static graph snapshots from a stream of communication events. The first analyzes the statistical distributions of a variety of real-world communication datasets, which are generally best fit by Pareto, log-normal, and extreme value distributions. The second derives graph properties that can be estimated from the expected statistical distribution of communication events and the communication interval to be viewed: node observability, edge observability, and expected accuracy of node degree. Unfortunately, because that final technique is under review for publication, we cannot publish it here at this time.
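As a small illustration of the first analysis (distribution fitting; the parameter values and sample size below are made up), the closed-form maximum likelihood estimator for the Pareto shape parameter, alpha_hat = n / sum(log(x_i / x_min)) with x_min assumed known, recovers the shape of synthetic heavy-tailed inter-event data:

```python
import math
import random

random.seed(0)
alpha, x_min, n = 2.5, 1.0, 50_000  # illustrative ground truth

# synthetic heavy-tailed inter-event samples via inverse-CDF sampling;
# 1 - random() lies in (0, 1], avoiding a zero base
samples = [x_min * (1.0 - random.random()) ** (-1.0 / alpha)
           for _ in range(n)]

# closed-form MLE for the Pareto shape parameter
alpha_hat = n / sum(math.log(x / x_min) for x in samples)
```

Fitting log-normal or extreme value distributions has no such one-line estimator, but numerical fits (e.g., via scipy.stats) follow the same pattern: estimate parameters from observed inter-event times, then use the fitted distribution to reason about the snapshot interval.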