Publications

Quantifying Uncertainty in Emulations (LDRD Report)

Crussell, Jonathan C.; Brown, Aaron B.; Jennings, Jeremy K.; Kavaler, David; Kroeger, Thomas M.; Phillips, Cynthia A.

This report summarizes the work performed under the project “Quantifying Uncertainty in Emulations.” Emulation can be used to model real-world systems, typically using virtualization to run the real software on virtualized hardware. Emulations are increasingly used to answer mission-oriented questions, but how well they represent real-world systems is still an open area of research. The goal of the project was to quantify where and how emulations differ from the real world. To do so, we ran a representative workload on both the real and emulated systems, and collected and compared metrics to identify differences. We aimed to capture behavioral, rather than performance, differences, as the latter are better understood in the literature.
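
The abstract does not specify how the collected metrics were compared; as a rough, hedged sketch of what “collected and compared metrics to identify differences” could look like in practice, the snippet below flags a metric whose samples differ between a physical run and an emulated run using a two-sample Kolmogorov-Smirnov test from scipy. The function name, the example metric, and the significance threshold are illustrative assumptions, not the project's actual analysis pipeline.

    # Hypothetical sketch: decide whether one metric's distribution differs
    # between physical and emulated runs, via a two-sample KS test.
    # Not the project's actual method; names and threshold are assumed.
    from scipy.stats import ks_2samp

    def metric_differs(physical_samples, emulated_samples, alpha=0.01):
        """Return True if the two sample sets are unlikely to come from
        the same distribution at significance level alpha."""
        result = ks_2samp(physical_samples, emulated_samples)
        return result.pvalue < alpha

    # Example usage with made-up per-run retransmission counts:
    # metric_differs([3, 5, 4, 6, 5], [12, 15, 11, 14, 13])  -> likely True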

Lessons learned from 10k experiments to compare virtual and physical testbeds

12th USENIX Workshop on Cyber Security Experimentation and Test, CSET 2019, co-located with USENIX Security 2019

Crussell, Jonathan C.; Kroeger, Thomas M.; Kavaler, David; Brown, Aaron B.; Phillips, Cynthia A.

Virtual testbeds are a core component of cyber experimentation as they allow for fast and relatively inexpensive modeling of computer systems. Unlike simulations, virtual testbeds run real software on virtual hardware, which allows them to capture unknown or complex behaviors. However, virtualization is known to increase latency and decrease throughput. Could these and other artifacts from virtualization undermine the experiments that we wish to run? For the past three years, we have attempted to quantify where and how virtual testbeds differ from their physical counterparts to address this concern. While performance differences have been widely studied, we aim to uncover behavioral differences. We have run over 10,000 experiments and processed over half a petabyte of data. Complete details of our methodology and the experimental results from applying it are published in previous work. In this paper, we describe our lessons learned in the process of constructing and instrumenting both physical and virtual testbeds and analyzing the results from each.

Exploiting Time and Subject Locality for Fast, Efficient, and Understandable Alert Triage

2018 International Conference on Computing, Networking and Communications, ICNC 2018

Kavaler, David; Hudson, Corey H.; Bierma, Michael B.

In many organizations, intrusion detection and other related systems are tuned to generate security alerts, which are then manually inspected by cyber-security analysts. These analysts often devote a large portion of their time to inspecting these alerts, most of which are innocuous. Thus, it would be greatly beneficial to reduce the number of innocuous alerts, allowing analysts to apply their time and skills to other aspects of cyber defense. In this work, we devise several simple, fast, and easily understood models to cut back this manual inspection workload while maintaining high true positive and true negative rates. We demonstrate their effectiveness on real data and discuss their potential utility when applied by others.
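
The abstract does not describe the models themselves; purely as a hedged illustration of how “time and subject locality” might be exploited for triage, the sketch below suppresses alerts that repeat for the same subject and rule within a short time window, so an analyst only sees the first alert of each burst. The field names, the (subject, rule) key, and the window length are hypothetical assumptions, not the authors' models.

    # Hypothetical sketch: keep only the first alert per (subject, rule)
    # burst, suppressing repeats within a fixed time window. This is NOT
    # the paper's method, only an illustration of time/subject locality.
    WINDOW_SECONDS = 3600  # assumed suppression window; tune per deployment

    def triage(alerts):
        """alerts: iterable of dicts with 'timestamp' (epoch seconds),
        'subject' (e.g., host or user), and 'rule' fields, sorted by time.
        Returns the subset an analyst would still inspect."""
        last_seen = {}   # (subject, rule) -> timestamp of most recent alert
        to_inspect = []
        for alert in alerts:
            key = (alert["subject"], alert["rule"])
            prev = last_seen.get(key)
            if prev is None or alert["timestamp"] - prev > WINDOW_SECONDS:
                to_inspect.append(alert)          # first alert of a new burst
            last_seen[key] = alert["timestamp"]   # extend the burst either way
        return to_inspect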
