Network segmentation of a power grid's communication system can make the grid more resilient to cyberattacks. We develop a novel trilevel programming model to optimally segment a grid communication system, taking into account the actions of an information technology (IT) administrator, attacker, and grid operator. The IT administrator is allowed to segment existing networks, and the attacker is given a budget to inflict damage on the grid by attacking the segmented communication system. Finally, the grid operator can redispatch the grid after the attack to minimize damage. The resulting problem is a trilevel interdiction problem that we solve using a branch and bound algorithm for bilevel problems. We demonstrate the benefits of optimal network segmentation through case studies on the 9-bus Western System Coordinating Council (WSCC) system and the 30-bus IEEE system. These examples illustrate that network segmentation can significantly reduce the threat posed by a cyberattacker.
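The defender-attacker-operator structure described above can be illustrated with a toy enumeration. This is a minimal sketch, not the paper's trilevel model or its branch-and-bound algorithm: all link loads, budgets, and the redispatch abstraction (recovering a fixed fraction of shed load) are invented for illustration.

```python
from itertools import combinations

# Hypothetical toy instance: 4 communication links, each serving one load.
# The IT administrator segments links (making them immune to attack); the
# attacker disables unsegmented links within a budget; the grid operator's
# redispatch is abstracted as recovering a fixed fraction of the shed load.
LOADS = [40, 30, 20, 10]       # MW served via each communication link (invented)
SEGMENT_BUDGET = 2             # links the IT administrator may segment
ATTACK_BUDGET = 2              # links the attacker may disable
REDISPATCH_RECOVERY = 0.25     # fraction of shed load the operator recovers

def shed_after_redispatch(attacked):
    """Innermost level: load lost after the operator's (abstracted) redispatch."""
    return (1 - REDISPATCH_RECOVERY) * sum(LOADS[i] for i in attacked)

def worst_attack(segmented):
    """Middle level (max): attacker hits the worst unsegmented links within budget."""
    targets = [i for i in range(len(LOADS)) if i not in segmented]
    return max((shed_after_redispatch(a)
                for k in range(ATTACK_BUDGET + 1)
                for a in combinations(targets, k)), default=0.0)

def best_segmentation():
    """Outer level (min): enumerate segmentations (tractable only at toy size)."""
    return min(((worst_attack(set(s)), set(s))
                for s in combinations(range(len(LOADS)), SEGMENT_BUDGET)),
               key=lambda t: t[0])

loss, seg = best_segmentation()
```

At realistic scale this enumeration is intractable, which is why the paper resorts to a branch-and-bound algorithm for bilevel problems rather than exhaustive search.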
Widespread integration of social media into daily life has fundamentally changed the way society communicates, and, as a result, how individuals develop attitudes, personal philosophies, and worldviews. The excess spread of disinformation and misinformation due to this increased connectedness and streamlined communication has been extensively studied, simulated, and modeled. Less studied is the interaction of many pieces of misinformation and the resulting formation of attitudes. We develop a framework for the simulation of attitude formation based on exposure to multiple cognitions. We allow a set of cognitions with some implicit relational topology to spread on a social network, which is defined with separate layers to specify online and offline relationships. An individual’s opinion on each cognition is determined by a process inspired by the Ising model for ferromagnetism. We conduct experiments using this framework to test the effect of topology, connectedness, and social media adoption on the ultimate prevalence of and exposure to certain attitudes.
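An Ising-inspired opinion update of the kind described above can be sketched as follows. This is a minimal illustration under assumed dynamics (Glauber heat-bath updates), not the paper's exact model; the couplings `J` and `K` and the temperature `T` are invented parameters.

```python
import math
import random

random.seed(0)  # reproducibility for the illustration

def glauber_update(neighbor_spins, related_spin, J=1.0, K=0.5, T=1.0):
    """Return one individual's new opinion (+1 or -1) on one cognition.

    The local 'field' combines neighbor opinions (social influence, weight J)
    with the individual's opinion on a related cognition (topical coupling,
    weight K); adoption probability follows the Glauber rule from the Ising
    model, with temperature T controlling randomness.
    """
    field = J * sum(neighbor_spins) + K * related_spin
    p_up = 1.0 / (1.0 + math.exp(-2.0 * field / T))  # P(new opinion = +1)
    return 1 if random.random() < p_up else -1
```

At high temperature opinions flip nearly at random; at low temperature individuals almost deterministically align with their neighbors and related cognitions.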
This report summarizes the activities performed as part of the Science and Engineering of Cybersecurity by Uncertainty quantification and Rigorous Experimentation (SECURE) Grand Challenge LDRD project. We provide an overview of the research done in this project, including work on cyber emulation, uncertainty quantification, and optimization. We present examples of integrated analyses performed on two case studies: a network scanning/detection study and a malware command and control study. We highlight the importance of experimental workflows and list references of papers and presentations developed under this project. We outline lessons learned and suggestions for future work.
As part of the Department of Energy response to the novel coronavirus disease (COVID-19) pandemic of 2020, a modeling effort was sponsored by the DOE Office of Science. Through this effort, an integrated planning framework was developed whose capabilities were demonstrated with the combination of a treatment resource demand model and an optimization model for routing supplies. This report documents the framework and models, and an application involving ventilator demands and supplies in the continental United States. The goal of this application is to test the feasibility of implementing nationwide ventilator sharing in response to the COVID-19 crisis. Multiple scenarios were run using different combinations of forecasted and observed patient streams, and it is demonstrated that using a "worst-case" forecast for planning may be preferable to best mitigate supply-demand risks in an uncertain future. There is also a brief discussion of model uncertainty and its implications for the results.
Freight transportation represents about 9.5% of GDP in the U.S., is responsible for about 8% of greenhouse gas emissions, and supports the import and export of about $3.6 trillion in international trade. It is therefore important that the national freight transportation system is designed and operated efficiently. Hence, this paper develops a mathematical model to estimate international and domestic freight flows across ocean, rail, and truck modes, which can be used to study the impacts of changes in our infrastructure, as well as the imposition of new user fees and changes in operating policies. The model integrates a user equilibrium-based logit argument for path selection with a system optimal argument for rail network operations. This leads to the development of a unique solution procedure that is demonstrated in a large-scale analysis focused on all intercity freight and U.S. export/import containerized freight. The model results are compared with the reported flow volumes. The model is applied to two case studies: (1) a disruption of the seaports of Los Angeles and Long Beach (LA and LB) similar to the impacts that would be felt in an earthquake; and (2) implementation of new user fees at the California ports.
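The logit path-selection component mentioned above can be illustrated in a few lines. This is a generic sketch of multinomial logit choice, not the report's calibrated model; the cost sensitivity `theta` and the path costs are invented.

```python
import math

def logit_shares(costs, theta=0.1):
    """Share of flow assigned to each path: proportional to exp(-theta * cost).

    Lower generalized cost attracts a larger share; theta controls how
    strongly shippers favor the cheapest path.
    """
    weights = [math.exp(-theta * c) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

# Example: three competing paths (e.g., ocean/rail/truck legs) with invented
# generalized costs of 120, 100, and 140 in generic units.
shares = logit_shares([120, 100, 140])
```

In the full model these shares would feed into an equilibrium computation, since path costs themselves depend on the resulting flows.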
This simple Microgrid Design Toolkit (MDT) use case provides an example of performing microgrid sizing by identifying the types and quantities of technology to be purchased for use in a microgrid. It introduces basic principles of using the MDT microgrid sizing capability by comparing the results of two microgrids in two different markets. Please reference the MDT User Guide (SAND2017-9374) for detailed instructions on how to use the tool.
This document provides guidance for implementing personnel group FTE costs by JCA Tier 1 or 2 categories in the Contingency Contractor Optimization Tool – Engineering Prototype (CCOT-P). CCOT-P currently only allows FTE costs by personnel group to differ by mission. Changes will need to be made to the user interface input pages and the database.
The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
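The chaining of complexity estimates described above amounts to multiplying relative factors along a string of reference systems. The sketch below is purely illustrative; the chain values are invented and do not reflect any actual SCORE data.

```python
# Sketch of chaining relative-complexity estimates (all values invented):
# a scope element is estimated at 1.3x its counterpart in reference system R1,
# and R1's comparable element is estimated at 1.1x the base system B, so the
# element's factor relative to B is the product along the chain.
chain = [1.3, 1.1]   # relative estimates along the reference-system string
factor = 1.0
for r in chain:
    factor *= r
# factor is the element's complexity relative to the base system
```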
The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options. The results of this model allow those considering these options to understand the complexity tradeoffs between proposed warhead options. The core idea of SCORE is to divide a warhead option into a well-defined set of scope elements and then estimate the complexity of each scope element against a well understood reference system. The uncertainty associated with estimates can also be captured. A weighted summation of the relative complexity of each scope element is used to determine the total complexity of the proposed warhead option or portions of the warhead option (i.e., a National Work Breakdown Structure code). The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC), that has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
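The weighted summation described above is simple arithmetic, sketched below with invented element names, weights, and complexity estimates (none drawn from actual SCORE data).

```python
# Minimal sketch of the weighted summation of scope-element complexities.
# Each entry maps an (invented) scope element to (weight, complexity relative
# to the reference system); weights sum to 1.
scope_elements = {
    "element_A": (0.5, 1.8),
    "element_B": (0.3, 1.2),
    "element_C": (0.2, 0.9),
}
total_complexity = sum(w * c for w, c in scope_elements.values())
# a total above 1 indicates the option is more complex than the reference system
```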
This report summarizes the work performed as part of a Laboratory Directed Research and Development project focused on evaluating and mitigating risk associated with biological dual use research of concern. The academic and scientific community has identified the funding stage as the appropriate place to intervene and mitigate risk, so the framework developed here uses a portfolio-level approach and balances biosafety and biosecurity risks, anticipated project benefits, and available mitigations to identify the best available investment strategies subject to cost constraints. The modeling toolkit was designed for decision analysis for dual use research of concern, but is flexible enough to support a wide variety of portfolio-level funding decisions where risk/benefit tradeoffs are involved. Two mathematical optimization models with two solution methods are included to accommodate stakeholders with varying levels of certainty about priorities between metrics. An example case study is presented.
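A portfolio-level selection of the kind described above can be illustrated as a small constrained optimization. This is a toy brute-force sketch, not either of the report's two optimization models; the projects, scores, budget, and the single risk-weighting scheme are all invented.

```python
from itertools import combinations

# Invented candidate projects: (name, cost, benefit, risk)
projects = [
    ("P1", 4, 10, 3),
    ("P2", 3, 6, 1),
    ("P3", 5, 12, 6),
    ("P4", 2, 4, 1),
]
BUDGET = 7          # total funding available (invented units)
RISK_WEIGHT = 1.0   # tradeoff between anticipated benefit and risk

def score(subset):
    """Portfolio objective: total benefit minus weighted risk."""
    return sum(b - RISK_WEIGHT * r for _, _, b, r in subset)

# Brute force over all affordable subsets (fine at toy size only).
best = max((s for k in range(len(projects) + 1)
            for s in combinations(projects, k)
            if sum(c for _, c, _, _ in s) <= BUDGET),
           key=score)
```

Varying `RISK_WEIGHT` corresponds to the report's concern with stakeholders who hold different levels of certainty about priorities between metrics.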
Sandia National Laboratories (Sandia) is in Phase 3 Sustainment of development of a prototype tool, currently referred to as the Contingency Contractor Optimization Tool - Prototype (CCOT-P), under the direction of OSD Program Support. CCOT-P is intended to help provide senior Department of Defense (DoD) leaders with comprehensive insight into the global availability, readiness and capabilities of the Total Force Mix. The CCOT-P will allow senior decision makers to quickly and accurately assess the impacts, risks and mitigating strategies for proposed changes to force/capabilities assignments, apportionments and allocations options, focusing specifically on contingency contractor planning. During Phase 2 of the program, conducted during fiscal year 2012, Sandia developed an electronic storyboard prototype of the Contingency Contractor Optimization Tool that can be used for communication with senior decision makers and other Operational Contract Support (OCS) stakeholders. Phase 3 used feedback from demonstrations of the electronic storyboard prototype to develop an engineering prototype for planners to evaluate. Sandia worked with the DoD and Joint Chiefs of Staff strategic planning community to get feedback and input to ensure that the engineering prototype was developed to closely align with future planning needs. The intended deployment environment was also a key consideration as this prototype was developed. Initial release of the engineering prototype was done on servers at Sandia in the middle of Phase 3. In 2013, the tool was installed on a production pilot server managed by the OUSD(AT&L) eBusiness Center. The purpose of this document is to specify the CCOT-P engineering prototype platform requirements as of May 2016. Sandia developed the CCOT-P engineering prototype using common technologies to minimize the likelihood of deployment issues.
The CCOT-P engineering prototype was architected and designed to be as independent as possible of the major deployment components such as the server hardware, the server operating system, the database, and the web server. This document describes the platform requirements, the architecture, and the implementation details of the CCOT-P engineering prototype.
The Contingency Contractor Optimization Tool – Prototype (CCOT-P) database is used to store input and output data for the linear program model described in [1]. The database supports queries to retrieve this data, as well as updates and insertions of new input data.
This requirements document serves as an addendum to the Contingency Contractor Optimization Phase 2, Requirements Document [1] and Phase 3 Requirements Document [2]. The Phase 2 Requirements document focused on the high-level requirements for the tool. The Phase 3 Requirements document provided more detailed requirements to which the engineering prototype was built in Phase 3. This document will provide detailed requirements for features and enhancements being added to the production pilot in the Phase 3 Sustainment.
The reports and test plans contained within this document serve as supporting materials to the activities listed within the “Contingency Contractor Optimization Tool – Prototype (CCOT-P) Verification & Validation Plan” [1]. The activities included test development, testing, peer reviews, and expert reviews. The engineering prototype reviews were done for both the software and the mathematical model used in CCOT-P. Section 2 includes the peer and expert review reports, which summarize the findings from each of the reviews and document the resolution of any issues. Section 3 details the test plans that were followed for functional testing of the application through the interface. Section 4 describes the unit tests that were run on the code.
This tutorial walks the user through analysis examples using the Contingency Contractor Optimization Tool Prototype. The examples are designed to showcase key capabilities of the tool. The main goal of this tutorial is to show users acting in the analyst role how to use the tool to perform analyses. All examples and locations used in the prototype are fictional, but are intended to be realistic. Users reading this manual are expected to have a basic understanding of and familiarity with the Contingency Contractor Optimization Tool Prototype. This tutorial includes scenarios for the occurrence of two wars, in Prussia and New Granada.