Water security and climate change are important priorities for communities and regions worldwide. The intersections between water and climate change extend across many environmental and human activities. This Primer is intended as an introduction, grounded in examples, for students and others considering the interactions between climate, water, and society. In this Primer, we summarize key intersections between water and climate across four sectors: environment; drinking water, sanitation, and hygiene; food and agriculture; and energy. We begin with an overview of the fundamental water dynamics within each of these four sectors, and then discuss how climate change is impacting water and society within and across these sectors. Emphasizing the relationships and interconnectedness between water and climate change can encourage systems thinking, which can show how activities in one sector may influence activities or outcomes in other sectors. We argue that to achieve a resilient and sustainable water future under climate change, proposed solutions must consider the water–climate nexus to ensure the interconnected roles of water across sectors are not overlooked. Toward that end, we offer an initial set of guiding questions that can be used to inform the development of more holistic climate solutions. This article is categorized under: Science of Water > Water and Environmental Change; Engineering Water > Water, Health, and Sanitation; Human Water > Value of Water.
Organizations play a key role in supporting various societal functions, ranging from environmental governance to the manufacturing of goods. Here, the behaviors of organizations are impacted by various influences, including information, technology, authority, economic leverage, historical experiences, and external factors, such as regulations. This paper introduces a generalized framework, focused on the relative structure of an organization (tight vs. loose), that can be used to understand how different influence pathways can impact decision-making within differently structured organizations. This generalized framework is then translated into a modeling and simulation platform, using a system dynamics approach, to assess the implications of these structural differences for resilience to disinformation (measured by the organizational behaviors of timeliness and inclusion of quality information). Preliminary results indicate that a tightly structured organization may be less timely at processing information but could be more resilient against using poor quality information in organizational decisions compared to a loosely structured organization. Ongoing work is underway to understand the robustness of these findings and to validate current model design activities with empirical insights.
This report summarizes important nuances in local water concerns and potential climate impacts that could influence the roll-out of technologies associated with energy transitions. Current investments in clean energy technologies are substantial, driving significant investment in related manufacturing (e.g., hydrogen, solar, wind, and batteries) and mining (e.g., lithium, copper, and graphite) around the world. To understand how water and climate dynamics could be influencing these activities, we conducted a phased literature review for three countries: China, Germany, and France. China was selected due to its global dominance in manufacturing of solar panels, batteries, and electrolyzers as well as its production of rare earth elements, while Germany and France were selected due to their emerging leadership in energy transitions-related manufacturing within the European Union. For each of these three nations, we identified areas where manufacturing is occurring within the country and then evaluated relevant water resources and climate impacts. Multiple sources were consulted for this review, including BloombergNEF, international reports, industry sources, peer-reviewed literature, climate data, and media coverage.
The siting of nuclear waste is a process that requires consideration of the concerns of the public. This report demonstrates the significant potential for natural language processing techniques to gain insights into public narratives around “nuclear waste.” Specifically, the report highlights that the general discourse regarding “nuclear waste” within the news media has fluctuated in prevalence compared to “nuclear” topics broadly over recent years, with commonly mentioned entities reflecting a limited variety of geographies and stakeholders. General sentiments within the “nuclear waste” articles appear to use neutral language, suggesting that a scientific or “facts-only” framing of “waste”-related issues dominates coverage; however, the exact nuances should be further evaluated. The implications of a number of these insights about how nuclear waste is framed in traditional media (e.g., regarding emerging technologies, historical events, and specific organizations) are discussed. This report lays the groundwork for larger, more systematic research using, for example, transformer-based techniques and covariance analysis to better understand relationships among “nuclear waste” and other nuclear topics, sentiments of specific entities, and patterns across space and time (including in a particular region). By identifying priorities and knowledge needs, these data-driven methods can complement and inform engagement strategies that promote dialogue and mutual learning regarding nuclear waste.
There is currently very limited research into how experts analyze and assess potentially fraudulent content in their areas of expertise, and most research within the disinformation space involves very limited text samples (e.g., news headlines). The overarching goal of the present study was to explore how an individual’s psychological profile and the linguistic features in text might influence an expert’s ability to discern disinformation/fraudulent content in academic journal articles. At a high level, the current design tasked experts with reading journal articles from their area of expertise and indicating if they thought an article was deceptive or not. Half the articles they read were journal papers that had been retracted due to academic fraud. Demographic and psychological inventory data collected on the participants were combined with performance data to generate insights about individual expert susceptibility to deception. Our data show that our population of experts was unable to reliably detect deception in formal technical writing. Several psychological dimensions, such as comfort with uncertainty and intellectual humility, may provide some protection against deception. This work informs our understanding of expert susceptibility to potentially fraudulent content within official, technical information and can be used to inform future mitigative efforts and provide a building block for future disinformation work.
This report summarizes the water inputs associated with four technologies playing diverse roles in energy transitions: hydrogen, solar photovoltaics (PV), wind, and batteries. Information in this report is drawn from multiple sources, including peer-reviewed literature, industry and international agency reports, the EcoInvent life cycle inventory database, and subject matter expert (SME) consultations. Where possible, insights that characterized water requirements for specific stages of the technology development (e.g., operations, manufacturing, and mining) were prioritized over broader cradle-to-gate assessment values. Furthermore, both direct and indirect water requirements (i.e., those associated with energy inputs) were considered in this literature review.
The purpose of pvOps is to support empirical evaluations of data collected in the field related to the operations and maintenance (O&M) of photovoltaic (PV) power plants. pvOps presently contains modules that address the diversity of field data, including text-based maintenance logs, current-voltage (IV) curves, and timeseries of production information. The package functions leverage machine learning, visualization, and other techniques to enable cleaning, processing, and fusion of these datasets. These capabilities are intended to facilitate easier evaluation of field patterns and extraction of relevant insights to support reliability-related decision-making for PV sites. The open-source code, examples, and instructions for installing the package through PyPI can be accessed through the GitHub repository.
Climate and its impacts on the natural environment, and on the ability of the natural environment to support population and the built environment, stand as a threat multiplier that impacts national and global security. The Water Intersections with Climate Systems Security (WICSS) Strategic Initiative is designed to improve understanding of water's role in, among other topics, the connection of critical infrastructure to climate in light of competing national and global security interests (including transboundary issues and stability), and to identify research gaps aligned with Sandia and Federal agency priorities. With this impetus in mind, the WICSS Strategic Initiative team conceptualized a causal loop diagram (CLD) of the relationships among climate, the natural environment, population, and the built environment, with an understanding that any such regionally focused system must have externalities that influence the system from beyond its control, and metrics for better understanding the consequences of the set of interactions. These are discussed in light of a series of worldviews that focus on portions of the overall systems relationship. The relationships are described and documented in detail. A set of reinforcing and balancing loops is then highlighted within the context of the model. Finally, forward-looking actions are highlighted to describe how this conceptual model can be turned into modeling to address multiple problems described under the purview of the Strategic Initiative.
Global climate change has prompted many national plans for rapid emissions reductions. For example, the United States recently committed to transitioning to 100% carbon-free electricity by 2035 and net-zero emissions economy-wide by 2050. Parallel to conversations surrounding emissions reductions is the call for energy justice, or the demand for more equitable distribution of energy-related burdens and benefits among communities. To date, energy justice has evolved as a mostly academic conversation, which may limit its utility to praxis. In response, we offer an interdisciplinary framework that aims to organize existing knowledge and lessons learned from energy development. Specifically, we developed the Meaningful Marine Renewable Energy (MRE) Development Framework and conducted a literature review using MRE as a case study. MRE was chosen because it is a nascent renewable energy technology in the US with projects mostly in demonstration stages and no commercial deployment, making it a useful case study to apply lessons learned from other energy sectors and other countries. Current resources being developed among the MRE community, and their implications for furthering energy justice priorities, are also explored. We conclude the review with a compiled list of questions meant to support stakeholders in translating theoretical concepts of Meaningful MRE Development to practice. Although the Meaningful MRE framework was developed using MRE as a use case, our interdisciplinary theoretical framework can be applied beyond MRE to other sustainable and renewable energy projects.
Both human subject experiments and computational modeling and simulation have been used to study the detection of deception. This work aims to combine these two methods by integrating empirically-derived information (from human subject experiments) into agent-based models to generate novel insights into the complex problem of detecting disinformation content. Computational experiments are used to simulate multiple scenarios for evaluation and decision-making regarding the validity of potentially deceptive scientific documents. Factors influencing the human agent behaviors in the model were identified through a human subject experiment that was conducted to evaluate and characterize decision making related to disinformation discernment. Correlation and regression analyses were used to translate insights from the human subjects experiment to inform the parameterization of agent features and scenario development. Three scenarios were evaluated with the agent-based models to help evaluate the replicability of the simulations (validation analysis) and assess the influence of human agent and document features (sensitivity analyses). A replication of the human participant experiment demonstrated that the agent-based simulations compare favorably to empirical findings. The agent-based modeling was then used to conduct sensitivity analysis on the accuracy of deception detection as a function of document proportions and human agent features. Results indicate that precision values are adversely impacted when the proportion of deceptive documents is lower in the overall sample, whereas recall values are more sensitive to changes in human agent features. These findings indicate important nuances in accuracy evaluations that should be further considered (including consideration of potential alternate metrics) in future agent-based models of disinformation. Additional areas for future exploration include extension of simulations to consider other ways to align the agent-based model design with psychological theory and inclusion of agent-agent interactions, especially as it pertains to sharing of scientific information within an organizational context.
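To make the simulation setup above concrete, the sketch below shows a stripped-down agent-based experiment in Python in which agents with a single, randomly drawn discernment skill judge a mixed pool of documents, and precision and recall are computed for deception detection. It is an illustration of the general approach only; the agent features, parameter values, and scenario settings are assumptions and do not reproduce the study's model.

```python
# Minimal agent-based sketch: agents with one "discernment" feature evaluate a
# mixed pool of documents; precision/recall for deception detection are reported.
# All parameter names and values are illustrative assumptions.
import numpy as np
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(42)

def run_scenario(n_agents=50, n_docs=200, deceptive_frac=0.2, discernment_sd=0.1):
    labels = rng.random(n_docs) < deceptive_frac   # True = deceptive document
    # Each agent has a baseline probability of judging a document correctly,
    # loosely standing in for empirically derived psychological features.
    skill = np.clip(rng.normal(0.6, discernment_sd, n_agents), 0.0, 1.0)
    precisions, recalls = [], []
    for s in skill:
        correct = rng.random(n_docs) < s
        # The agent's verdict equals the true label when correct, flipped otherwise.
        verdicts = np.where(correct, labels, ~labels)
        precisions.append(precision_score(labels, verdicts, zero_division=0))
        recalls.append(recall_score(labels, verdicts, zero_division=0))
    return np.mean(precisions), np.mean(recalls)

# Lowering the proportion of deceptive documents tends to depress precision,
# mirroring the sensitivity pattern described above.
for frac in (0.5, 0.2, 0.05):
    p, r = run_scenario(deceptive_frac=frac)
    print(f"deceptive_frac={frac:.2f}  precision={p:.2f}  recall={r:.2f}")
```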
This paper explores the utility of organizational system modeling frameworks for providing valuable insight into information flows within organizations and, subsequently, into opportunities for increasing resilience against disinformation campaigns targeting a system's ability to utilize information within its decision making. Disinformation is a growing challenge for many organizations and in recent years has created delays in decision making. Here, the paper utilizes the viable systems model (VSM) to characterize organizational systems and uses this approach to outline potential subsystem requirements to promote resilience of the system. The results of this paper can support the development of simulations and models considering the human elements within the system, as well as the development of quantitative measures of resilience.
Sustainable use of water resources continues to be a challenge across the globe. This is in part due to the complex set of physical and social behaviors that interact to influence water management from local to global scales. Analyses of water resources have been conducted using a variety of techniques, including qualitative evaluations of media narratives. This study aims to augment these methods by leveraging computational and quantitative techniques from the social sciences focused on text analyses. Specifically, we use natural language processing methods to investigate a large corpus (approximately 1.8 million) of newspaper articles spanning approximately 35 years (1982–2017) for insights into human-nature interactions with water. Focusing on local and regional United States publications, our analysis demonstrates important dynamics in water-related dialogue across different parts of the country, ranging from drinking water and pollution to connections with other critical infrastructures, such as energy. Our assessment, which looks at water as a system, also highlights key actors and sentiments surrounding water. Extending these analytical methods could help us further improve our understanding of the complex roles of water in current society that should be considered in emerging activities to mitigate and respond to resource conflicts and climate change.
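As an illustration of the kind of text analysis described above, the sketch below applies a common NLP workflow (bag-of-words features and latent Dirichlet allocation topic modeling with scikit-learn) to a few placeholder headlines. The corpus, topic count, and preprocessing choices are assumptions for demonstration and are not the study's configuration.

```python
# Topic modeling sketch on a tiny placeholder news corpus; not the study's corpus
# or exact method, just one common NLP approach consistent with the analysis above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "City council debates drinking water treatment and pipeline funding",
    "Drought forces farmers to cut irrigation as reservoir levels drop",
    "Power plant cooling water discharge raises pollution concerns",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)               # document-term matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(dtm)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"Topic {k}: {', '.join(top)}")
```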
Risk and resilience assessments for critical infrastructure focus on myriad objectives, from natural hazard evaluations to optimizing investments. Although research has started to characterize externalities associated with current or possible future states, incorporation of equity priorities at project inception is increasingly being recognized as critical for planning-related activities. However, there is no standard methodology that guides development of equity-informed quantitative approaches for infrastructure planning activities. To address this gap, we introduce a logic model that can be tailored to capture nuances about specific geographies and community priorities, effectively incorporating them into different mathematical approaches for quantitative risk assessments. Specifically, the logic model uses a graded, iterative approach to clarify specific equity objectives as well as inform the development of equations being used to support analysis. We demonstrate the utility of this framework using case studies spanning aviation fuel, produced water, and microgrid electricity infrastructures. For each case study, the use of the logic model helps clarify the ways that local priorities and infrastructure needs are used to drive the types of data and quantitative methodologies used in the respective analyses. The explicit consideration of methodological limitations (e.g., data mismatches) and stakeholder engagements serves to increase the transparency of the associated findings as well as effectively integrate community nuances (e.g., ownership of assets) into infrastructure assessments. Such integration will become increasingly important to ensure that planning activities (which occur throughout the lifecycle of the infrastructure projects) lead to long-lasting solutions to meet both energy and sustainable development goals for communities.
Battery storage systems are increasingly being installed at photovoltaic (PV) sites to address supply-demand balancing needs. Although there is some understanding of costs associated with PV operations and maintenance (O&M), costs associated with emerging technologies such as PV plus storage lack details about the specific systems and/or activities that contribute to the cost values. This study aims to address this gap by exploring the specific factors and drivers contributing to the costs of O&M activities at utility-scale PV plus storage systems (UPVS), including technology selection, data collection practices, and related ongoing challenges. Specifically, we used semi-structured interviews and questionnaires to collect information and insights from utility-scale owners and operators. Data were collected from 14 semi-structured interviews and questionnaires representing 51.1 MW with 64.1 MWh of installed battery storage capacity within the United States (U.S.). Differences in degradation rate, expected life cycle, and capital costs are observed across different storage technologies. Most O&M activities at UPVS were related to correcting under-performance. Fires and venting issues are leading safety concerns, and owner-operators have installed additional systems to mitigate these issues. There are ongoing O&M challenges due to the lack of storage-specific performance metrics as well as poor vendor reliability and parts availability. Insights from this work will improve our understanding of O&M considerations at PV plus storage sites.
The Grey Zone Test Range (GZTR) social model operates as a piece of the overall GZTR modeling effort. It works in conjunction with supply models for resources, an electric grid model for power availability, and a traffic model for road congestion, as well as a general controller framework that allows external system effects. The social model functions as an aggregate model where the entire population of the city is divided into groups based on the Transportation Analysis Zones (TAZs), a common geospatial boundary present in all GZTR models. These groups will act as a singular community; each time step, the state of the system around them will be assessed, and then the community will come up with a general plan of action that it will attempt to follow for the day. Additionally, each group will track values for its general emotional state and its memory of negative impacts in the recent past.
Failure detection methods are of significant interest for photovoltaic (PV) site operators to help reduce gaps between expected and observed energy generation. Current approaches for field-based fault detection, however, rely on multiple data inputs and can suffer from interpretability issues. In contrast, this work offers an unsupervised statistical approach that leverages hidden Markov models (HMM) to identify failures occurring at PV sites. Using performance index data from 104 sites across the United States, individual PV-HMM models are trained and evaluated for failure detection and transition probabilities. This analysis indicates that the trained PV-HMM models have the highest probability of remaining in their current state (87.1% to 93.5%), whereas the transition probability from normal to failure (6.5%) is lower than the transition from failure to normal (12.9%) states. A comparison of these patterns using both threshold levels and operations and maintenance (O&M) tickets indicates high precision rates of PV-HMMs (median = 82.4%) across all of the sites. Although additional work is needed to assess sensitivities, the PV-HMM methodology demonstrates significant potential for real-time failure detection as well as extensions into predictive maintenance capabilities for PV.
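The sketch below illustrates the two-state HMM idea on a synthetic daily performance index series, using the hmmlearn package as one possible implementation. The data, package choice, and parameter settings are assumptions rather than the study's actual code; the learned transition matrix plays the role of the normal-to-failure probabilities reported above.

```python
# Two-state Gaussian HMM on a synthetic performance index series (illustrative only).
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
# Performance index near 1.0 (normal), with a failure episode near 0.6 on days 120-149.
pi = rng.normal(1.0, 0.03, 365)
pi[120:150] = rng.normal(0.6, 0.05, 30)

model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100, random_state=0)
model.fit(pi.reshape(-1, 1))

states = model.predict(pi.reshape(-1, 1))
print("Estimated transition matrix:\n", model.transmat_.round(3))
print("State means:", model.means_.ravel().round(2))
# The learned transition matrix is the analogue of the normal<->failure transition
# probabilities discussed above (high probability of remaining in the current state).
```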
A proof-of-concept tool, the Produced Water-Economic, Socio, Environmental Simulation model (PW-ESESim), was developed to support ease of analysis. The tool was designed to facilitate head-to-head comparison of alternative produced water source, treatment, and reuse water management strategies. A graphical user interface (GUI) guides the user through the selection and design of alternative produced water treatment and reuse strategies and the associated health and safety risks and economic benefits. At the highest conceptual level, alternative water strategies include the selection of a source water (locally or regionally available produced water), a treatment strategy (pre-treatment, physical, chemical, biological, desalination, and post-treatment processes), and a product water purpose (e.g., irrigation, industrial processing, environmental). After selection of these details, PW-ESESim outputs a number of key economic, societal, environmental, and public/ecological health and safety metrics to support user decision-making; specific examples include cost of treatment, improvements in freshwater availability, human and ecological health impacts, and growth in local jobs and the economy. Through the simulation of different produced water treatment and management strategies, tradeoffs are identified and used to inform fit-for-purpose produced water treatment and reuse management decisions. While the tool was initially designed using Southeastern New Mexico (Permian Basin) as a case study, the general design of the PW-ESESim model can be extended to support other oil and gas regions of the U.S.
Drinking water has been, and will continue to be, at the foundation of our nation’s well-being, and there is a growing interest in United States (US) drinking water quality. Nearly 30% of the United States population obtained their water from community water systems that did not meet federal regulations in 2019. Given the heavy interactions between society and drinking water quality, this study integrates social constructionism, environmental injustice, and sociohydrological systems to evaluate local awareness of drinking water quality issues. By employing text analytics, we explore potential drivers of regional water quality narratives within 25 local news sources across the United States. Specifically, we assess the relationship between printed local newspapers and water quality violations in communities as well as the influence of social, political, and economic factors on the coverage of drinking water quality issues. Results suggest that the volume and/or frequency of local drinking water violations is not directly reflected in local news coverage. Additionally, news coverage varied across sociodemographic features, with a negative relationship between Hispanic populations and news coverage of the Lead and Copper Rule, and a positive relationship among non-Hispanic white populations. These findings extend current understanding of variations in local narratives to consider nuances of water quality issues and indicate opportunities for increasing equity in environmental risk communication.
There has been ever-growing interest and engagement regarding net-zero and carbon neutrality goals, with many nations committing to steep emissions reductions by mid-century. Although water plays critical roles in various sectors, there has been a distinct gap in discussions to date about the role of water in the transition to a carbon neutral future. To address this need, a webinar was convened in April 2022 to gain insights into how water can support or influence active strategies for addressing emissions activities across energy, industrial, and carbon sectors. The webinar presentations and discussions highlighted various nuances of direct and indirect water use both within and across technology sectors (Figure ES-1). For example, hydrogen and concrete production, water for mining, and inland waterways transportation are all heavily influenced by the energy sources used (fossil fuels vs. renewable sources) as well as local resource availabilities. Algal biomass, on the other hand, can be produced across diverse geographies (terrestrial to sea) in a range of source water qualities, including wastewater, and could also support pollution remediation through nutrient and metals recovery. Finally, water also influences carbon dynamics and cycling within natural systems across terrestrial, aquatic, and geologic systems. These dynamics underscore not only the critical role of water within the energy-water nexus, but also the extension into the energy-water-carbon nexus.
Although unique expected energy models can be generated for a given photovoltaic (PV) site, a standardized model is also needed to facilitate performance comparisons across fleets. Current standardized expected energy models for PV work well with sparse data, but they have demonstrated significant over-estimations, which impacts accurate diagnoses of field operations and maintenance issues. This research addresses this issue by using machine learning to develop a data-driven expected energy model that can more accurately generate inferences for energy production of PV systems. Irradiance and system capacity information was used from 172 sites across the United States to train a series of models using Lasso linear regression. The trained models generally perform better than the commonly used expected energy model from the international standard (IEC 61724-1), with the two highest performing models ranging in model complexity from a third-order polynomial with 10 parameters (adjusted R² = 0.994) to a simpler, second-order polynomial with 4 parameters (adjusted R² = 0.993), the latter of which is subject to further evaluation. Subsequently, the trained models provide a more robust basis for identifying potential energy anomalies for operations and maintenance activities as well as informing planning-related financial assessments. We conclude with directions for future research, such as using splines to improve model continuity and better capture systems with low (≤1000 kW DC) capacity.
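A minimal sketch of this modeling approach, assuming a second-order polynomial in irradiance and system capacity fit with Lasso regression in scikit-learn, is shown below. The synthetic data, scaling step, and regularization strength are illustrative choices, not the trained models reported in the paper.

```python
# Lasso-regularized polynomial expected-energy sketch on synthetic data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n = 500
irradiance = rng.uniform(0, 1000, n)    # plane-of-array irradiance, W/m^2
capacity = rng.uniform(100, 5000, n)    # system capacity, kW DC
# Toy ground truth: energy roughly proportional to irradiance * capacity.
energy = 0.8e-3 * irradiance * capacity + rng.normal(0, 50, n)

X = np.column_stack([irradiance, capacity])
# Second-order polynomial features, scaled for stable Lasso fitting.
model = make_pipeline(PolynomialFeatures(degree=2), StandardScaler(), Lasso(alpha=0.1))
model.fit(X, energy)
print("R^2 on training data:", round(model.score(X, energy), 3))
```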
Security assessments support decision-makers' ability to evaluate current capabilities of high consequence facilities (HCF) to respond to possible attacks. However, increasing complexity of today's operational environment requires a critical review of traditional approaches to ensure that implemented assessments are providing relevant and timely insights into security of HCFs. Using interviews and focus groups with diverse subject matter experts (SMEs), this study evaluated the current state of security assessments and identified opportunities to achieve a more "ideal" state. The SME-based data underscored the value of a systems approach for understanding the impacts of changing operational designs and contexts (as well as cultural influences) on security to address methodological shortcomings of traditional assessment processes. These findings can be used to inform the development of new approaches to HCF security assessments that are able to more accurately reflect changing operational environments and effectively mitigate concerns arising from new adversary capabilities.
Community, corporate, and government organizations are being targeted by disinformation attacks at an unprecedented rate. These attacks interrupt the ability of organizations to make high-consequence decisions and can lower their confidence in datasets and analytics. New interdisciplinary research approaches are being actively developed to expand resilience theory applications to organizations, and to determine the metrics and mitigations needed to increase resilience against disinformation. This paper presents initial ideas on adapting resilience methodologies for organizations and disinformation, highlighting key areas that require further exploration in this emerging field of research.
The global energy system is undergoing significant changes, including a shift in energy generating technologies to more renewable energy sources. However, the dependence of renewable energy sources on local environmental conditions could also increase disruptions in service through exposures to compound, extreme weather events. By fusing three diverse datasets (operations and maintenance tickets, weather data, and production data), this analysis presents a novel methodology to identify and evaluate performance impacts arising from extreme weather events across diverse geographical regions. Text analysis of maintenance tickets identified snow, hurricanes, and storms as the leading extreme weather events affecting photovoltaic plants in the United States. Statistical techniques and machine learning were then implemented to identify the magnitude and variability of these extreme weather impacts on site performance. Impacts varied between event and non-event days, with snow events causing the greatest reductions in performance (54.5%), followed by hurricanes (12.6%) and storms (1.1%). Machine learning analysis identified key features in determining if a day is categorized as low performing, such as low irradiance, geographic location, weather features, and site size. This analysis improves our understanding of compound, extreme weather event impacts on photovoltaic systems. These insights can inform planning activities, especially as renewable energy continues to expand into new geographic and climatic regions around the world.
U.S. critical infrastructure assets are often designed to operate for decades, and yet long-term planning practices have historically ignored climate change. With the current pace of changing operational conditions and severe weather hazards, research is needed to improve our ability to translate complex, uncertain risk assessment data into actionable inputs to improve decision-making for infrastructure planning. Decisions made today need to explicitly account for climate change – the chronic stressors, the evolution of severe weather events, and the wide-ranging uncertainties. If done well, decision making with climate in mind will result in increased resilience and decreased impacts to our lives, economies, and national security. We present a three-tier approach to create the research products needed in this space: bringing together climate projection data, severe weather event modeling, asset-level impacts, and context-specific decision constraints and requirements. At each step, it is crucial to capture uncertainties and to communicate those uncertainties to decision-makers. While many components of the necessary research are mature (e.g., climate projection data), there has been little effort to develop proven tools for long-term planning in this space. The combination of chronic and acute stressors, spatial and temporal uncertainties, and interdependencies among infrastructure sectors coalesces into a complex decision space. By applying known methods from decision science and data analysis, we can work to demonstrate the value of an interdisciplinary approach to climate-hazard decision making for long-term infrastructure planning.
This paper describes how performance problems can be “masked,” or not readily evident, due to several causes: photovoltaic (PV) system configuration (such as the size of the PV array capacity relative to the size of the inverter and the resultant clipped operating mode); instrumentation design, installation, and maintenance (such as a misaligned or dirty pyranometer); contract clauses (when operational availability is transformed to contractual availability, which excludes many factors); and identified management and operational practices (such as reporting on a portfolio of plants rather than individually). A simple method based on a duration curve is introduced to overcome shortcomings of the Performance Ratio based on nameplate capacity and the Performance Index based on hourly simulation when quantifying masking effects, and inverter clipping and pyranometer soiling are presented as two examples of the new method. With a better understanding of the non-transparency of masking issues, stakeholders can better interpret performance data and deliver improved AC and DC plant conditions through PV system operation and maintenance (O&M), resulting in improved performance, reduced O&M costs, and a more consistently delivered, and reduced, levelized cost of energy (LCOE).
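A duration curve itself is simple to construct: sort the measured AC power values in descending order and plot them against the fraction of time each level is met or exceeded. The sketch below builds one from synthetic hourly data with an assumed inverter clipping limit, purely to illustrate the construct referenced above.

```python
# Power duration curve from synthetic hourly AC power (illustrative only).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
# Synthetic hourly AC power for one year, clipped at an assumed 800 kW inverter
# limit to show how clipping flattens the top of the curve.
power = np.clip(rng.normal(500, 300, 8760), 0, 800)

sorted_power = np.sort(power)[::-1]
exceedance = np.arange(1, len(sorted_power) + 1) / len(sorted_power)

plt.plot(exceedance * 100, sorted_power)
plt.xlabel("Percent of hours power is met or exceeded (%)")
plt.ylabel("AC power (kW)")
plt.title("Duration curve (synthetic example)")
plt.show()
```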
PV system reliability analyses often depend on production data to evaluate the system state. However, using this information alone leads to incomplete assessments, since contextual information about potential sources of data quality issues is lacking (e.g., missing data from offline communications vs. offline production). This paper introduces a new Python-based software capability (called pvOps) for fusing production data with readily available text-based maintenance information to improve reliability assessments. In addition to details about the package development process, the general capabilities to gain actionable insights using field data are presented through a case study. These findings highlight the significant potential for continued advancements in operational assessments.
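The core fusion idea can be illustrated with plain pandas (this is not the pvOps API itself): overlay text-based O&M ticket windows on a production timeseries so that low-production periods can be attributed to logged events. The column names and records below are invented for demonstration.

```python
# Illustrative data fusion: tag production records that fall inside O&M ticket windows.
import pandas as pd

production = pd.DataFrame({
    "timestamp": pd.date_range("2023-06-01", periods=6, freq="D"),
    "energy_kwh": [4200, 4100, 150, 180, 4000, 4150],
})

tickets = pd.DataFrame({
    "start": [pd.Timestamp("2023-06-03")],
    "end": [pd.Timestamp("2023-06-04")],
    "note": ["Inverter offline - communications fault"],
})

def matching_note(ts):
    # Return the note of any ticket whose window contains this timestamp.
    hits = tickets[(tickets["start"] <= ts) & (ts <= tickets["end"])]
    return hits["note"].iloc[0] if len(hits) else ""

production["om_note"] = production["timestamp"].apply(matching_note)
print(production)
```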
In this paper we consider the effects of corporate hierarchies on innovation spread across multilayer networks, modeled by an elaborated SIR framework. We show that the addition of management layers can significantly improve spreading processes on both random geometric graphs and empirical corporate networks. Additionally, we show that utilizing a more centralized working relationship network rather than a strict administrative network further increases overall innovation reach. In fact, this more centralized structure in conjunction with management layers is essential to both reaching a plurality of nodes and creating a stable adopted community in the long time horizon. Further, we show that the selection of seed nodes affects the final stability of the adopted community, and while the most influential nodes often produce the highest peak adoption, this is not always the case. In some circumstances, seeding nodes near but not in the highest positions in the graph produces larger peak adoption and more stable long-time adoption.
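A toy version of this kind of spreading experiment is sketched below: a discrete-time SIR process on a random geometric graph, run with and without an added hub node standing in for a management layer. The infection and recovery rates, graph size, and wiring of the extra node are illustrative assumptions, not the paper's elaborated SIR framework or its empirical corporate networks.

```python
# Discrete-time SIR spread on a random geometric graph, with an optional
# "manager" hub node added to echo the management-layer effect (illustrative only).
import random
import networkx as nx

def simulate_sir(G, seed_node, beta=0.3, gamma=0.1, steps=50, seed=0):
    rng = random.Random(seed)
    state = {n: "S" for n in G}   # S: susceptible, I: actively spreading, R: adopted/recovered
    state[seed_node] = "I"
    reached = {seed_node}
    for _ in range(steps):
        new_state = dict(state)
        for n in G:
            if state[n] == "I":
                for nbr in G.neighbors(n):
                    if state[nbr] == "S" and rng.random() < beta:
                        new_state[nbr] = "I"
                        reached.add(nbr)
                if rng.random() < gamma:
                    new_state[n] = "R"
        state = new_state
    return len(reached)

G = nx.random_geometric_graph(100, 0.12, seed=1)
seed_node = max(G.degree, key=lambda kv: kv[1])[0]   # most connected employee
print("Reach without management layer:", simulate_sir(G, seed_node))

# Add a single "manager" node wired to 20 random employees.
H = G.copy()
H.add_node("manager")
H.add_edges_from(("manager", n) for n in random.Random(2).sample(list(G.nodes), 20))
print("Reach with management layer:", simulate_sir(H, seed_node))
```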
Resilience has been defined as a priority for US critical infrastructure. This paper presents a process for incorporating resiliency-derived metrics into security system evaluations. To support this analysis, we used a multi-layer network model (MLN) reflecting the defined security system of a hypothetical nuclear power plant to determine which metrics would be useful in understanding a system's ability to absorb perturbation (i.e., system resilience). We defined measures focusing on the system's criticality, rapidity, diversity, and confidence at each network layer, along each simulated adversary path, and for the system as a whole as a basis for understanding the system's resilience. For this hypothetical system, our metrics indicated the importance of physical infrastructure to overall system criticality, the relative confidence of physical sensors, and the lack of diversity in assessment activities (i.e., dependence on human evaluations). Refined model design and data outputs will enable more nuanced evaluations into temporal, geospatial, and human behavior considerations. Future studies can also extend these methodologies to capture the respond and recover aspects of resilience, further supporting the protection of critical infrastructure.
Performance measures commonly used in systems security engineering tend to be static, linear, and have limited utility in addressing challenges to security performance from increasingly complex risk environments, adversary innovation, and disruptive technologies. Leveraging key concepts from resilience science offers an opportunity to advance next-generation systems security engineering to better describe the complexities, dynamism, and non-linearity observed in security performance—particularly in response to these challenges. This article introduces a multilayer network model and modified Continuous Time Markov Chain model that explicitly captures interdependencies in systems security engineering. The results and insights from a multilayer network model of security for a hypothetical nuclear power plant introduce how network-based metrics can incorporate resilience concepts into performance metrics for next generation systems security engineering.
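For readers unfamiliar with the machinery, the sketch below simulates a small continuous-time Markov chain over a few invented security states using exponential holding times. The states and rate matrix are placeholders chosen for illustration and are not the modified CTMC or multilayer network model described above.

```python
# Minimal continuous-time Markov chain (CTMC) simulation with invented states/rates.
import numpy as np

states = ["nominal", "degraded", "compromised"]
# Q[i, j] = transition rate from state i to state j (1/hour); each row sums to 0.
Q = np.array([
    [-0.10,  0.08,  0.02],
    [ 0.30, -0.35,  0.05],
    [ 0.00,  0.20, -0.20],
])

def simulate_ctmc(Q, start=0, t_end=100.0, rng=np.random.default_rng(0)):
    t, i, path = 0.0, start, [(0.0, states[start])]
    while t < t_end:
        rate_out = -Q[i, i]
        if rate_out <= 0:
            break
        t += rng.exponential(1.0 / rate_out)          # exponential holding time
        probs = np.clip(Q[i], 0, None) / rate_out     # jump probabilities
        i = rng.choice(len(states), p=probs)
        path.append((round(t, 2), states[i]))
    return path

for time, state in simulate_ctmc(Q)[:8]:
    print(f"t={time:7.2f} h  state={state}")
```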
Ongoing operations and maintenance (O&M) are needed to ensure photovoltaic (PV) systems continue to operate and meet production targets over the lifecycle of the system. Although average costs to operate and maintain PV systems have been decreasing over time, reported costs can vary significantly at the plant level. Estimating O&M costs accurately is important for informing financial planning and tracking activities, and subsequently lowering the levelized cost of electricity (LCOE) of PV systems. This report describes a methodology for improving O&M planning estimates by using empirically-derived failure statistics to capture component reliability in the field. The report also summarizes failure patterns observed for specific PV components and local environmental conditions observed in Sandia's PV Reliability, Operations & Maintenance (PVROM) database, a collection of field records across 800+ systems in the U.S. Where system-specific or fleet-specific data are lacking, PVROM-derived failure distribution values can be used to inform cost modeling and other reliability analyses to evaluate opportunities for performance improvements.
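The general step of turning field failure records into distribution parameters can be sketched as below, here assuming a Weibull lifetime model purely for illustration; the report's actual failure distributions, components, and parameter values may differ.

```python
# Fit a Weibull lifetime distribution to synthetic times-to-failure (illustrative only).
from scipy import stats

# Synthetic times-to-failure (days) for one component class.
ttf_days = stats.weibull_min.rvs(c=1.4, scale=900, size=200, random_state=3)

# Fit shape (c) and scale, holding the location at zero as is common for lifetimes.
shape, loc, scale = stats.weibull_min.fit(ttf_days, floc=0)
print(f"shape={shape:.2f}, scale={scale:.0f} days")

# The fitted distribution can then feed O&M cost models, e.g., the expected
# fraction of units failing within a 5-year window.
print("P(failure within 5 years):", round(stats.weibull_min.cdf(5 * 365, shape, loc, scale), 3))
```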
Principal component analysis (PCA) reduces dimensionality by generating uncorrelated variables and improves the interpretability of the sample space. This analysis focused on assessing the value of PCA for improving the classification accuracy of failures within current-voltage (IV) traces. Our results show that combining PCA with random forests improves classification by only ~1% (bringing the accuracy to >99%), compared to a baseline of random forests alone (without PCA) at >98%. The inclusion of PCA, however, does provide an opportunity to study an interesting representation of all of the features on a single, two-dimensional feature space. A visualization of the first two principal components (similar to the IV profile but rotated) captures how the inclusion of a current differential feature causes a notable separation between failure modes due to its effect on the slope. This work continues the discussion of generating different ways of extracting information from the IV curve, which can help with failure classification, especially for failures that only exhibit marginal profile changes in IV curves.
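The comparison described above can be reproduced in miniature with scikit-learn, as sketched below on synthetic stand-ins for IV-curve feature vectors: a random forest baseline versus a PCA-plus-random-forest pipeline evaluated by cross-validation. The feature construction and class counts are assumptions, not the study's dataset.

```python
# Random forest baseline vs. PCA + random forest pipeline on synthetic features.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=600, n_features=40, n_informative=10,
                           n_classes=3, random_state=0)

baseline = RandomForestClassifier(n_estimators=200, random_state=0)
with_pca = make_pipeline(PCA(n_components=10),
                         RandomForestClassifier(n_estimators=200, random_state=0))

print("RF only :", cross_val_score(baseline, X, y, cv=5).mean().round(3))
print("PCA + RF:", cross_val_score(with_pca, X, y, cv=5).mean().round(3))
```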
Sandia National Laboratories is part of the government test and evaluation team for the Defense Advanced Research Projects Agency Collection and Monitoring via Planning for Active Situational Scenarios program. The program is designed to better understand competition in the area between peace and conventional conflict when adversary actions are subtle and difficult to detect. For the purposes of test and evaluation, Sandia conducted a range of activities for the program: creation of the Grey Zone Test Range; design of the data stream for a user experiment conducted with U.S. Indo-Pacific Command; design, implementation, and execution of the formal evaluation; and analysis and summary of the evaluation results. This report details Sandia's activities and provides additional information on the Grey Zone Test Range urban simulation environment developed to evaluate the performer technologies.
Accurately predicting power generation for PV sites is critical for prioritizing relevant operations & maintenance activities, thereby extending the lifetime of a system and improving profit margins. A number of factors influence power generation at PV sites, including local weather, shading and soiling losses, design of modules, DC mismatches, and degradation over time. Other external factors such as curtailment and grid outages can also have a notable impact on power generation. Machine learning techniques can be used to provide more accurate predictions of PV power production by accounting for important weather and climate information neglected by current industry methods. This article will cover the deficiencies of those methods and will show how machine learning can dramatically improve power generation predictions.
Accurate diagnosis of failures is critical for meeting photovoltaic (PV) performance objectives and avoiding safety concerns. This analysis focuses on the classification of field-collected string-level current-voltage (IV) curves representing baseline, partial soiling, and cracked failure modes. Specifically, multiple neural network-based architectures (including convolutional and long short-term memory) are evaluated using domain-informed parameters across different portions of the IV curve and a range of irradiance thresholds. The analysis identified two models that were able to accurately classify the relatively small dataset (400 samples) at a high accuracy (99%+). Findings also indicate optimal irradiance thresholds and opportunities for improvements in classification activities by focusing on portions of the IV curve. Such advancements are critical for expanding accurate classification of PV faults, especially for those with low power loss (e.g., cracked cells) or visibly similar IV curve profiles.
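A minimal example of one such architecture, a small 1D convolutional classifier built with Keras, is sketched below on placeholder curves. The layer sizes, synthetic data, and three-class labels are illustrative assumptions; the study's evaluated architectures and domain-informed parameters are not reproduced here.

```python
# Small 1D-CNN classifier for IV-curve failure modes on placeholder data.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
n_points, n_classes = 100, 3      # points per IV curve; baseline / soiling / cracked
X = rng.normal(size=(400, n_points, 1)).astype("float32")   # placeholder curves
y = rng.integers(0, n_classes, size=400)

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 5, activation="relu", input_shape=(n_points, 1)),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))   # [loss, accuracy]
```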
Inverters are a leading source of hardware failures and contribute to significant energy losses at photovoltaic (PV) sites. An understanding of failure modes within inverters requires evaluation of a dataset that captures insights from multiple characterization techniques (including field diagnostics, production data analysis, and current-voltage curves). One readily available dataset that can be leveraged to support such an evaluation is maintenance records, which are used to log all site-related technician activities but vary in how information is structured. Using machine learning, this analysis evaluated a database of 55,000 maintenance records across 800+ sites to identify inverter-related records and consistently categorize them to gain insight into common failure modes within this critical asset. Communications, ground faults, heat management systems, and insulated gate bipolar transistors emerge as the most frequently discussed inverter subsystems. Further evaluation of these failure modes identified distinct variations in failure frequencies over time and across inverter types, with communication failures occurring more frequently in early years. Increased understanding of these failure patterns can inform ongoing PV system reliability activities, including simulation analyses, spare parts inventory management, cost estimates for operations and maintenance, and development of standards for inverter testing. Advanced implementations of machine learning techniques coupled with standardization of asset labels and descriptions can extend these insights into actionable information that can support development of algorithms for condition-based maintenance, which could further reduce failures and associated energy losses at PV sites.
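A common way to implement the record-categorization step is TF-IDF features over the ticket text feeding a simple classifier, as sketched below with scikit-learn. The example records and subsystem labels are invented; the analysis above may have used different features, models, and label sets.

```python
# TF-IDF + logistic regression for categorizing maintenance notes (illustrative labels).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

records = [
    "inverter offline, lost communications with SCADA",
    "ground fault alarm on inverter 3, reset required",
    "replaced cooling fan, inverter overheating",
    "IGBT failure, inverter replaced under warranty",
    "comms board swapped after lightning event",
    "fan filter cleaned during preventive maintenance",
]
labels = ["communications", "ground fault", "heat management",
          "IGBT", "communications", "heat management"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(records, labels)
print(clf.predict(["inverter tripped on ground fault this morning"]))
```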
We show that seasonal runoff from montane uplands is crucial for plant growth in agricultural communities of northern New Mexico. These communities typically employ traditional irrigation systems, called acequias, which rely mainly upon spring snowmelt runoff for irrigation. The trend of the past few decades has been increasing temperatures, reduced snowpack, and earlier snowmelt runoff across much of the western United States. In order to predict the potential impacts of changes in future climate, a system dynamics model was constructed to simulate the surface water supplies in a montane upland watershed of a small irrigated community in northern New Mexico through the rest of the 21st century. End-term simulations of representative concentration pathways (RCP) 4.5 and 8.5 suggest that runoff during the months of April to August could be reduced by 22% and 56%, respectively. End-term simulations also displayed a shift in the beginning and peak of snowmelt runoff by up to one month earlier than current conditions. Results suggest that rising temperatures will drive reduced runoff in the irrigation season and earlier snowmelt runoff in the dry season towards the end of the 21st century. Modeled results suggest that climate change will shift the runoff regime and increase the frequency of drought; because the irrigation season and the runoff regime will no longer coincide, water shortages will increase. Potential impacts of climate change scenarios and mitigation strategies should be further investigated to ensure the resilience of traditional agricultural communities in New Mexico and similar regions.
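The flavor of such a model can be conveyed with a toy stock-and-flow sketch: a degree-day snowmelt routine driving monthly runoff, run for a baseline and a crude warming scenario. All temperatures, precipitation values, and coefficients below are invented for illustration and are not the study's calibrated system dynamics model.

```python
# Toy degree-day snowmelt/runoff model comparing a baseline and a +3 C scenario.
import numpy as np

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug"]
temp_c = np.array([-5, -3, 1, 5, 10, 16, 20, 19], dtype=float)      # mean monthly temperature
precip_mm = np.array([40, 35, 45, 40, 30, 20, 45, 50], dtype=float) # monthly precipitation

def simulate(temp, precip, melt_coeff=15.0):
    """Return monthly runoff (mm) from a simple degree-day snow model."""
    snowpack, runoff = 0.0, []
    for t, p in zip(temp, precip):
        if t <= 0:
            snowpack += p                         # precipitation accumulates as snow
            melt = 0.0
        else:
            melt = min(snowpack, melt_coeff * t)  # degree-day melt
            snowpack -= melt
        runoff.append(melt + (p if t > 0 else 0.0))
    return np.array(runoff)

base = simulate(temp_c, precip_mm)
warmer = simulate(temp_c + 3.0, precip_mm)        # crude warming scenario
for m, b, w in zip(months, base, warmer):
    print(f"{m}: baseline={b:5.1f} mm  +3C={w:5.1f} mm")
```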
Food, energy, and water (FEW) are primary resources required for human populations and ecosystems. Availability of the raw resources is essential, but equally important are the services that deliver resources to human populations, such as adequate access to safe drinking water, electricity, and sufficient food. Any failures in either resource availability or FEW resources-related services will have an impact on human health. The ability of countries to intervene and overcome the challenges in the FEW domain depends on governance, education, and economic capacities. We distinguish between FEW resources, FEW services, and FEW health outcomes to develop an analysis framework for evaluating interrelationships among these critical resources. The framework is applied using a data-driven approach for sub-Saharan African countries, a region with notable FEW insecurity challenges. The data-driven approach using a cross-validated stepwise regression analysis indicates that limited governance and socioeconomic capacity in sub-Saharan African countries, rather than lack of the primary resources, more significantly impact access to FEW services and associated health outcomes. The proposed framework helps develop a cohesive approach for evaluating FEW metrics and could be applied to other regions of the world to continue improving our understanding of the FEW nexus.
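As a rough illustration of a cross-validated stepwise regression, the sketch below uses scikit-learn's SequentialFeatureSelector with forward selection on placeholder country-level indicators. The variable names, data, and selection settings are assumptions, not the study's dataset or exact procedure.

```python
# Forward stepwise (sequential) feature selection with cross-validation on toy data.
import numpy as np
import pandas as pd
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 120
X = pd.DataFrame({
    "governance_index": rng.normal(size=n),
    "gdp_per_capita": rng.normal(size=n),
    "education_index": rng.normal(size=n),
    "rainfall_mm": rng.normal(size=n),
    "arable_land_pct": rng.normal(size=n),
})
# Toy outcome dominated by governance and economic capacity, echoing the pattern above.
y = 0.8 * X["governance_index"] + 0.6 * X["gdp_per_capita"] + rng.normal(0, 0.3, n)

selector = SequentialFeatureSelector(LinearRegression(), n_features_to_select=2,
                                     direction="forward", cv=5)
selector.fit(X, y)
print("Selected predictors:", list(X.columns[selector.get_support()]))
```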
The IEC 61215 and Qualification Plus indoor aging tests are recognized as valuable assessment procedures for identifying photovoltaic (PV) modules that are prone to early-life failures or excessive degradation. However, it is unclear how well the tests match with reality, and if they can predict in-field performance. Therefore, the present work performed indoor-aging thermal cycling tests on pristine-condition modules and evaluated, using in-field current and voltage (I-V) curve scans, modules of the same make and model exposed to the actual environment within a production field. The experiment included the estimate of the overall exposure to thermal cycling in both indoor and outdoor environments, the extraction of the series resistance from the I-V curves, the development of a model based on the indoor results, and finally the testing of the model on outdoor exposure amounts to predict actual changes in resistance.
There is growing interest in nexus research: energy-water, energy-water-land, and more recently food-energy-water. Motivating this movement is the recognition that the dynamics and feedbacks that constitute these nexuses have been overlooked in the past but are critical to the planning and management of these interacting elements. Formal reviews have identified gaps in current studies. In this commentary, we highlight additional oversights that are hindering integration of findings in nexus studies, notably usage of imprecise terminology to describe analyses, a failure to close the loop by linking production with corresponding waste streams, and exclusion of dynamics linking diverse constituent elements. Equally lacking from current nexus studies is a consistent protocol for communicating the conceptual basis of our studies. To fill this gap, we draw on diverse perspectives and fields to propose a comprehensive and systematic framework that can guide the model conceptualization phase of nexus studies. We also present a standardized documentation practice (similar to one utilized by the agent-based modeling community) to facilitate communication of nexus studies. These initiatives can improve our ability to account for and communicate the nuanced, food-energy-water nexus interactions in a consistent manner, which is necessary to better inform risk analysis and avoid decisions with unintended consequences and hidden costs to society.
Narratives about water resources have evolved, transitioning from a sole focus on physical and biological dimensions to incorporate social dynamics. Recently, the importance of understanding the visibility of water resources through media coverage has gained attention. This study leverages recent advancements in natural language processing (NLP) methods to characterize and understand patterns in water narratives, specifically in four local newspapers in Utah and Georgia. Analysis of the corpus identified coherent topics on a variety of water resources issues, including weather and pollution. Closer inspection of the topics revealed temporal and spatial variations in coverage, with a topic on hurricanes exhibiting cyclical patterns whereas a topic on tribal issues showed coverage predominantly in the western newspapers. We also analyzed the dataset for sentiments, identifying similar categories of words on trust and fear emerging in the narratives across newspaper sources. An analysis of novelty, transience, and resonance using Kullback-Leibler Divergence techniques revealed that topics with high novelty generally contained high transience and marginally high resonance over time. Although additional analysis needs to be conducted, the methods explored in this analysis demonstrate the potential of NLP methods to characterize water narratives in media coverage.
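The novelty/transience/resonance calculation can be sketched directly from topic distributions: novelty is the mean Kullback-Leibler divergence of a document from the preceding window of documents, transience the mean divergence from the following window, and resonance their difference. The example below uses random topic mixtures and an assumed window size purely to show the mechanics.

```python
# Novelty, transience, and resonance from KL divergences over topic distributions.
import numpy as np
from scipy.stats import entropy   # entropy(p, q) computes KL divergence D(p || q)

rng = np.random.default_rng(0)
docs = rng.dirichlet(alpha=np.ones(10), size=200)   # 200 documents x 10 topics
w = 5                                               # assumed window size

def novelty_transience_resonance(docs, i, w):
    past = [entropy(docs[i], docs[j]) for j in range(i - w, i)]
    future = [entropy(docs[i], docs[j]) for j in range(i + 1, i + 1 + w)]
    nov, tra = np.mean(past), np.mean(future)
    return nov, tra, nov - tra

for i in (w, 100, len(docs) - w - 1):
    nov, tra, res = novelty_transience_resonance(docs, i, w)
    print(f"doc {i:3d}: novelty={nov:.3f} transience={tra:.3f} resonance={res:.3f}")
```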
As the technological world expands, vulnerabilities of our critical infrastructure are becoming clear. Fortunately, emerging services provide an opportunity to improve the efficiency and security of current practices. In particular, serverless computing offerings (such as Amazon Web Services and REDFISHs Acequia) could be leveraged to modernize these practices. However, critical infrastructure needs to evolve carefully, and that evolution will require due diligence to ensure that transferring aspects of its practices onto the internet is done in a secure manner.
Sociohydrological studies use interdisciplinary approaches to explore the complex interactions between physical and social water systems and increase our understanding of emergent and paradoxical system behaviors. The dynamics of community values and social cohesion, however, have received little attention in modeling studies due to quantification challenges. Social structures associated with community-managed irrigation systems around the world, in particular, reflect these communities' experiences with a multitude of natural and social shocks. Using the Valdez acequia (a communally-managed irrigation community in northern New Mexico) as a simulation case study, we evaluate the impact of that community's social structure in governing its responses to water availability stresses posed by climate change. Specifically, a system dynamics model (developed using insights from community stakeholders and multiple disciplines that captures biophysical, socioeconomic, and sociocultural dynamics of acequia systems) was used to generate counterfactual trajectories to explore how the community would behave with streamflow conditions expected under climate change. We found that earlier peak flows, combined with adaptive measures of shifting crop selection, allowed for greater production of higher value crops and fewer people leaving the acequia. The economic benefits were lost, however, if downstream water pressures increased. Even with significant reductions in agricultural profitability, feedbacks associated with community cohesion buffered the community's population and land parcel sizes from more detrimental impacts, indicating the community's resilience under natural and social stresses. In conclusion, continued exploration of social structures is warranted to better understand these systems' responses to stress and identify possible leverage points for strengthening community resilience.