Open Charge Point Protocol (OCPP) 1.6 is widely used in the electric vehicle (EV) charging industry for communication between Charging Station Management Systems (CSMSs) and Electric Vehicle Supply Equipment (EVSE). Unlike OCPP 2.0.1, OCPP 1.6 uses unencrypted websocket communications to exchange information between EVSE devices and an on-premise or cloud-based CSMS. In this work, we first demonstrate two machine-in-the-middle attacks on OCPP sessions that terminate charging sessions and gain root access to the EVSE equipment via remote code execution. Second, we demonstrate a malicious firmware update with a code-injection payload to compromise an EVSE. Lastly, we demonstrate two methods to deny availability of the EVSE or CSMS. One of these, originally reported by SaiFlow, prevents traffic to legitimate EVSE equipment using a DoS-like attack on the CSMS by repeatedly connecting and authenticating several charge points (CPs) with the same identities as the legitimate CP. These vulnerabilities were demonstrated with proof-of-concept exploits in a virtualized Cyber Range at Wright State University and/or with a 350 kW Direct Current Fast Charger at Idaho National Laboratory. The team found that OCPP 1.6 could be protected from these attacks by adding secure shell tunnels to the protocol, if upgrading to OCPP 2.0.1 is not an option.
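The plaintext OCPP-J 1.6 framing that these attacks intercept can be sketched in a few lines. The CALL array layout ([2, uniqueId, action, payload]) follows the OCPP-J specification; the message identifier and transactionId below are hypothetical, not exploit payloads from the paper.

```python
import json

# OCPP-J 1.6 wraps each RPC in a JSON array sent over the websocket.
# MessageTypeId 2 = CALL (request). Without TLS, an on-path attacker can
# read and rewrite these frames in transit.

def make_call(unique_id: str, action: str, payload: dict) -> str:
    """Serialize an OCPP 1.6 CALL frame."""
    return json.dumps([2, unique_id, action, payload])

def parse_frame(raw: str):
    """Parse a frame back into (message_type, unique_id, action, payload)."""
    msg = json.loads(raw)
    return msg[0], msg[1], msg[2], msg[3]

# Example: the request a CSMS would send to stop a charging session.
frame = make_call("19223201", "RemoteStopTransaction", {"transactionId": 42})
mtype, uid, action, payload = parse_frame(frame)
```

Because the frame is plain JSON over an unencrypted socket, a machine-in-the-middle can alter the action or payload before forwarding it, which is what tunneling the session (or upgrading to OCPP 2.0.1 with TLS) prevents.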
Sandia National Laboratories and Idaho National Laboratory deployed state-of-the-art cybersecurity technologies within a virtualized, cyber-physical wind energy site to demonstrate their impact on security and resilience. This work was designed to better quantify cost-benefit tradeoffs and risk reductions when layering different security technologies on wind energy operational technology networks. Standardized step-by-step attack scenarios were drafted for adversaries with remote and local access to the wind network. Then, the team investigated the impact of encryption, access control, intrusion detection, security information and event management, and security orchestration, automation, and response (SOAR) tools on multiple metrics, including physical impacts to the power system and termination of the adversary kill chain. We found that, once programmed, the intrusion detection systems could detect attacks, and the SOAR system was able to effectively and autonomously quarantine the adversary prior to power system impacts. Cyber and physical metrics indicated network and endpoint visibility were essential to provide human defenders the situational awareness needed to maintain system resilience. Certain hardening technologies, like encryption, reduced adversary access, but recognition and response were also critical to maintain wind site operations. Lastly, a cost-benefit analysis was performed to estimate payback periods for deploying cybersecurity technologies based on projected breach costs.
The goal of the project was to protect US critical infrastructure and improve energy security through technical analysis of the risk landscape presented by the anticipated massive deployment of interoperable EV chargers.
Sangoleye, Fisayo; Johnson, Jay; Chavez, Adrian R.; Tsiropoulou, Eirini E.; Marton, Nicholas L.; Hentz, Charles R.; Yannarelli, Albert
Microgrids require reliable communication systems for equipment control, power delivery optimization, and operational visibility. To maintain secure communications, Microgrid Operational Technology (OT) networks must be defensible and cyber-resilient. The communication network must be carefully architected with appropriate cyber-hardening technologies to provide security defenders the data, analytics, and response capabilities to quickly mitigate malicious and accidental cyberattacks. In this work, we outline several best practices and technologies that can support microgrid operations (e.g., intrusion detection and monitoring systems, response tools, etc.). Then we apply these recommendations to the New Jersey TRANSITGRID use case to demonstrate how they would be deployed in practice.
Fragkos, Georgios; Johnson, Jay; Tsiropoulou, Eirini E.
A global transition to power grids with high penetrations of renewable energy generation is being driven in part by rapid installations of distributed energy resources (DER). New DER equipment includes standardized IEEE 1547-2018 communication interfaces and proprietary communications capabilities. Interoperable DER provides new monitoring and control capabilities. The existence of multiple entities with different roles and responsibilities within the DER ecosystem makes an Access Control (AC) mechanism necessary. In this paper, we introduce and compare two novel architectures, which provide a Role-Based Access Control (RBAC) service to the DER ecosystem's entities. Selecting an appropriate RBAC technology is important for the RBAC administrator and users who request DER access authorization. The first architecture is centralized, based on OpenLDAP, an open-source implementation of the Lightweight Directory Access Protocol (LDAP). The second approach is decentralized, based on a private Ethereum blockchain test network, where the RBAC model is stored and efficiently retrieved via a single Smart Contract. We have implemented two end-to-end Proofs-of-Concept (PoCs), respectively, to offer the RBAC service to the DER entities as web applications. Finally, an evaluation of the two approaches is presented, highlighting their key speed, cost, usability, and security features.
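Whichever backend stores the model, both architectures ultimately serve the same authorization decision. A minimal sketch follows, with hypothetical role and permission names; in the paper the model itself lives in OpenLDAP or an Ethereum smart contract rather than an in-memory map.

```python
# Hypothetical roles and permissions for a DER ecosystem; illustrative only.
ROLE_PERMISSIONS = {
    "utility_operator": {"read_measurements", "write_setpoints", "update_firmware"},
    "aggregator": {"read_measurements", "write_setpoints"},
    "installer": {"read_measurements"},
}

def is_authorized(role: str, permission: str) -> bool:
    """Authorization decision once the backend has resolved a user's role."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

The centralized/decentralized tradeoff is then about where `ROLE_PERMISSIONS` resides and how quickly and cheaply it can be queried and updated, which is what the paper's evaluation compares.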
Role-based access control (RBAC) is adopted in the information and communication technology domain for authentication purposes. However, due to the very large number of entities within organizational access control (AC) systems, static RBAC management can be inefficient, costly, and can lead to cybersecurity threats. In this article, a novel hybrid RBAC model is proposed, based on the principles of offline deep reinforcement learning (RL) and Bayesian belief networks. The considered framework utilizes a fully offline RL agent, which models the behavioral history of users as a Bayesian belief-based trust indicator. Thus, the initial static RBAC policy is improved in a dynamic manner through off-policy learning while guaranteeing compliance of internal users with the security rules of the system. By deploying our implementation within the smart grid domain and specifically within a Distributed Energy Resources (DER) ecosystem, we provide an end-to-end proof of concept of our model. Finally, detailed analysis and evaluation regarding the offline training phase of the RL agent are provided, while the online deployment of the hybrid RL-based RBAC model into the DER ecosystem highlights its key operational features and salient benefits over traditional RBAC models.
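A belief-based trust indicator of the kind described above can be illustrated with a Beta-distribution sketch. The uniform prior and the simple success/failure update rule are assumptions for illustration, not the paper's exact formulation.

```python
from dataclasses import dataclass

@dataclass
class TrustIndicator:
    """Beta-distribution belief over a user's rule compliance: alpha counts
    compliant actions, beta counts violations (uniform prior assumed)."""
    alpha: float = 1.0
    beta: float = 1.0

    def observe(self, compliant: bool) -> None:
        if compliant:
            self.alpha += 1.0
        else:
            self.beta += 1.0

    @property
    def trust(self) -> float:
        # Posterior mean of the compliance probability.
        return self.alpha / (self.alpha + self.beta)

user = TrustIndicator()
for _ in range(8):
    user.observe(True)   # eight rule-compliant actions
user.observe(False)      # one security-rule violation
```

An RL agent could then consume `user.trust` as a state feature when deciding whether to tighten or relax a user's effective role, which is the spirit of the hybrid policy improvement.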
In the near future, grid operators are expected to regularly use advanced distributed energy resource (DER) functions, defined in IEEE 1547-2018, to perform a range of grid-support operations. Many of these functions adjust the active and reactive power of the device through commanded or autonomous operating modes which induce new stresses on the power electronics components. In this work, an experimental and theoretical framework is introduced which couples laboratory-measured component stress with advanced inverter functionality and derives a reduction in useful lifetime based on an applicable reliability model. Multiple DER devices were instrumented to calculate the additional component stress under multiple reactive power setpoints to estimate associated DER lifetime reductions. A clear increase in switch loss was demonstrated as a function of irradiance level and power factor. This is replicated in the system-level efficiency measurements, although magnitudes were different, suggesting other loss mechanisms exist. Using an approximate Arrhenius thermal model for the switches, the experimental data indicate a lifetime reduction of 1.5% when operating the inverter at 0.85 power factor (PF), compared to unity PF, assuming the DER failure mechanism is thermally driven within the H-bridge. If other failure mechanisms are discovered for a set of power electronics devices, this testing and calculation framework can easily be tailored to those failure mechanisms.
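The Arrhenius thermal model underlying such lifetime estimates can be sketched as an acceleration-factor calculation. The 0.7 eV activation energy and the junction temperatures below are assumed placeholders, not values fitted in the study.

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(t_use_c: float, t_stress_c: float, ea_ev: float = 0.7) -> float:
    """Acceleration factor between two junction temperatures (deg C).
    ea_ev is an assumed activation energy, not a fitted value."""
    t_use_k = t_use_c + 273.15
    t_stress_k = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

# Hotter switch junctions (e.g., from added reactive power losses) accelerate
# a thermally driven failure mechanism, shortening a nominal 20-year life
# proportionally under continuous operation (illustrative numbers).
af = arrhenius_af(50.0, 75.0)
reduced_life_years = 20.0 / af
```

The same scaffold can be swapped for a different reliability model if a non-thermal failure mechanism dominates, as the abstract notes.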
As the U.S. electrifies the transportation sector, cyberattacks targeting vehicle charging could impact several critical infrastructure sectors including power systems, manufacturing, medical services, and agriculture. This is a growing area of concern as charging stations increase power delivery capabilities and must communicate with a range of entities (grid operators, vehicles, OEM vendors, charging network operators, etc.) to authorize charging, sequence the charging process, and manage load. The research challenges are numerous and complicated because many end users, stakeholders, and software and equipment vendor interests are involved. Poorly implemented electric vehicle supply equipment (EVSE), electric vehicle (EV), or grid operator communication systems could be a significant risk to EV adoption because the political, social, and financial impact of cyberattacks, or the public perception of such, would ripple across the industry and produce lasting effects. Unfortunately, there is currently no comprehensive EVSE cybersecurity approach, and limited best practices have been adopted by the EV/EVSE industry. There is an incomplete industry understanding of the attack surface, interconnected assets, and unsecured interfaces. Comprehensive cybersecurity recommendations founded on sound research are necessary to secure EV charging infrastructure. This project provided the power, security, and automotive industry with a strong technical basis for securing this infrastructure by developing threat models, determining technology gaps, and identifying or developing effective countermeasures. Specifically, the team created a cybersecurity threat model and performed a technical risk assessment of EVSE assets across multiple manufacturers and vendors, so that automotive, charging, and utility stakeholders could better protect customers, vehicles, and power systems in the face of new cyber threats.
Worldwide growth in electric vehicle use is prompting new installations of private and public electric vehicle supply equipment (EVSE). EVSE devices support the electrification of the transportation industry but also represent a linchpin for power systems and transportation infrastructures. Cybersecurity researchers have recently identified several vulnerabilities that exist in EVSE devices, communications to electric vehicles (EVs), and upstream services, such as EVSE vendor cloud services, third-party systems, and grid operators. The potential impact of attacks on these systems stretches from localized, relatively minor effects to long-term national disruptions. Fortunately, there is a strong and expanding collection of information technology (IT) and operational technology (OT) cybersecurity best practices that may be applied to the EVSE environment to secure this equipment. In this paper, we survey publicly disclosed EVSE vulnerabilities, the impact of EV charger cyberattacks, and proposed security protections for EV charging technologies.
In the near future, grid operators are expected to regularly use advanced distributed energy resource (DER) functions, defined in IEEE 1547-2018, to perform a range of grid-support operations. Many of these functions adjust the active and reactive power of the device through commanded or autonomous modes, which will produce new stresses on the grid-interfacing power electronics components, such as DC/AC inverters. In previous work, multiple DER devices were instrumented to evaluate additional component stress under multiple reactive power setpoints. We utilize quasi-static time-series simulations to determine the voltage-reactive power mode (volt-var) mission profiles of inverters in an active power system. Mission profiles and loss estimates are then combined to estimate the reduction of the useful life of inverters from different reactive power profiles. It was found that the average lifetime reduction was approximately 0.15% for an inverter between standard unity power factor operation and the IEEE 1547 default volt-var curve based on thermal damage due to switching in the power transistors. For an inverter with an expected 20-year lifetime, the 1547 volt-var curve would reduce the expected life of the device by 12 days. This framework for determining an inverter's useful life from experimental and modeling data can be applied to any failure mechanism and advanced inverter operation.
Inverters convert DC power to AC power that can be injected into the grid. Many inverters offer multiple, independent maximum power point trackers (MPPTs) to accommodate photovoltaic arrays with different orientations or capacities. No validated model for overall DC-to-AC power conversion efficiency is available for such inverters. Herein, we propose a mathematical model that describes the efficiency of a multi-MPPT inverter and present validation using a commercial inverter with six MPPT inputs.
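One plausible shape for such a model, sketched here with assumed loss coefficients rather than the validated parameters from the paper, treats AC output as total DC input minus a shared self-consumption term for the inverter stage plus per-MPPT conduction and switching losses.

```python
def multi_mppt_efficiency(p_dc_inputs, self_consumption=20.0,
                          k_linear=0.01, k_quad=1e-5):
    """Illustrative DC-to-AC efficiency for a multi-MPPT inverter.

    p_dc_inputs: DC power (W) at each MPPT input.
    self_consumption: fixed loss (W) of the shared inverter stage (assumed).
    k_linear, k_quad: per-input linear/quadratic loss coefficients (assumed).
    """
    p_dc_total = sum(p_dc_inputs)
    if p_dc_total <= 0.0:
        return 0.0
    p_loss = self_consumption + sum(
        k_linear * p + k_quad * p * p for p in p_dc_inputs
    )
    p_ac = max(p_dc_total - p_loss, 0.0)
    return p_ac / p_dc_total

# Six MPPT inputs, as in the commercial inverter used for validation.
eta_high = multi_mppt_efficiency([1000.0] * 6)
eta_low = multi_mppt_efficiency([100.0] * 6)
```

Because the loss terms are evaluated per input, unevenly loaded MPPTs (different array orientations or capacities) yield a different overall efficiency than the same total power split evenly, which is the behavior a single-input efficiency model cannot capture.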
Currently, the solar industry is operating with little application-specific guidance on how to protect and defend their systems from cyberattacks. This 3-year Department of Energy (DOE) Solar Energy Technologies Office-funded project helped advance the distributed energy resource (DER) cybersecurity state-of-the-art by (a) bolstering industry awareness of cybersecurity concepts, risks, and solutions through a webinar series and (b) developing recommendations for DER cybersecurity standards to improve the security performance of DER products and networks. Drafting DER standards is a lengthy, consensus-based process requiring effective leadership and stakeholder participation. This project was designed to reduce standard and guide writing times by creating well-researched recommendations that could act as a starting place for national and international standards development organizations. Working within the SunSpec/Sandia DER Cybersecurity Workgroup, the team produced guidance for DER cybersecurity certification, communication protocol standards, network architectures, access control, and patching. The team also led subgroups within the IEEE P1547.3 Guide for Cybersecurity of Distributed Energy Resources Interconnected with Electric Power Systems committee and pushed a draft to ballot in October 2021.
Ninad, Nayeem; Apablaza-Arancibia, Estefan; Bui, Michel; Johnson, Jay
As more countries seek solutions to their de-carbonization targets using renewable energy (RE) technologies, interconnection standards and national grid codes for distributed energy resources (DER) are being updated to support higher penetrations of RE and improve grid stability. Common grid-code revisions mandate that DER devices, such as solar inverters and energy storage systems, ride through (RT) voltage and frequency disturbances. This is necessary because as the percentage of generation from DER increases, there is a greater risk that power system faults will cause many or all DER to trip, triggering a substantial load-generation imbalance and possible cascading blackout. This paper demonstrates for the first time a methodology to verify commercial DER devices comply with the new voltage, frequency, and rate of change of frequency (ROCOF) RT requirements established in IEEE Std. 1547-2018. The methodology incorporates a software automation tool, called the SunSpec System Validation Platform (SVP), in combination with a hardware-in-the-loop (HIL) system to execute the IEEE Std. 1547.1-2020 RT test protocols. In this paper, the approach is validated with two commercial photovoltaic inverters, the test results are analyzed for compliance, and improvements to the test procedure are suggested.
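The pass/fail logic applied when post-processing a recorded disturbance can be sketched as follows. The voltage bands and ride-through durations in the table are illustrative stand-ins, not the actual IEEE 1547-2018 category values, which differ by abnormal-performance category.

```python
# Illustrative must-ride-through durations (seconds) by sag depth in per-unit
# voltage; real values come from the IEEE 1547-2018 category tables.
RT_TABLE = [(0.88, 5.0), (0.70, 2.5), (0.50, 0.16)]

def must_ride_through(v_pu: float) -> float:
    """Minimum time a DER must stay connected during a sag to v_pu."""
    for v_min, duration in RT_TABLE:
        if v_pu >= v_min:
            return duration
    return 0.0  # deeper sags: the device may cease to energize immediately

def rt_compliant(v_pu: float, tripped: bool,
                 trip_time_s: float = float("inf")) -> bool:
    """Pass if the DER stayed connected through the mandatory RT window."""
    return (not tripped) or trip_time_s >= must_ride_through(v_pu)
```

An automation tool like the SVP applies checks of this form to every HIL test run: drive the simulated disturbance, log whether and when the device tripped, and compare against the applicable requirement.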
There are now over 2.5 million Distributed Energy Resource (DER) installations connected to the U.S. power system. These installations represent a major portion of American electricity critical infrastructure and a cyberattack on these assets in aggregate would significantly affect grid operations. Virtualized Operational Technology (OT) equipment has been shown to provide practitioners with situational awareness and better understanding of adversary tactics, techniques, and procedures (TTPs). Deploying synthetic DER devices as honeypots and canaries would open new avenues of operational defense, threat intelligence gathering, and empower DER owners and operators with new cyber-defense mechanisms against the growing intensity and sophistication of cyberattacks on OT systems. Well-designed DER canary field deployments would deceive adversaries and provide early-warning notifications of adversary presence and malicious activities on OT networks. In this report, we present progress to design a high-fidelity DER honeypot/canary prototype in a late-start Laboratory Directed Research and Development (LDRD) project.
While computer systems, software applications, and operational technology (OT)/Industrial Control System (ICS) devices are regularly updated through automated and manual processes, there are several unique challenges associated with distributed energy resource (DER) patching. Millions of DER devices from dozens of vendors have been deployed in home, corporate, and utility network environments that may or may not be internet-connected. These devices make up a growing portion of the electric power critical infrastructure system and are expected to operate for decades. During that operational period, it is anticipated that critical and noncritical firmware patches will be regularly created to improve DER functional capabilities or repair security deficiencies in the equipment. The SunSpec/Sandia DER Cybersecurity Workgroup created a Patching Subgroup to investigate appropriate recommendations for DER patching, holding fortnightly meetings for more than nine months. The group focused on DER equipment, but the observations and recommendations contained in this report also apply to DERMS tools and other OT equipment used in the end-to-end DER communication environment. The group found many standards and guides that discuss firmware lifecycles, patch and asset management, and code-signing implementations, but none singularly covers the needs of the DER industry. This report collates best practices from these standards organizations and establishes a set of best practices that may be used as a basis for future national or international patching guides or standards.
The sophistication and regularity of power system cybersecurity attacks have been growing in the last decade, leading researchers to investigate new, innovative, cyber-resilient tools to help grid operators defend their networks and power systems. One promising approach is to apply recent advances in deep reinforcement learning (DRL) to aid grid operators in making real-time changes to the power system equipment to counteract malicious actions. While multiple transmission studies have been conducted in the past, in this work we investigate the possibility of defending distribution power systems using a DRL agent that has control of a collection of utility-owned distributed energy resources (DER). A game board using a modified version of the IEEE 13-bus model was simulated using OpenDSS to train the DRL agent and compare its performance to a random agent, a greedy agent, and human players. Both the DRL agent and the greedy approach performed well, suggesting a greedy approach can be appropriate for computationally tractable system configurations and a DRL agent is a viable path forward for systems of increased complexity. This work paves the way to create multi-player distribution system control games which could be designed to defend the power grid under a sophisticated cyberattack.
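The greedy baseline can be sketched with a toy sensitivity model standing in for the OpenDSS power flow; all numbers (base voltage, sensitivity, candidate setpoints) are illustrative, not parameters from the study.

```python
# Toy stand-in for the power-flow evaluation: bus voltage rises linearly
# with each DER's reactive power injection (illustrative model only).
def voltage_violation(q_setpoints, sensitivity=0.05, v_base=0.94):
    """Total per-unit voltage deviation from 1.0 across DER buses."""
    return sum(abs(v_base + sensitivity * q - 1.0) for q in q_setpoints)

def greedy_dispatch(n_der, candidate_q=(-1.0, 0.0, 0.5, 1.0, 1.5)):
    """Greedily assign each DER the setpoint that most reduces the violation,
    holding the other DER at their current values."""
    q = [0.0] * n_der
    for i in range(n_der):
        q[i] = min(candidate_q,
                   key=lambda c: voltage_violation(q[:i] + [c] + q[i + 1:]))
    return q
```

A DRL agent replaces this one-step lookahead with a learned policy, which is what allows it to scale to system configurations where exhaustively evaluating each candidate action becomes intractable.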
The American distributed energy resource (DER) interconnection standard, IEEE Std. 1547, was updated in 2018 to include standardized interoperability functionality. As state regulators begin ratifying these requirements, all DER - such as photovoltaic (PV) inverters, energy storage systems (ESSs), and synchronous generators - in those jurisdictions must include a standardized SunSpec Modbus, IEEE 2030.5, or IEEE 1815 (DNP3) communication interface. Utilities and authorized third parties will interact with these DER interfaces to read nameplate information, power measurements, and alarms as well as configure the DER settings and grid-support functionality. In 2020, the certification standard IEEE 1547.1 was revised with test procedures for evaluating the IEEE 1547-2018 interoperability requirements. In this work, we present an open-source framework to evaluate DER interoperability. To demonstrate this capability, we used four test devices: a SunSpec DER Simulator with a SunSpec Modbus interface, an EPRI-developed DER simulator with an IEEE 1815 interface, a Kitu Systems DER simulator with an IEEE 2030.5 interface, and an EPRI IEEE 2030.5-to-Modbus converter. By making this test platform openly available, DER vendors can validate their implementations, utilities can spot check communications to DER equipment, certification laboratories can conduct type testing, and research institutions can more easily research DER interoperability and cybersecurity. We identify several limitations and ambiguities in the communication protocols, information models, and the IEEE 1547.1-2020 test protocol that were exposed by these evaluations, in anticipation that the standards-development organizations will address these issues in the future.
Cybersecurity for internet-connected Distributed Energy Resources (DER) is essential for the safe and reliable operation of the US power system. Many facets of DER cybersecurity are currently being investigated within different standards development organizations, research communities, and industry committees to address this critical need. This report covers DER access control guidance compiled by the Access Controls Subgroup of the SunSpec/Sandia DER Cybersecurity Workgroup. The goal of the group was to create a consensus-based technical framework to minimize the risk of unauthorized access to DER systems. The subgroup set out to define a strict control environment where users are authorized to access DER monitoring and control features through three steps: (a) the user is identified using a proof-of-identity, (b) the user is authenticated against a managed database, and (c) the user is authorized for a specific level of access. DER access control also provides accountability and nonrepudiation within the power system control environment that can be used for forensic analysis and attribution in the event of a cyberattack. This paper covers foundational requirements for a DER access control environment as well as offering a collection of possible policy, model, and mechanism implementation approaches for IEEE 1547-mandated communication protocols.
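The three-step control environment can be sketched as follows. The user records, secrets, and append-only log are hypothetical stand-ins; real nonrepudiation would require cryptographically signed records rather than a plain list.

```python
# Hypothetical credential store and role-to-access mapping (illustrative).
USERS = {"op01": {"secret": "s3cret!", "role": "utility_operator"}}
ROLE_ACCESS = {"utility_operator": {"read", "write"}, "viewer": {"read"}}
AUDIT_LOG = []  # stand-in for a tamper-evident audit trail

def request_access(user_id: str, secret: str, operation: str) -> bool:
    user = USERS.get(user_id)                                   # (a) identification
    authenticated = user is not None and user["secret"] == secret  # (b) authentication
    granted = authenticated and operation in ROLE_ACCESS.get(
        user["role"], set())                                    # (c) authorization
    AUDIT_LOG.append((user_id, operation, granted))  # accountability record
    return granted
```

Every request, granted or denied, leaves an audit entry, which is the property the subgroup highlights for forensic analysis and attribution after an incident.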
As the US electrifies the transportation sector, cyberattacks targeting vehicle charging could bring consequences to electrical system infrastructure. This is a growing area of concern as charging stations increase power delivery and must communicate with a range of entities (grid operators, vehicles, OEM vendors, charging network operators, etc.) to authorize charging, sequence the charging process, and manage load. The research challenges are numerous and complicated because many end users, stakeholders, and software and equipment vendor interests are involved. Poorly implemented electric vehicle supply equipment (EVSE), electric vehicle (EV), or grid communication system cybersecurity could be a significant risk to EV adoption because the political, social, and financial impact of cyberattacks, or the public perception of such, ripples across the industry and has lasting and devastating effects. Unfortunately, there is no comprehensive EVSE cybersecurity approach, and limited best practices have been adopted by the EV/EVSE industry. There is an incomplete industry understanding of the attack surface, interconnected assets, and unsecured interfaces. Thus, comprehensive cybersecurity recommendations founded on sound research are necessary to secure EV charging infrastructure. This project is providing the power, security, and automotive industry with a strong technical basis for securing this infrastructure by developing threat models, determining technology gaps, and identifying or developing effective countermeasures. Specifically, the team is creating a cybersecurity threat model and performing a technical risk assessment of EVSE assets, so that automotive, charging, and utility stakeholders can better protect customers, vehicles, and power systems in the face of new cyber threats.
Increasing penetrations of interoperable distributed energy resources (DER) in the electric power system are expanding the power system attack surface. Maloperation or malicious control of DER equipment can now cause substantial disturbances to grid operations. Fortunately, many options exist to defend and limit adversary impact on these newly-created DER communication networks, which typically traverse the public internet. However, implementing these security features will increase communication latency, thereby adversely impacting real-time DER grid support service effectiveness. In this work, a collection of software tools called SCEPTRE was used to create a co-simulation environment where SunSpec-compliant photovoltaic inverters were deployed as virtual machines and interconnected to simulated communication network equipment. Network segmentation, encryption, and moving target defence security features were deployed on the control network to evaluate their influence on cybersecurity metrics and power system performance. The results indicated that adding these security features did not impact DER-based grid control systems but improved the cybersecurity posture of the network when implemented appropriately.
Grid operators are now considering using distributed energy resources (DERs) to provide distribution voltage regulation rather than installing costly voltage regulation hardware. DER devices include multiple adjustable reactive power control functions, so grid operators have the difficult decision of selecting the best operating mode and settings for the DER. In this work, we develop a novel state estimation-based particle swarm optimization (PSO) for distribution voltage regulation using DER-reactive power setpoints and establish a methodology to validate and compare it against alternative DER control technologies (volt-VAR (VV), extremum seeking control (ESC)) in increasingly higher fidelity environments. Distribution system real-time simulations with virtualized and power hardware-in-the-loop (PHIL)-interfaced DER equipment were run to evaluate the implementations and select the best voltage regulation technique. Each method improved the distribution system voltage profile; VV did not reach the global optimum but the PSO and ESC methods optimized the reactive power contributions of multiple DER devices to approach the optimal solution.
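A minimal particle swarm sketch of the setpoint search follows, using conventional inertia and acceleration weights rather than the tuned values from the study; the toy objective stands in for the state estimation-based voltage deviation metric.

```python
import random

def pso_minimize(objective, dim, bounds, n_particles=20, iters=60, seed=1):
    """Minimal PSO: minimize objective(list[float]) over a box constraint.
    w, c1, c2 are conventional defaults, not the paper's tuned weights."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective: deviation is minimized when every DER injects q = 0.3 pu.
best_q, best_val = pso_minimize(lambda q: sum((x - 0.3) ** 2 for x in q),
                                dim=3, bounds=(-1.0, 1.0))
```

In the paper's setup, each particle encodes one reactive power setpoint per DER and the objective is evaluated against the estimated feeder state, so the swarm converges toward the setpoint vector that best flattens the voltage profile.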
Under its Grid Modernization Initiative, the U.S. Department of Energy (DOE), in collaboration with energy industry stakeholders, developed a multi-year research plan to support modernizing the electric grid. One of the foundational projects for accelerating modernization efforts is information and communications technology interoperability. A key element of this project has been the development of a methodology for engaging ecosystems related to grid integration to create roadmaps that advance the ease of integration of related smart technology. This document is the product of activities undertaken in 2017 through 2019. It provides a Cybersecurity Plan describing the technology to be adopted in the project with details as per the GMLC Call document.
Increasing solar energy penetrations may create challenges for distribution system operations because production variability can lead to large voltage deviations or protection system miscoordination. Instituting advanced management systems on distribution systems is one promising method for combating these challenges by intelligently controlling distribution assets to regulate voltage and ensure protection safety margins. While it is generally not the case today, greater deployment of power system sensors and interoperable distributed energy resources (DER), e.g., photovoltaic (PV) inverters, energy storage systems (ESSs), and electric vehicles (EVs), will enable situational awareness, control, and optimization of distribution systems. In this work, a control system was created which measures power system parameters to estimate the status of a feeder, forecasts the distribution state over a short-term horizon, and issues optimal set point commands to distribution-connected equipment to regulate voltage and protect the system. This two-year project integrated multiple research innovations into a management system designed to safely allow PV penetrations of 50% or greater. The integrated software was demonstrated through extensive real-time (RT) and power hardware-in-the-loop studies and a field demonstration on a live power system with a 684 kVA PV system.
To ensure reliable and predictable electrical grid service among renewable distributed energy resources (DERs), it is important to gauge the level of trust present within critical components and DER aggregators (DERAs). Although trust throughout a smart grid is temporal and dynamically varies according to measured states, it is possible to accurately formulate communications and service level strategies based on such trust measurements. Utilizing an effective set of machine learning and statistical methods, it is shown that establishment of trust levels between DERAs using behavioral pattern analysis is possible. Further, it is also shown that the establishment of such trust can facilitate simple secure communications routing between DERAs. Providing secure routing between DERAs enables a grid operator to maintain service level agreements to its customers, reduce the attack surface, and increase operational resiliency.
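Trust-based route selection of this kind can be sketched as a shortest-path search over -log(trust) edge weights, so the chosen path maximizes the product of per-link trust scores. The graph and scores below are illustrative, not measurements from the paper.

```python
import heapq
import math

def most_trusted_path(edges, src, dst):
    """Dijkstra over -log(trust) edge weights (0 < trust <= 1), so minimizing
    path cost maximizes the product of link trust scores along the path."""
    graph = {}
    for u, v, trust in edges:
        w = -math.log(trust)  # high trust -> low weight
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Illustrative DERA graph: a two-hop route through B (0.9 * 0.9 = 0.81)
# is more trusted than the direct A-C link (0.5).
route = most_trusted_path([("A", "B", 0.9), ("B", "C", 0.9), ("A", "C", 0.5)],
                          "A", "C")
```

As the behavioral-analysis trust scores are re-estimated over time, re-running the search lets the operator steer traffic away from components whose trust has degraded.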
Extensive deployment of interoperable distributed energy resources (DER) is increasing the power system cyber security attack surface. National and jurisdictional interconnection standards require DER to include a range of autonomous and commanded grid-support functions, which can drastically influence power quality, voltage, and bulk system frequency. Here, the authors investigate the impact to the cyber-physical power system in scenarios where communications and operations of DER are controlled by an adversary. The findings show that each grid-support function exposes the power system to distinct types and magnitudes of risk. The physical impact from cyber actions was analysed in cases of DER providing distribution system voltage regulation and transmission system support. Finally, recommendations are presented for minimising the risk using engineered parameter limits and segmenting the control network to minimise common-mode vulnerabilities.
Under its Grid Modernization Initiative, the U.S. Department of Energy (DOE), in collaboration with energy industry stakeholders, developed a multi-year research plan to support modernizing the electric grid. One of the foundational projects for accelerating modernization efforts is information and communications technology interoperability. A key element of this project has been the development of a methodology for engaging ecosystems related to grid integration to create roadmaps that advance the ease of integration of related smart technology. This document is the product of activities undertaken in 2017 through 2019. It provides a Cybersecurity Plan describing the technology to be adopted in the project with details as per the GMLC Call document.
An increasing number of public utility commissions are adopting Distributed Energy Resource (DER) interconnection standards which require photovoltaic (PV) inverters, energy storage systems, and other DER to include interoperable grid-support functionality. The recently updated national standard, IEEE 1547-2018, requires all DER to include a SunSpec Modbus, IEEE 2030.5, or IEEE 1815 communication interface in order to provide local and bulk power system services. Those communication protocols and associated information models will ensure system interoperability for PV and storage systems, but these new utility-to-DER communication networks must be deployed with sufficient cybersecurity to protect the U.S. power system and other critical infrastructure reliant on dependable power. Unlike bulk generators, DER are commonly connected to grid operators via public internet channels. These DER networks are exposed to a large attack surface that may leverage sophisticated techniques and infrastructure developed on IT systems, including remote exploits and distributed attacks. Although DER make up a growing portion of the national generation mix, they have limited processing capabilities and do not typically support modern security features such as encryption or authentication. In this work, Sandia National Laboratories constructed simulated DER communication networks with a range of security features in order to study the security posture of different communication approaches. The experimental test environment was created in a Sandia-developed co-simulation platform, called SCEPTRE, which emulated SunSpec-compliant DER equipment, the utility DER management system, communication network, and distribution power system. Adversary-based assessments were conducted, and quantitative scoring criteria were applied to evaluate the resilience of various architectures against cyber attacks and to measure the systemic impact during such attacks. The team found that network segmentation, encryption, and moving target defense improved the security of these networks and would be recommended for utility, aggregator, and local DER networks.
Recently developed Distributed Energy Resource (DER) interoperability standards include communication and cyber security requirements. In 2018, the US national interconnection standard, IEEE 1547, was revised to require DER to include a SunSpec Modbus, IEEE 2030.5 (Smart Energy Profile, SEP 2.0), or IEEE 1815 (DNP3) communication interface but does not include any normative overarching cybersecurity requirements. IEEE 2030.5 and associated implementation requirements for California, known as the California Smart Inverter Profile (CSIP), prescribe the greatest security features, including encryption, authentication, and key management requirements. SunSpec Modbus and IEEE 1815 security requirements are not as comprehensive, leading to implementation questions throughout the industry. Further, while the security features in IEEE 2030.5 are commonly used in computing platforms, there are still questions of how well the technologies will scale in highly distributed, computationally limited inverter environments. In this paper, (a) the elements of IEEE 2030.5 encryption, authentication, and key management guidelines are analyzed, (b) potential scalability gaps are identified, and (c) alternative technologies are explored for possible inclusion in DER interoperability or cyber security standards.
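To make the IEEE 2030.5 security requirements discussed above concrete, the following is a minimal sketch, using Python's standard `ssl` module, of a client context reflecting 2030.5-style transport security (TLS 1.2 or newer with mutual certificate authentication). The certificate file paths in the comments are hypothetical placeholders, not names mandated by the standard.

```python
import ssl

def build_2030_5_client_context():
    """Sketch of a TLS context with IEEE 2030.5-style requirements:
    TLS 1.2+, server certificate verification, and (in deployment)
    a client certificate for mutual authentication."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # forbid older TLS/SSL
    ctx.verify_mode = ssl.CERT_REQUIRED           # reject unauthenticated peers
    ctx.check_hostname = True
    # A real device would also load its own certificate chain and the
    # trust anchors from the DER PKI (hypothetical file names):
    # ctx.load_cert_chain("device_cert.pem", "device_key.pem")
    # ctx.load_verify_locations("ca_bundle.pem")
    return ctx
```

The same context could then be passed to any socket or HTTP client that accepts an `ssl.SSLContext`.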
An increasing number of jurisdictions are adopting Distributed Energy Resource (DER) interconnection standards which require photovoltaic (PV) inverters, energy storage systems, and other DER to include interoperable grid-support functionality. These functions provide grid operators the knobs to support local and bulk power system operations with DER equipment, but the associated grid operator-to-DER communications networks must be deployed with appropriate cybersecurity features. In some situations, additional security features may prevent control system scalability or increase communication latencies and dropouts. These unintended consequences of the security features would therefore hinder the ability of the grid operator to implement specific control algorithms. This project evaluated the tradeoffs between power system performance and cybersecurity metrics for several grid services. This was conducted in two parts.
Extensive deployment of interoperable distributed energy resources (DER) on power systems is increasing the power system cybersecurity attack surface. National and jurisdictional interconnection standards require DER to include a range of autonomous and commanded grid-support functions which can drastically influence power quality, voltage, and bulk system frequency. This project was split into two phases. The first provided a survey and roadmap of cybersecurity for the solar industry. The second investigated multiple PV cybersecurity research and development (R&D) concepts identified in the first phase. In the first year, the team created a roadmap for improving cybersecurity for distributed solar energy resources. This roadmap was intended to provide direction for the nation over the next five years; it focused on the intersection of industry and government and recommended activities in four related areas: stakeholder engagement, cybersecurity research and development, standards development, and industry best practices. At the same time, the team produced a primer for DER vendors, aggregators, and grid operators to establish a common taxonomy and describe basic principles of cybersecurity, encryption, communication protocols, DER cybersecurity recommendations and requirements, and device-, aggregator-, and utility-level security best practices to ensure data confidentiality, integrity, and availability. This material was motivated by the need to assist the broader PV industry with cybersecurity resilience and describe the state of the art for securing DER communications. Lastly, an adversary-based assessment of multiple PV devices was completed at the Distributed Energy Technologies Laboratory at Sandia National Laboratories to determine the status of industry cybersecurity practices. The team found multiple deficiencies in the security features of the assessed devices.
In the second year, a set of recommendations was created for DER communication protocols, especially with respect to the state-of-the-art requirements in IEEE 2030.5. Additionally, several cybersecurity R&D technologies related to communications-enabled photovoltaic systems were studied to harden DER communication networks. Specifically, the team investigated (a) using software-defined networking to create a moving target defense system for DER communications, and (b) engineering controls that prevent misprogramming or adversary action on DER devices/networks by disallowing setpoints that would generate unstable power system operations.
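The engineering-control concept in (b) can be sketched as a simple bounds check applied before any commanded setpoint reaches a DER device. The parameter names and limit values below are illustrative assumptions, not values from the project.

```python
# Illustrative engineered limits for commanded DER setpoints. A command that
# falls outside these bounds (or names an unknown parameter) is rejected
# before it can reach the device, whether it came from a misconfiguration
# or an adversary.
LIMITS = {
    "power_factor": (0.85, 1.0),          # absolute PF magnitude
    "ramp_rate_pct_per_s": (0.1, 100.0),  # % of rated power per second
    "reactive_power_pct": (-44.0, 44.0),  # % of available reactive power
}

def validate_setpoint(name, value):
    """Return True only if the commanded value is within engineered bounds."""
    if name not in LIMITS:
        return False  # unknown parameter: fail closed
    lo, hi = LIMITS[name]
    return lo <= value <= hi
```

In practice such a check would sit in a gateway or protocol translator between the grid operator's network and the DER fleet, so that even a fully compromised management system cannot issue destabilizing commands.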
This paper focuses on a transmission system with a high penetration of converter-interfaced generators participating in its primary frequency regulation. In particular, the effects on system stability of widespread misconfiguration of frequency regulation schemes are considered. Failures in three separate primary frequency control schemes are analyzed by means of time domain simulations where control action was inverted by, for example, negating controller gain. The results indicate that in all cases the frequency response of the system is greatly deteriorated and, in multiple scenarios, the system loses synchronism. It is also shown that including limits to the control action can mitigate the deleterious effects of inverted control configurations.
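The effect of an inverted control gain can be reproduced with a toy model. The sketch below (illustrative parameters, not the paper's test system) integrates a one-bus swing equation with a proportional droop response; negating the gain turns the stabilizing feedback into positive feedback, and the frequency deviation diverges instead of settling.

```python
# Toy single-bus frequency model with primary (droop) control. All parameters
# are illustrative and do not correspond to the paper's test system.

def simulate(gain, steps=2000, dt=0.01):
    """Forward-Euler integration of 2H * d(delta_f)/dt = P_imb - D*delta_f + P_ctrl."""
    H, D = 5.0, 1.0      # inertia constant (s) and load damping (pu)
    p_imbalance = -0.1   # sudden generation deficit (pu)
    delta_f = 0.0        # per-unit frequency deviation
    for _ in range(steps):
        p_ctrl = -gain * delta_f  # droop response: oppose the deviation
        delta_f += dt * (p_imbalance - D * delta_f + p_ctrl) / (2 * H)
    return delta_f

healthy = simulate(gain=20.0)    # correct sign: settles near -P_imb/(D + gain)
inverted = simulate(gain=-20.0)  # inverted gain: positive feedback, diverges
```

Adding a limiter on `p_ctrl`, as the paper suggests, would cap the destabilizing contribution even when the gain sign is wrong.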
While the concept of aggregating and controlling renewable distributed energy resources (DERs) to provide grid services is not new, increasing policy support of DER market participation has driven research and development in algorithms to pool DERs for economically viable market participation. Sandia National Laboratories recently undertook a three-year research program to create the components of a real-world virtual power plant (VPP) that can simultaneously participate in multiple markets. Our research extends current state-of-the-art rolling horizon control through the application of stochastic programming with risk aversion at various time resolutions. Our rolling horizon control consists of (1) day-ahead optimization to produce an hourly aggregate schedule for the VPP operator and (2) sub-hourly optimization for real-time dispatch of each VPP subresource. Both optimization routines leverage a two-stage stochastic program (SP) with risk aversion, and integrate the most up-to-date forecasts to generate probabilistic scenarios in real operating time. Our results demonstrate the benefits to the VPP operator of constructing a stochastic solution regardless of the weather. In more extreme weather, applying risk optimization strategies can dramatically increase the financial viability of the VPP. As a result, the methodologies presented here can be further tailored for optimal control of any VPP asset fleet and its operational requirements.
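As a hedged illustration of the risk-aversion component only (not the authors' two-stage formulation), the conditional value-at-risk (CVaR) that such stochastic programs penalize can be computed over a discrete scenario set as follows:

```python
def cvar(losses, probs, alpha=0.9):
    """Conditional value-at-risk: expected loss in the worst (1 - alpha)
    probability tail of a discrete scenario set."""
    order = sorted(range(len(losses)), key=lambda i: losses[i])
    tail, acc = 0.0, 0.0
    for i in reversed(order):              # walk from the worst loss downward
        take = min(probs[i], (1 - alpha) - acc)  # tail mass still to allocate
        if take <= 0:
            break
        tail += take * losses[i]
        acc += take
    return tail / (1 - alpha)
```

A risk-averse schedule would then be chosen to minimize a weighted sum of expected cost and `cvar` across the probabilistic scenarios, with the weight controlling how conservative the VPP dispatch is.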
Cybersecurity is essential for interoperable power systems and transportation infrastructure in the US. As the US transitions to transportation electrification, cyber attacks on vehicle charging could impact nearly all US critical infrastructure. This is a growing area of concern as more charging stations communicate to a range of entities (grid operators, vehicles, OEM vendors, etc.), as shown in Figure I.1.1.1. The research challenges are extensive and complicated because there are many end users, stakeholders, and software and equipment vendors. Poorly implemented electric vehicle supply equipment (EVSE) cybersecurity is a major risk to electric vehicle (EV) adoption because the political, social, and financial impact of cyberattacks—or public perception of such—ripples across the industry and has lasting and devastating effects. Unfortunately, there is no comprehensive EVSE cybersecurity approach and limited best practices have been adopted by the EV/EVSE industry. For this reason, there is an incomplete industry understanding of the attack surface, interconnected assets, and unsecured interfaces. Thus, comprehensive cybersecurity recommendations founded on sound research are necessary to secure EV charging infrastructure. This project is providing the automotive industry with a strong technical basis for securing this infrastructure by developing threat models, prioritizing technology gaps, and developing effective countermeasures. Specifically, the team is creating a cybersecurity threat model and performing a technical risk assessment of EVSE assets, so that automotive, charging, and utility stakeholders can better protect customers, vehicles, and power systems in the face of new cyber threats.
Grid operators are increasingly turning to advanced grid-support functions in distributed energy resources (DER) to assist with distribution circuit voltage regulation, bulk system frequency control, and power system protection. The U.S. DER certification standard, Underwriters Laboratories (UL) 1741, was revised in September 2016 to add test procedures for multiple grid-support functions. Sandia National Laboratories, SunSpec Alliance, and a growing community of collaborators have undertaken a multiyear effort to create an open-source system validation platform (SVP) that automates DER interconnection and interoperability test procedures by communicating with grid simulators, photovoltaic (PV) simulators, data acquisition systems, and interoperable equipment under test. However, the power hardware required for generating the test conditions may be untenable for many organizations. Herein, we discuss development of the SVP testing capabilities for UL 1741 tests utilizing a controller hardware-in-the-loop testbed that precludes the need for power hardware, using a 34.5 kW Austrian Institute of Technology smart grid controller. Analysis of the normal ramp rate, soft start ramp rate, specified power factor, volt-VAr, and frequency-watt advanced grid functions, and the effectiveness of the UL 1741 test protocols, are included.
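The volt-VAr function exercised by these tests can be sketched as piecewise-linear interpolation over a curve of (voltage, reactive power) points. The curve points below are illustrative (per-unit voltage versus percent of available reactive power), not the UL 1741 defaults.

```python
# Illustrative volt-VAr curve: inject vars at low voltage, absorb at high
# voltage, with a deadband around nominal. Points are (V in pu, Q in % Qmax).
VOLT_VAR_CURVE = [(0.95, 44.0), (0.98, 0.0), (1.02, 0.0), (1.05, -44.0)]

def volt_var(v):
    """Piecewise-linear interpolation with flat extrapolation past the ends."""
    pts = VOLT_VAR_CURVE
    if v <= pts[0][0]:
        return pts[0][1]
    if v >= pts[-1][0]:
        return pts[-1][1]
    for (v1, q1), (v2, q2) in zip(pts, pts[1:]):
        if v1 <= v <= v2:
            return q1 + (q2 - q1) * (v - v1) / (v2 - v1)
```

An automated test harness like the SVP sweeps the simulated grid voltage, reads back the inverter's reactive power, and compares it against this expected curve within the standard's tolerances.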
Cyber-secure, resilient energy is paramount to the prosperity of the United States. As the experience and sophistication of cyber adversaries grow, so too must the US power system’s defenses, situational awareness, and response and recovery strategies. Traditionally, power systems were operated with dedicated communication channels to large generators and utility-owned assets, but now there is greater reliance on photovoltaic (PV) systems to provide power generation. PV systems often communicate to utilities, aggregators, and other grid operators over the public internet, so the power system attack surface has significantly expanded. At the same time, solar energy systems are equipped with a range of grid-support functions that, if controlled or programmed improperly, present a risk of power system disturbances. This document is a five-year roadmap intended to chart a path for improving cyber security for communication-enabled PV systems with clear roles and responsibilities for government, standards development organizations, PV vendors, and grid operators.
This report provides an introduction to cyber security for distributed energy resources (DER), such as photovoltaic (PV) inverters and energy storage systems (ESS). This material is motivated by the need to assist DER vendors, aggregators, grid operators, and the broader PV industry with cyber security resilience and describe the state of the art for securing DER communications. The report outlines basic principles of cyber security, encryption, communication protocols, DER cyber security recommendations and requirements, and device-, aggregator-, and utility-level security best practices to ensure data confidentiality, integrity, and availability. Example cyber security attacks, including eavesdropping, masquerading, man-in-the-middle, replay attacks, and denial-of-service, are also described. A survey of communication protocols and cyber security recommendations used by the DER and power system industry is included to elucidate the cyber security standards landscape. Lastly, a roadmap is presented to harden end-to-end communications for DER through research and industry engagement.
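As a minimal sketch of the integrity and replay-protection concepts the report describes (assuming a pre-shared key, which is only one of several key-management options it covers), a timestamped HMAC lets a receiver reject tampered, forged, or stale DER commands:

```python
import hashlib
import hmac

KEY = b"example-shared-secret"  # illustrative only; real keys come from a PKI/KMS
WINDOW_S = 30                   # accept messages no older than this (seconds)

def sign(command: bytes, now: float) -> bytes:
    """Attach a timestamp and an HMAC-SHA256 tag to a command."""
    stamp = str(int(now)).encode()
    tag = hmac.new(KEY, stamp + b"|" + command, hashlib.sha256).hexdigest().encode()
    return stamp + b"|" + command + b"|" + tag

def verify(msg: bytes, now: float) -> bool:
    """Reject messages that are tampered (integrity), forged (masquerade),
    or older than the freshness window (replay)."""
    stamp, command, tag = msg.split(b"|")
    expected = hmac.new(KEY, stamp + b"|" + command, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expected):
        return False
    return (now - int(stamp)) <= WINDOW_S
```

The freshness window blocks straightforward replay of captured traffic; stronger schemes would add per-message nonces or sequence numbers, as the report's protocol survey discusses for DNP3 Secure Authentication and TLS.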