Volatile Geopolitics Shake the Trends of the 2022 Cybersecurity Threat Landscape

With the geopolitical context giving rise to cyberwarfare and hacktivism, alarming cyber operations and malicious cyberattacks have altered the trends captured in the 10th edition of the Threat Landscape report released by the European Union Agency for Cybersecurity (ENISA).

The ENISA Threat Landscape 2022 (ETL) report is the annual report of the EU Agency for Cybersecurity on the state of the cybersecurity threat landscape. The 10th edition covers a reporting period from July 2021 to July 2022.

With more than 10 terabytes of data stolen monthly, ransomware remains one of the prime threats in the new report, with phishing now identified as the most common initial vector of such attacks. The other threats ranking highest alongside ransomware are attacks against availability, also called Distributed Denial of Service (DDoS) attacks.

However, the geopolitical situation, particularly the Russian invasion of Ukraine, has acted as a game changer for the global cyber domain over the reporting period. While we still observe an increase in the number of threats, we also see a wider range of vectors emerge, such as zero-day exploits and AI-enabled disinformation and deepfakes. As a result, more malicious and widespread attacks with more damaging impact have emerged.

EU Agency for Cybersecurity Executive Director, Juhan Lepassaar stated that “Today's global context is inevitably driving major changes in the cybersecurity threat landscape. The new paradigm is shaped by the growing range of threat actors. We enter a phase which will need appropriate mitigation strategies to protect all our critical sectors, our industry partners and therefore all EU citizens."

Prominent threat actors remain the same

State-sponsored actors, cybercriminals, hacker-for-hire actors and hacktivists remained the prominent threat actors during the reporting period of July 2021 to July 2022.

Based on the analysis of the proximity of cyber threats in relation to the European Union (EU), the number of incidents remained high over the reporting period in the NEAR category. This category includes affected networks and systems controlled and assured within EU borders, as well as the affected population within the borders of the EU.

Threat analysis across sectors

Added last year, the threat distribution across sectors is an important aspect of the report as it gives context to the threats identified. This analysis shows that no sector is spared. It also reveals that nearly 50% of threats target the following categories: public administration and governments (24%), digital service providers (13%) and the general public (12%), while the other half is shared by all other sectors of the economy.

Top threats still standing their ground

ENISA sorted threats into eight groups. Frequency and impact determine how prominent each of these threats still is.

Ransomware:
- 60% of affected organisations may have paid ransom demands
Malware:
- 66 disclosures of zero-day vulnerabilities observed in 2021
Social engineering:
- Phishing remains a popular technique but we see new forms of phishing arising such as spear-phishing, whaling, smishing and vishing
Threats against data:
- Increasing proportionally to the total amount of data produced
Threats against availability:
- Largest Distributed Denial of Service (DDoS) attack ever was launched in Europe in July 2022;
- Internet: destruction of infrastructure, outages and rerouting of internet traffic.
Disinformation – misinformation:
- Escalating AI-enabled disinformation, deepfakes and disinformation-as-a-service
Supply chain targeting:
- Third-party incidents account for 17% of the intrusions in 2021 compared to less than 1% in 2020

Contextual trends emerging

- Zero-day exploits are the new resource used by cunning threat actors to achieve their goals;
- A new wave of hacktivism has been observed since the Russia-Ukraine war.
- DDoS attacks are getting larger and more complex moving towards mobile networks and Internet of Things (IoT) which are now being used in cyberwarfare.
- AI-enabled disinformation and deepfakes. The proliferation of bots modelling personas can easily disrupt the “notice-and-comment” rulemaking process, as well as community interaction, by flooding government agencies with fake content and comments.

Shifting motivation and digital impact are driving new trends

An impact assessment of threats reveals five types of impact: reputational, digital, economic, physical and social damage. For most incidents, however, the impact remains unknown because victims fail to disclose information or the information remains incomplete.

Prime threats were analysed in terms of motivation. The study reveals that ransomware is purely motivated by financial gains. However, motivation for state sponsored groups can be drawn from geopolitics with threats such as espionage and disruptions. Ideology may also be the motor behind cyber operations by hacktivists.

Chemical security experts call for multisector cooperation against terrorism

Chemical weapons and explosives used in acts of terrorism continue to affect civilian populations and are well known for their destructive and long-term harm.

Last year over 1,000 improvised explosive device (IED) attacks were conducted by non-state actors, injuring over 7,150 people in more than 40 countries. Many attacks involve chemicals that criminals acquired through weak points in the supply chain – from manufacturing to storage and retail – and made into weapons.

To counter this threat, some 220 chemical security practitioners from more than 70 countries met at INTERPOL’s 3rd Global Congress on Chemical Security and Emerging Threats to find ways of reducing vulnerabilities by enhancing multisector cooperation and collaboration.

With a focus on acquisition, transportation, physical and cyber security of chemical materials, the meeting highlighted a range of security issues, such as detecting cross-border movements of regulated material and implementing regulatory frameworks.

Terrorists’ misuse of e-commerce and new technologies

The Global Congress also explored ways to counter emerging threats including terrorists’ misuse of e-commerce and new technologies to acquire toxic and precursor chemicals.

With the substantial growth of Internet access in recent years, we have seen a corresponding increase in digital content produced and shared through platforms such as instant messaging, social networking, blogs and online portals. The misuse of technologies can be seen as a result of this rapid growth in content, and with it has come a rise in suspicious activities.

Law enforcement agencies provided examples of investigative techniques that could be used to identify and prosecute the illicit purchase or sale of chemicals on the Dark Net. These lessons provided delegates with solutions to address the use of sophisticated technologies for nefarious purposes.

"The concerted effort of global law enforcement, along with our partners, is key to combatting the use of explosive precursor chemicals and chemical weapons,” Mr Hinds added.

Dual-use and precursor chemicals have a wide legitimate function in the production of consumer goods such as pharmaceuticals, cleaning supplies and fertilizers. This raises significant challenges to prevent and monitor, and remains one of the inherent threats to chemical security worldwide.

INTERPOL awareness video - ‘The Watchmaker’

In this context, an INTERPOL-produced awareness video was premiered at the meeting to engage a broad spectrum of stakeholders in understanding the importance of individuals and companies securing dangerous toxic chemicals and related equipment.

Entitled ‘The Watchmaker’, the video highlights the need for multisector cooperation to combat these threats and will be used in a series of INTERPOL capacity building workshops and other activities related to counter-terrorism and prevention.

“Multisector collaboration is essential for us to tackle the threats we face from criminals who gain access to dangerous chemicals with malevolent intentions. Morocco is committed to strengthening engagement on these issues as part of our proactive approach to combating terrorism,” said Mr. Mohammed Dkhissi, Head of National Central Bureau, Rabat.

Other measures proposed by the Global Congress Network include:

- Advocating chemical security recommendations such as increased retail reporting on suspicious activity;
- Expanding the INTERPOL-hosted Global Knowledge Hub, which allows members to engage in interactive discussions and access good practice guidance;
- Strengthening the Global Congress Network through greater diversity of expertise and activities across regions and sectors;
- Promoting decision-making tools such as a customer database, which can flag areas of security concern.

Since its inception in 2018, the Global Congress has been jointly led by INTERPOL, the US Cybersecurity and Infrastructure Security Agency (CISA), the US Defense Threat Reduction Agency (DTRA) and the US Federal Bureau of Investigation (FBI), and implemented in cooperation with the G7 Global Partnership Against the Spread of Weapons and Materials of Mass Destruction.

ITU Emergency Telecom Roster helps restore connectivity after hurricane hits Nicaragua

A powerful tropical hurricane ripped across Nicaragua earlier this month, with torrential rains triggering life-threatening flash floods and mudslides across the Central American country.

The Category 1 storm forced 13,000 people to evacuate to shelters, according to some reports – many with only the clothes on their backs.

“The river rose one metre in ten minutes,” according to eyewitness José Domingo Enríquez of the interior town El Rama, one of the worst-affected. “It was clear the flood was coming fast, and we had to find a way to evacuate.”

Critical electricity and telecommunications services were cut shortly after the storm made landfall, leaving a million people in the dark and worried about their loved ones’ safety.

Emergency Telecom Roster deploys

To help close connectivity gaps and bolster disaster response efforts in some of the country’s hardest-hit areas, two members of ITU’s Emergency Telecommunications Roster (ETR), a group of staff volunteers from across the organization, were deployed to Nicaragua.

Their mission – the first since the roster was created – was two-fold: deliver 10 Iridium satellite phones and 10 Inmarsat Broadband Global Area Network (BGAN) terminals to help restore connectivity as soon as possible, and to provide training for local teams to use the equipment.

ITU typically deploys equipment upon request from an ITU Member State following a natural hazard, and the team aims to respond within 24 to 48 hours.

In Nicaragua’s case, the request came via the telecom regulator, TELCOR, and SINAPRED, the country’s national disaster management agency.

Once on the ground, roster members Mario Castro Grande and Hani Alser met with government officials to deliver the equipment, train TELCOR and SINAPRED responders, and assess the damage.

According to Alser, local officials were extremely welcoming and highly appreciative of both the equipment and the expertise provided.

“Having at least one technical person and another that can communicate in the local language and knows the customs is key to a successful ETR mission,” added Castro Grande.

Beyond bringing equipment

Delivering critical emergency telecom equipment is only part of ITU’s work in this domain.

The UN agency for information and communication technologies (ICTs) also supports the development and implementation of National Emergency Telecommunication Plans (NETP) among other regulatory and legal disaster preparedness frameworks.

“Nicaragua had a draft NETP back in 2014, but apparently it was shelved,” explained Castro Grande. “Our mission also served as a timely reminder that they should look at it again, with the objective of finalizing it.”

The ITU team also urged national authorities to implement an early warning system. This was another aspect of the mission, said Castro Grande. “We offered some information on appropriate available systems for developing countries, such as cell broadcasting, and informed them on legislative models they could look at.”

The ability of cell broadcast technology to push messages without being affected by traffic load makes it useful during emergencies when data traffic spikes, and regular SMS and voice calls tend to congest mobile networks.

“About 95 per cent of the global population is covered by a broadband network, with 5.7 billion mobile subscriptions, meaning at least 70 per cent of the world is connected,” Castro Grande pointed out. “Cell broadcasting technology should be used to its fullest potential to warn people ahead of disaster.”

Earlier this year, Secretary-General Antonio Guterres announced the United Nations would “spearhead new action to ensure every person on Earth is protected by early warning systems within five years.” ITU is supporting this initiative, which is led by the World Meteorological Organization (WMO).

NCSC CEO delivers international speech on securing the Internet of Things and smart cities

The head of the UK’s National Cyber Security Centre, Lindy Cameron, has emphasised the importance of connected technologies being made secure by design in a speech at Singapore International Cyber Week.

Lindy Cameron said the growth of the Internet of Things (IoT) has brought benefits for consumers, enterprises and at a city level in connected places, but she highlighted that the associated risks must be managed now to stay ahead of cyber threats.

She outlined how the UK has developed a strong framework for managing the future security of the Internet of Things, including through new legislation, the adoption of international cyber security standards and developing ‘secure by design’ principles to help influence IoT at the design phase.

She called for swift, decisive and ongoing action to ensure connected devices are designed, built, deployed and managed with security as a first-class concern, to thwart malicious actors, improve national resilience and reap the benefits of these emerging technologies.

ESF Partners, NSA, and CISA Release Software Supply Chain Guidance for Suppliers

The National Security Agency (NSA), the Cybersecurity and Infrastructure Security Agency (CISA), and the Office of the Director of National Intelligence (ODNI) released Securing the Software Supply Chain: Recommended Practices Guide for Suppliers. The guide was released through the Enduring Security Framework (ESF) — a public-private cross-sector working group led by NSA and CISA that provides cybersecurity guidance to address high-priority threats to the nation’s critical infrastructure.

In an effort to provide guidance to suppliers, ESF examined the events that led up to the SolarWinds attack, which made clear that investment was needed to create a set of industry and government evaluated best practices focusing on the needs of the software supplier.

Cyberattacks target an enterprise’s use of cyberspace to disrupt, disable, destroy, or maliciously control a computing environment or infrastructure, destroy the integrity of data, or steal controlled information. A malicious actor can take advantage of a single vulnerability in the software supply chain and have a severe negative impact on computing environments or infrastructure.

Prevention is often seen as the responsibility of the software developer, as they are required to securely develop and deliver code, verify third party components, and harden the build environment. But the supplier also holds a critical responsibility in ensuring the security and integrity of our software. After all, the software vendor is responsible for liaising between the customer and software developer. It is through this relationship that additional security features can be applied via contractual agreements, software releases and updates, notifications and mitigations of vulnerabilities.

Software suppliers will find guidance from NSA and our partners on preparing organizations by defining software security checks, protecting software, producing well-secured software, and responding to vulnerabilities on a continuous basis. Until all stakeholders seek to mitigate concerns specific to their area of responsibility, the software supply chain cycle will be vulnerable and at risk for potential compromise.

NSA Releases Guidance on How to Protect Against Software Memory Safety Issues

The National Security Agency (NSA) has published guidance to help software developers and operators prevent and mitigate software memory safety issues, which account for a large portion of exploitable vulnerabilities.

The “Software Memory Safety” Cybersecurity Information Sheet highlights how malicious cyber actors can exploit poor memory management issues to access sensitive information, promulgate unauthorized code execution, and cause other negative impacts.

“Memory management issues have been exploited for decades and are still entirely too common today,” said Neal Ziring, Cybersecurity Technical Director. “We have to consistently use memory safe languages and other protections when developing software to eliminate these weaknesses from malicious cyber actors.”

Microsoft and Google have each stated that software memory safety issues are behind around 70 percent of their vulnerabilities. Poor memory management can lead to technical issues as well, such as incorrect program results, degradation of the program’s performance over time, and program crashes.

NSA recommends that organizations use memory safe languages when possible and bolster protection through code-hardening defenses such as compiler options, tool options, and operating system configurations.
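
The issues described in the guidance are easiest to see in a small example. The sketch below is illustrative only: the function names, buffer sizes and input string are assumptions, not material from the Cybersecurity Information Sheet. It contrasts an unbounded copy, the classic pattern behind many memory corruption exploits, with a bounds-checked alternative, and notes the kind of GCC hardening flags that fall under the "compiler options" category of defenses NSA mentions.

```c
/* Illustrative sketch only: an out-of-bounds write of the kind the NSA
 * guidance warns about, next to a bounds-checked alternative.
 * Compile-time hardening flags such as
 *   gcc -O2 -Wall -fstack-protector-strong -D_FORTIFY_SOURCE=2 example.c
 * are examples of the "compiler options" class of defenses. */
#include <stdio.h>
#include <string.h>

/* Vulnerable pattern: copies attacker-controlled input without checking
 * its length, so input longer than 15 bytes overruns the stack buffer. */
void copy_unsafe(const char *input) {
    char buf[16];
    strcpy(buf, input);          /* no bounds check: potential overflow */
    printf("%s\n", buf);
}

/* Safer pattern: the copy is limited to the buffer size and the result
 * is always NUL-terminated. */
void copy_bounded(const char *input) {
    char buf[16];
    snprintf(buf, sizeof buf, "%s", input);  /* truncates, never overflows */
    printf("%s\n", buf);
}

int main(void) {
    const char *untrusted = "this string is much longer than sixteen bytes";
    copy_bounded(untrusted);     /* prints a truncated, but safe, result */
    /* copy_unsafe(untrusted);      would corrupt the stack */
    return 0;
}
```

Memory safe languages remove this class of bug by construction; where C or C++ must be used, bounds-checked APIs plus compiler and operating system hardening reduce exploitability.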

DOD Cybersecurity: Enhanced Attention Needed to Ensure Cyber Incidents Are Appropriately Reported and Shared

DOD and DIB information technology systems continue to be susceptible to cyber incidents as cybersecurity threats have evolved and become more sophisticated. Federal laws and DOD guidance emphasize the importance of properly reporting and sharing cyber incident information, as both are vital to identifying system weaknesses and improving the security of the systems.

House Report 116-442 included a provision for GAO to review DOD's cyber incident management. This report examines the extent to which DOD established and implemented a process to (1) report and notify leadership of cyber incidents, (2) report and share information about cyber incidents affecting the DIB, and (3) notify affected individuals of a breach of their personally identifiable information (PII).

To conduct this work, GAO reviewed relevant guidance, analyzed samples of cyber incident artifacts and cyber incident reports submitted by the DIB and privacy data breaches reported by DOD, and surveyed 24 DOD cyber security service providers. In addition, GAO interviewed officials from DOD and cyber security service providers and convened two discussion groups with DIB companies.

Cyber attacks threaten national security—but hackers continue to target DOD as well as private companies and others involved in the nation's military operations.

DOD has taken steps to combat these attacks and has reduced the number of cyber incidents in recent years. But we found that DOD:
- Hasn't fully implemented its processes for managing cyber incidents
- Doesn't have complete data on cyber incidents that staff report
- Doesn't document whether it notifies individuals whose personal data is compromised in a cyber incident

What GAO Found

The Department of Defense (DOD) and our nation's defense industrial base (DIB)—which includes entities outside the federal government that provide goods or services critical to meeting U.S. military requirements—are dependent on information systems to carry out their operations. These systems continue to be the target of cyber attacks, as DOD has experienced over 12,000 cyber incidents since 2015. To combat these incidents, DOD has established two processes for managing cyber incidents—one for all incidents and one for critical incidents. However, DOD has not fully implemented either of these processes.

Despite the reduction in the number of incidents due to DOD efforts, weaknesses in reporting these incidents remain. For example, DOD's system for reporting all incidents often contained incomplete information and DOD could not always demonstrate that they had notified appropriate leadership of relevant critical incidents. The weaknesses in the implementation of the two processes are due to DOD not assigning an organization responsible for ensuring proper incident reporting and compliance with guidance, among other reasons. Until DOD assigns such responsibility, DOD does not have assurance that its leadership has an accurate picture of the department's cybersecurity posture.

In addition, DOD has not yet decided whether DIB cyber incidents detected by cybersecurity service providers should be shared with all relevant stakeholders, according to officials. DOD guidance states that to protect the interests of national security, cyber incidents must be coordinated among and across DOD organizations and outside sources, such as DIB partners. Until DOD examines whether this information should be shared with all relevant parties, there could be lost opportunities to identify system threats and improve system weaknesses.

DOD has established a process for determining whether to notify individuals of a breach of their personally identifiable information (PII). This process includes conducting a risk assessment that considers three factors—the nature and sensitivity of the PII, likelihood of access to and use of the PII, and the type of the breach. However, DOD has not consistently documented the notifications of affected individuals, because officials said notifications are often made verbally or by email and no record is retained. Without documenting the notification, DOD cannot verify that people were informed about the breach.

GAO is making six recommendations, including that DOD assign responsibility for ensuring proper incident reporting, improve the sharing of DIB-related cyber incident information, and document when affected individuals are notified of a PII breach. DOD concurred with the recommendations.

CISA Developed Cross-Sector Recommendations to Help Organizations Prioritize Cybersecurity Investments

The Department of Homeland Security released the Cybersecurity Performance Goals (CPGs), voluntary practices that outline the highest-priority baseline measures businesses and critical infrastructure owners of all sizes can take to protect themselves against cyber threats. The CPGs were developed by DHS, through the Cybersecurity and Infrastructure Security Agency (CISA), at the direction of the White House. Over the past year, CISA worked with hundreds of public and private sector partners and analyzed years of data to identify the key challenges that leave our nation at unacceptable risk. By clearly outlining measurable goals based on easily understandable criteria such as cost, complexity, and impact, the CPGs were designed to be applicable to organizations of all sizes. This effort is part of the Biden-Harris Administration’s ongoing work to ensure the security of the critical infrastructure and reduce our escalating national cyber risk.

“Organizations across the country increasingly understand that cybersecurity risk is not only a fundamental business challenge but also presents a threat to our national security and economic prosperity,” said Secretary of Homeland Security Alejandro N. Mayorkas. “The new Cybersecurity Performance Goals will help organizations decide how to leverage their cybersecurity investments with confidence that the measures they take will make a material impact on protecting their business and safeguarding our country.”

CISA developed the CPGs in close partnership with the National Institute for Standards and Technology (NIST). The resulting CPGs are intended to be implemented in concert with the NIST Cybersecurity Framework. Every organization should use the NIST Cybersecurity Framework to develop a rigorous, comprehensive cybersecurity program. The CPGs prescribe an abridged subset of actions – a kind of “QuickStart guide” – for the NIST CSF to help organizations prioritize their security investments.

“To reduce risk to the infrastructure and supply chains that Americans rely on every day, we must have a set of baseline cybersecurity goals that are consistent across all critical infrastructure sectors,” said CISA Director Jen Easterly. “CISA has created such a set of cybersecurity performance goals to address medium-to-high impact cybersecurity risks to our critical infrastructure. For months, we’ve been gathering input from our partners across the public and private sectors to put together a set of concrete actions that critical infrastructure owners can take to drive down risk to their systems, networks and data. We look forward to seeing these goals implemented over the coming years and to receiving additional feedback on how we can improve future versions to most effectively reduce cybersecurity risk to our country.”

“The Biden-Harris Administration has relentlessly focused on securing our Nation’s critical infrastructure since day one,” said Deputy National Security Advisor for Cyber and Emerging Technologies Anne Neuberger. “CISA has demonstrated tremendous leadership in strengthening our critical infrastructure’s cyber resilience over the last year. The Cyber Performance Goals build on these efforts, by setting a higher cybersecurity standard for sectors to meet.”

“Given the myriad serious cybersecurity risks our nation faces, NIST looks forward to continuing to work with industry and government organizations to help them achieve these performance goals,” said Under Secretary of Commerce for Standards and Technology and NIST Director Laurie E. Locascio. “Our priority remains bringing together the right stakeholders to further develop standards, guidelines and practices to help manage and reduce cybersecurity risk.”

In the months ahead, CISA will actively seek feedback on the CPGs from partners across the critical infrastructure community and has established a Discussions webpage to receive this input. CISA will also begin working directly with individual critical infrastructure sectors as it builds out sector-specific CPGs in the coming months.

To access these new CPGs, visit CISA.gov/cpgs.

Designing a flood early warning system (FEWS) for West Africa

The great West African drought that started in the 1970s was undoubtedly a turning point in the region’s environmental discourse. It is well recognised as one of the most significant climate-driven disasters in recent history. The event was the onset of an era of rainfall uncertainty and variability, driving recurring floods and droughts across the region.

West Africa, an agglomeration of 16 countries, spans from the dense humid forests of the south to northern Saharan desertscapes (Figure 1). The region’s rainfall cycle is controlled by the Intertropical Convergence Zone. Changes in rainfall patterns have been attributed to climate change as well as land-use changes. ‘The Sahelian paradox’ is the increase in river flows, despite decreasing rainfall, seen in many river basins. The complexity of hydrological and regional wind systems makes it difficult to accurately predict long-term rainfall trends and their consequences.

The Economic Community of West African States (ECOWAS) has invested significantly in drought management in the past. However, these nations have been unprepared for the sudden rise in floods over the last decade. In 2020, a year of particular flood severity, 198,000 homes were destroyed or damaged, 96,000 people were displaced and 2.2 million people were affected across West and Central Africa. If no action is taken, an estimated 32 million people will be forced to migrate internally by 2050.

In response to increasingly frequent disasters, many early warning systems for floods have been launched in West Africa. Flood early warning systems are typically designed around four broad considerations: knowledge of risks, monitoring and warning, response capacity and communication. These systems monitor real-time atmospheric conditions to predict weather conditions, and warn people and governments on how and when to act to minimise disaster impacts. Such tools are especially effective when emergency action plans are laid out and agreed upon by different stakeholders.

Existing flood early warning systems (FEWS) have not been able to meet stakeholders’ needs regarding timeliness of information, geographical coverage, uninterrupted communication, accuracy and open ownership. To increase the adoption, effectiveness and usefulness of warning systems, stakeholder engagement in the design phase is crucial. Generally, empirical evidence on the effectiveness of participatory processes in sustainability science and disaster planning has been weak.

The EU Horizon 2020 FANFAR (Reinforced cooperation to provide operational flood forecasting and alerts in West Africa) project aimed to change this. Within FANFAR’s broader aim of developing a FEWS, our research focused on designing such a system in collaboration with 50-60 stakeholders from 17 countries. Stakeholders included emergency managers, representatives from regional and national hydrological services and river basin institutions. Two key participating organisations were the West African consortium members AGRHYMET Regional Center and the Nigeria Hydrological Services Agency.

We used a research approach called Multi-Criteria Decision Analysis (MCDA). MCDA helps find possible solutions in situations where multiple, often conflicting, criteria need to be considered when assessing options. The first research question investigated what a good FEWS looks like in the West African context.

The second and broader objective explored the relevance of using MCDA as a participatory and transdisciplinary approach for a large project potentially benefiting millions of people across several countries. The participatory process was designed around three key project phases and implemented through a series of stakeholder workshops (Figure 2).

During the first phase of co-designing, stakeholders developed a joint understanding of the problems of existing flood warning systems. They came to a consensus on objectives that were needed to prioritise functions in the warning system. The second phase focused on knowledge co-production, where scientific and societal perspectives and practitioners’ expertise from different sectors were integrated.

The aim of MCDA was to design a FEWS in a way that best meets the objectives and preferences of all stakeholders. During the final stage of co-dissemination and evaluation, the aim was to translate the knowledge produced into solution-oriented and scalable products.

From the co-designing phase, ten objectives emerged as fundamentally important to stakeholders, clustered into four groups. These were clarity and accuracy of information, reliable and timely information access, affordability of production, development and operation, and long-term financial and operational sustainability of the early warning system (Figure 4). The ten objectives received different weights depending on their importance to stakeholders, and these weights fed into the comparison of candidate systems, as in the sketch below.
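
In MCDA terms, the weighting step translates into a simple aggregation: each candidate system is scored against every objective, and the weighted sum of those scores ranks the alternatives. The sketch below is a minimal illustration of that weighted-additive step; the objective labels, weights and performance values are made up for illustration and are not the FANFAR project's actual elicited values.

```c
/* Minimal sketch of a weighted-additive MCDA aggregation, using
 * placeholder objectives, weights and scores; the real FANFAR values
 * came from the stakeholder workshops and differ from these. */
#include <stdio.h>

#define N_OBJECTIVES 4
#define N_VARIANTS   3

int main(void) {
    /* Placeholder objective clusters mirroring the four groups. */
    const char *objective[N_OBJECTIVES] = {
        "clarity/accuracy", "reliable/timely access",
        "affordability", "long-term sustainability"
    };
    /* Stakeholder-elicited weights, normalised to sum to 1 (assumed). */
    double weight[N_OBJECTIVES] = { 0.35, 0.30, 0.15, 0.20 };

    /* Performance of each FEWS variant on each objective, scaled 0..1
     * (assumed values for illustration). */
    double value[N_VARIANTS][N_OBJECTIVES] = {
        { 0.80, 0.60, 0.90, 0.50 },   /* variant A: simple, cheap        */
        { 0.95, 0.85, 0.40, 0.55 },   /* variant B: feature-rich, costly */
        { 0.70, 0.75, 0.80, 0.85 },   /* variant C: robust, sustainable  */
    };

    printf("Objectives and weights:\n");
    for (int o = 0; o < N_OBJECTIVES; o++)
        printf("  %-25s %.2f\n", objective[o], weight[o]);

    for (int v = 0; v < N_VARIANTS; v++) {
        double total = 0.0;
        for (int o = 0; o < N_OBJECTIVES; o++)
            total += weight[o] * value[v][o];   /* weighted-additive score */
        printf("variant %c overall value: %.3f\n", 'A' + v, total);
    }
    return 0;
}
```

The same structure also supports sensitivity analysis: re-running the aggregation with different weight sets shows whether a variant stays well-performing when stakeholder priorities shift.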

Of the eleven versions of FEWS that were created by stakeholders and the FANFAR consortium, three were assessed to be well-performing and robust. One version, for example, could function under relatively constrained conditions such as poor internet connectivity, unstable power supply and a limited number of skilled personnel. This suggests the FEWS should be simple and robust rather than incorporating many complex features.

MCDA was particularly helpful in focusing on stakeholders’ values. It helped in navigating and reconciling conflicting stakeholder preferences. MCDA was also helpful for knowledge co-production by providing clarity on stakeholder preferences, incorporating diverse perspectives from different disciplines and assessing different FEWS versions despite uncertain data.

The uptake of the FANFAR FEWS in West Africa will depend on a multitude of other factors. These include operational data collection, strategies to increase local capacity, securing long-term funding for operations, maintenance, and technical development. Local and regional governance structures also play an important role. However, because it was built on a common understanding of contextual challenges among diverse stakeholders, we believe the resultant FEWS will be useful to stakeholders from different regions, sectors and professional backgrounds.

 

[Source: Lienert, J. et al. (2022) 'How to co-design a flood early warning system (FEWS) for West Africa' Water Science Policy, doi: https://dx.doi.org/10.53014/CBJJ5560]

Study uses AI to predict fragility of power grid networks - double trouble when 2 disasters strike electrical transmission infrastructure

One disaster can knock out electric service to millions. A new study suggests that back-to-back disasters could cause catastrophic damage, but the research also identifies new ways to monitor and maintain power grids.

Researchers at The Ohio State University have developed a machine learning model for predicting how susceptible overhead transmission lines are to damage when natural hazards like hurricanes or earthquakes happen in quick succession.

An essential facet of modern infrastructure, steel transmission towers help send electricity across long distances by keeping overhead power lines far off the ground. After severe damage, failures in these systems can disrupt networks across affected communities, taking anywhere from a few weeks to months to fix.

The study, published in the journal Earthquake Engineering and Structural Dynamics, uses simulations to analyze what effect prior damage has on the performance of these towers once a second hazard strikes. Their findings suggest that previous damage has a considerable impact on the fragility and reliability of these networks if it can’t be repaired before the second hazard hits, said Abdollah Shafieezadeh, co-author of the study and an associate professor of civil, environmental and geodetic engineering.

“Our work aims to answer if it’s possible to design and manage systems in a way that not only minimizes their initial damage but enables them to recover faster,” said Shafieezadeh.

The machine learning model not only found that a combination of an earthquake and hurricane could be particularly devastating to the electrical grid, but that the order of the disasters may make a difference. The researchers found that the probability of a tower collapse is much higher in the event of an earthquake followed by a hurricane than the probability of failure when the hurricane comes first and is followed by an earthquake.

That means while communities would certainly suffer some setbacks in the event that a hurricane precedes an earthquake, a situation wherein an earthquake precedes a hurricane could devastate a region’s power grid. Such conclusions are why Shafieezadeh’s research has large implications for disaster recovery efforts.

“When large-scale power grid systems are spread over large geographic areas, it’s not possible to inspect every inch of them very carefully,” said Shafieezadeh. “Predictive models can help engineers or organizations see which towers have the greatest probability of failure and quickly move to improve those issues in the field.”

After training the model for numerous scenarios, the team created “fragility models” that tested how the structures would hold up under different characteristics and intensities of natural threats. With the help of these simulations, researchers concluded that tower failures due to a single hazardous event were vastly different from the pattern of failures caused by multi-hazard events. The study noted that many of these failures occurred in the leg elements of the structure, a segment of the tower that helps bolt the structure to the ground and prevents collapse.
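
The study's machine learning model itself is not reproduced here, but fragility models of this kind are conventionally summarised as curves giving the probability of collapse as a function of hazard intensity. The sketch below is a minimal illustration under assumed parameter values, not results from the Ohio State study: it evaluates a standard lognormal fragility curve and represents a tower already weakened by a prior hazard simply as one with a reduced median capacity.

```c
/* Illustrative lognormal fragility curve: P(collapse | intensity) rises
 * with hazard intensity. All parameter values are assumptions for
 * illustration, not results from the study; prior damage is represented
 * here by a lower median capacity. Link with -lm. */
#include <stdio.h>
#include <math.h>

/* Standard normal CDF via the error function. */
static double std_normal_cdf(double x) {
    return 0.5 * (1.0 + erf(x / sqrt(2.0)));
}

/* Lognormal fragility: median capacity theta, dispersion beta. */
static double p_collapse(double intensity, double theta, double beta) {
    return std_normal_cdf(log(intensity / theta) / beta);
}

int main(void) {
    /* Assumed medians (e.g. wind speed in m/s at which half the towers
     * collapse): intact towers vs. towers damaged by an earlier event. */
    const double theta_intact = 60.0, theta_damaged = 45.0, beta = 0.4;

    printf("intensity  P(collapse|intact)  P(collapse|pre-damaged)\n");
    for (double im = 20.0; im <= 80.0; im += 10.0)
        printf("%8.1f  %18.3f  %23.3f\n",
               im,
               p_collapse(im, theta_intact, beta),
               p_collapse(im, theta_damaged, beta));
    return 0;
}
```

Shifting the curve for pre-damaged towers captures, in simplified form, why the order of hazards matters: a structure entering the second event with reduced capacity fails at much lower intensities.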

Overall, Shafieezadeh said his research shows a need to focus on re-evaluating the entire design philosophy of these networks. Yet to accomplish such a task, much more support from utilities and government agencies is needed.

“Our work would be greatly beneficial in creating new infrastructure regulations in the field,” Shafieezadeh said. “This along with our other research shows that we can substantially improve the entire system’s performance with the same amount of resources that we spend today, just by optimizing their allocation.”

This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry & Energy of the Republic of Korea (MOTIE).
