Natural-hazard-triggered industrial accidents: Are they Black Swans?

A recently published JRC study examines whether technological accidents caused by natural hazards (Natech accidents) are real “Black Swans” (unpredictable and hence unpreventable events), identifies their possible causes and discusses effective strategies for managing extreme risks.
The study concludes that the Black Swan metaphor is overused for technological accidents in general and Natech accidents in particular, whose recurrence raises questions about the effectiveness of corporate oversight and the application of state-of-the-art knowledge in managing risks.
What are Natech accidents?
Natech accidents occur when the natural and technological worlds collide, wherever hazardous industry is located in areas prone to natural hazards. Past Natech accidents have often had significant impacts on public health, the natural and built environment, and the local, national or even global economy.
Major technological accidents considered unpreventable are occasionally called Black Swan events. Three features characterize a Black Swan:
- it must be an outlier with respect to normal expectations, making it unpredictable;
- it has to have a major impact;
- it can be explained in hindsight, making it appear predictable.
Inadequate risk management and organisational risk blindness
A closer look at past Natech accidents shows that the vast majority of these events, if not all, could have been foreseen and prevented using information and knowledge available before the disaster. They therefore cannot be considered inevitable, let alone Black Swans.
The JRC study provides a detailed analysis of the reasons why Natech risks are often underestimated:
- Risk management traditions and the Act-of-God mindset - The focus for managing natural risks has traditionally been on the response side and hence on disaster management, rather than on prevention and risk management, whereas the technological-risk community has always focused on risk- rather than disaster management. Natech risk is sandwiched between these two worlds, and neither community feels very much at ease with taking ownership of the risk;
- Complexity of Natech risk scenarios - Natech risk analysis would need extensions to traditional risk-analysis methodologies in order to cover the multi-hazard nature of the risk and the multitude of possible simultaneous scenarios;
- Risk governance and risk management problems due to the multi-stakeholder and multi-hazard nature of Natech risks, and the multitude of possibly conflicting issues that are usually on a manager’s radar screen;
- Socio-economic context, including group interests and power, economic pressure, and public or media indifference; and
- Human fallacies and cognitive biases that can corrupt the experiences we draw on for estimating risks.
Managing extreme risks
Building organisational resilience is key to managing risks effectively, in particular in high-risk industry. The JRC study discusses possible strategies to reduce extreme risks, prepare better for their consequences, and bring potential Black Swans into view:
- Risk-based versus precaution-based strategies
- Disaster incubation theory and warning signals
- Mindfulness
- Resilience engineering
- Scenario planning
- Red teaming
While the JRC study is centered on Natech risks, its conclusions are broadly applicable to managing other types of extreme or low-probability risks as well.

First 3 months of 2021 brought billion-dollar disaster, warm start to spring for U.S.

Since January, conditions across the U.S. have been running warmer and wetter than normal. The nation also recorded its first billion-dollar weather and climate disaster of 2021 — the deadly deep freeze that enveloped much of the central U.S. in February — and two tornado outbreaks in late March.
The month of March turned out a bit warmer and drier than average, according to NOAA’s National Centers for Environmental Information.
Here are more highlights from NOAA’s latest monthly U.S. climate report:
Climate by the numbers
Year to date | Billion-Dollar Disasters
The average U.S. temperature for the year to date (January through March) was 36.9 degrees F (1.8 degrees above average), which ranked in the warmest third of the record.
The contiguous U.S. also kicked off the year a little on the wet side, with a year-to-date average rainfall of 6.55 inches — 0.41 of an inch above average.
Most notably, the U.S. saw its first billion-dollar disaster of 2021, one with a devastating death toll: At least 125 people died as a direct or indirect result of a mid-February blanket of arctic weather that dropped temperatures to historic lows across the central United States. Texas accounted for the majority of the property and infrastructure losses incurred across more than a dozen states. The preliminary total damage estimate for this extreme event — in excess of $10 billion — makes it the costliest winter weather disaster on record for the U.S., surpassing the so-called “Storm of the Century” that struck the Gulf Coast all the way up to Maine in 1993.
March 2021
The average monthly temperature across the contiguous U.S. was 45.5 degrees F (4.0 degrees above the 20th-century average) and ranked in the warmest third of the climate record.
Above-average temperatures were observed across much of the country, from the Northwest to the Northeast, as well as from the Great Lakes to the Gulf of Mexico. North Dakota, for example, had its fourth-warmest March on record.
The average precipitation in the contiguous U.S. last month was 2.45 inches (0.06 of an inch below average), ranking in the middle third of the climate record.
Below-average precipitation fell across the Northwest, northern Plains, and Northeast, as well as portions of the Southeast, Deep South and West. Both Montana and North Dakota saw their second-driest March in 127 years.
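The anomalies and “third of the record” rankings cited above follow a simple recipe: subtract a long-term baseline from the observed value, then place the observation within the sorted historical record. A minimal sketch of that arithmetic is below; the baseline and record values are illustrative stand-ins, not NOAA’s actual 127-year dataset.

```python
# Sketch of how a climate ranking like "warmest third of the record" is
# derived. All numbers below are illustrative, not NOAA's actual data.

BASELINE_20TH_CENTURY = 41.5  # hypothetical 20th-century March mean, deg F

def anomaly(value_f: float, baseline_f: float) -> float:
    """Departure of an observation from the long-term baseline."""
    return value_f - baseline_f

def record_third(value: float, record: list[float]) -> str:
    """Place a value in the coldest/middle/warmest third of a record."""
    ranked = sorted(record)
    n = len(ranked)
    position = sum(1 for v in ranked if v <= value)  # rank within record
    if position <= n / 3:
        return "coldest third"
    if position <= 2 * n / 3:
        return "middle third"
    return "warmest third"

# Hypothetical record of March mean temperatures (deg F)
march_record = [39.8, 40.2, 41.0, 41.9, 42.5, 43.1, 44.0, 45.5, 46.2]
print(anomaly(45.5, BASELINE_20TH_CENTURY))  # 4.0 degrees above baseline
print(record_third(45.5, march_record))      # warmest third
```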
More notable climate events in March
Deadly tornado outbreaks: In March, two rounds of deadly severe weather and tornadoes raked the U.S. South. More than 100 tornadoes were reported during the two outbreaks (March 17-18 and March 25-27). One particularly violent EF3 tornado struck Calhoun County, Alabama, on March 25 and caused five deaths.
A chilly, wintry month for Alaska: The state shivered through its coldest March in four years. The average March temperature for Alaska was 7.2 degrees F, 3.6 degrees below the long-term average. The capital city of Juneau reported its snowiest March since 2007.
Drought improved slightly: By the end of March, the U.S. Drought Monitor reported that nearly 44% of the contiguous U.S. was in drought, down from 46.6% at the beginning of the month. Drought improved across parts of the central Rockies, central Plains, Puerto Rico and Hawaii.

Fujitsu Leverages World's Fastest Supercomputer and AI to Predict Tsunami Flooding

A new AI model that harnesses the power of the world's fastest supercomputer, Fugaku, can rapidly predict tsunami flooding in coastal areas before the tsunami reaches land.
The development of the new technology was announced as part of a joint project between the International Research Institute of Disaster Science (IRIDeS) at Tohoku University, the Earthquake Research Institute at the University of Tokyo, and Fujitsu Laboratories.
The 2011 Great East Japan Earthquake and subsequent tsunami highlighted the shortcomings in disaster mitigation and the need to utilize information for efficient and safe evacuations.
While tsunami observation networks in Japanese coastal waters have been strengthened since then, using the data produced from those networks to predict a tsunami's path once it hits land has gained greater urgency. This is especially true since a major earthquake is likely to hit Japan's densely populated east coast sometime in the near future.
Tsunami prediction technologies will allow authorities to obtain accurate information quickly and aid them in effectively directing evacuation orders.
Fujitsu, Tohoku University, and the University of Tokyo leveraged the power of Fugaku to generate training data for 20,000 possible tsunami scenarios based on high-resolution simulations. These scenarios were used to train an AI model that takes the offshore waveform data generated by a tsunami and predicts flooding at high spatial resolution before landfall.
Conventional prediction technologies require supercomputers, which makes rapid prediction systems difficult to implement. The new AI model, however, can be run in seconds on ordinary PCs.
When the model was applied to a simulation of tsunami flooding in Tokyo Bay following a large earthquake, it achieved highly accurate predictions on a regular PC within seconds. The predictions matched the flooding produced by the tsunami source models released by the Cabinet Office of Japan.
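The article does not describe the model’s internal architecture. As a rough illustration of the simulation-trained surrogate idea, the sketch below maps offshore gauge waveforms to a coastal inundation-depth grid with a small neural network; all shapes, layer sizes, and data are hypothetical stand-ins, not Fujitsu’s implementation.

```python
# Hypothetical sketch of a simulation-trained tsunami surrogate:
# offshore gauge waveforms in, coastal inundation-depth grid out.
# Shapes, layer sizes, and training data are illustrative only.
import torch
import torch.nn as nn

N_GAUGES, N_TIMESTEPS = 16, 120   # offshore observation points x samples
GRID_H, GRID_W = 64, 64           # coastal inundation grid

class TsunamiSurrogate(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                           # waveforms -> vector
            nn.Linear(N_GAUGES * N_TIMESTEPS, 512),
            nn.ReLU(),
            nn.Linear(512, GRID_H * GRID_W),        # per-cell flood depth
        )

    def forward(self, waveforms: torch.Tensor) -> torch.Tensor:
        return self.net(waveforms).view(-1, GRID_H, GRID_W)

# Training pairs would come from the high-resolution simulations (the
# article mentions 20,000 scenarios); random tensors stand in here.
x = torch.randn(32, N_GAUGES, N_TIMESTEPS)  # simulated offshore waveforms
y = torch.rand(32, GRID_H, GRID_W)          # simulated flooding depths

model = TsunamiSurrogate()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                          # a few illustrative steps
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()

# Once trained, a single forward pass takes well under a second on an
# ordinary PC, which is the property the article highlights.
pred_depths = model(torch.randn(1, N_GAUGES, N_TIMESTEPS))
```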
The research team will continue to make use of Fugaku's high-speed performance in the future by training the system with additional tsunami scenarios. Doing so will help realize AI that can predict tsunami flooding over even wider areas.

Forests with diverse tree sizes and small clearings hinder wildland fire growth

Novel 3D computational study links observable forest characteristics with fire behavior and reveals how forest structure shapes fire propagation
A new 3D analysis shows that wildland fires flare up in forests populated by similar-sized trees or checkerboarded by large clearings and slow down where trees are more varied. The research can help fire managers better understand the physics and dynamics of fire to improve fire-behavior forecasts.
“We knew fuel arrangement affected fire but we didn’t know how,” said Adam Atchley, lead author on a Los Alamos National Laboratory-led study published today in the International Journal of Wildland Fire. “Traditional models that represent simplified fuel structures can’t account for complex wind and varied fire response to actual forest conditions. Our study incorporated a varied, 3D forest and wind behavior. Adding diverse tree sizes and shapes slowed fire quite a bit, as did adding small gaps between trees. By examining the physics of fire-fuel behavior, we are able to see fundamentally how forest structure affects behavior.”
For the first time, the study links fire behavior to generalized forest characteristics that can be easily observed by remote sensing and modeled by machine learning, providing insight even across large forested areas.
Understanding how wildland fire behaves is necessary to curb its spread and to plan safe, effective prescribed burns. However, data is limited, and most studies are too simplistic to accurately predict fire behavior. Predicting how fire will move through a forest requires first painting an accurate picture of a typical forest’s diversity, with varying density, shapes, and sizes of trees. But this is computationally expensive, so most studies target homogeneous forests that rarely occur in nature.
Using its award-winning model, FIRETEC, on high-performance computers at Los Alamos, the team ran 101 simulations with U.S. Forest Service data for Arizona pine forests to realistically represent the variability of forests. The simulations coupled fire and atmospheric factors—such as wind moving through trees—at fine scales to provide a 3D view of how fire, wind, and vegetation interact.
To understand how forest structure affects fire behavior, Atchley and colleagues repeated simulations with minor changes in the forest structure, made by moving trees and randomizing tree shapes. Small changes had a monumental impact on fire behavior. Yet despite this variability, observable forest characteristics, such as tree diversity and the size of a stand of trees or a clearing, also substantially control how fire spreads.
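FIRETEC itself is a large physics code, but the ensemble-perturbation idea is straightforward to illustrate. The sketch below builds 101 variants of a base stand by jittering tree positions and randomizing tree sizes, echoing the study’s 101 simulations; all distributions and parameter values are hypothetical, not the study’s actual setup.

```python
# Sketch of the ensemble-perturbation idea: take a base forest stand and
# generate variants by nudging tree positions and randomizing tree sizes.
# All distributions and parameters are illustrative, not the study's values.
import numpy as np

rng = np.random.default_rng(seed=0)

def base_stand(n_trees: int, extent_m: float) -> np.ndarray:
    """Base stand: columns are x, y, height, crown radius (meters)."""
    xy = rng.uniform(0.0, extent_m, size=(n_trees, 2))
    height = np.full(n_trees, 12.0)   # uniform-sized trees to start
    crown = np.full(n_trees, 2.5)
    return np.column_stack([xy, height, crown])

def perturb(stand: np.ndarray, jitter_m: float, shape_sd: float) -> np.ndarray:
    """One ensemble member: move trees slightly and vary their shapes."""
    out = stand.copy()
    out[:, :2] += rng.normal(0.0, jitter_m, size=out[:, :2].shape)
    out[:, 2] *= rng.lognormal(0.0, shape_sd, size=len(out))  # heights
    out[:, 3] *= rng.lognormal(0.0, shape_sd, size=len(out))  # crowns
    return out

base = base_stand(n_trees=500, extent_m=200.0)
ensemble = [perturb(base, jitter_m=1.0, shape_sd=0.15) for _ in range(101)]
# Each member would then be handed to the fire model; the spread of the
# simulated outcomes shows how sensitive fire behavior is to structure.
```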
Results show that a more detailed and varied simulated forest decreases the forward spread of fire, owing to a combination of fuel discontinuities and increased fine-scale turbulent wind structures. On the other hand, large clearings can increase fire spread.