
Recently, NIST has published a new report titled "Further Development of a Conceptual Framework for Assessing Resilience at the Community Scale." I am happy to say that I was a co-author on this report with Alexis Kwasinski, Joseph Trainor, Cynthia Chen, and Francis Lavelle. It is my pleasure to share with you the abstract below:

The National Institute of Standards and Technology (NIST) is sponsoring the Community Resilience Assessment Methodology (CRAM) project. The CRAM project team is working in parallel with several other NIST initiatives, including: the Community Resilience Planning Guide for Buildings and Infrastructure Systems (https://www.nist.gov/el/resilience/community-resilience-planning-guides), the Center for Risk-Based Community Resilience Planning (http://resilience.colostate.edu/), and the Community Resilience Panel for Buildings and Infrastructure Systems (https://www.crpanel.org/). The objective of the CRAM project is to develop a foundation for assessing resilience at the community scale. For the purposes of this project, community resilience is defined as “the ability to prepare for and adapt to changing conditions and to withstand and recover rapidly from disruptions” (PPD-21 2013), and a community is defined as “a place designated by geographical boundaries that functions under the jurisdiction of a governance structure, such as a town, city, or county” (NIST 2015). This report continues to develop the concept of community dimensions and services and expands it to the dimensions of sustenance, housing and shelter, relationships, and education.

For this week's TWIST (This week in infrastructure systems) post, I want to do things just a bit differently and focus on a topic that is crucial for any infrastructure system: uncertainty framing.

Of course, it is very difficult to agree on how to define uncertainty, and once it's defined, it can be difficult to select robust tools for managing the types of uncertainties we see in infrastructure systems. Since infrastructures are characterized by long life cycles, large geographic and demographic scope, and substantial interconnections within and between lifeline systems, one wonders how any problems are selected for analysis at all. The web of intricacies faced by analysts and policy makers can be intractable, and the ways that the unknowns influence the likelihoods of the possible consequences make every choice high-stakes. Some professionals call these problems "wicked," and prefer to "muddle through" them, take a garbage-can approach, or simply admit that optimal solutions are probably not possible and accept the best feasible option--to our knowledge--at the time. Others call these "deep uncertainties" and even wonder whether resilience analysis is more appropriate than risk analysis for infrastructure systems.

However you choose to sort all that out, this issue is of critical importance to infrastructure enthusiasts today. In the US, we face a crisis of governance: the public trusts neither government nor experts; the center no longer holds, making it impossible to provide the legislative and political stability needed for public engagement with scientific debates; and our most important issues are fraught with uncertainties that rule out any unequivocally recommended course of action. Of course, infrastructure is impossible without both strong governance and strong science (or trans-science, if you prefer). With that in mind, two articles stood out from Water Resources Research this week:

  • Rival Framings: A Framework for Discovering how Problem Formulation Uncertainties Shape Risk Management Tradeoffs in Water Resources Systems. In this paper, Quinn et al. explore how rival problem framings (read: uncertainty framings) can lead to unintended consequences as a result of biases inherent in the selected formulation. Of course, such bias is unavoidable for even modest problems in critical infrastructure systems, so the authors provide guidance for carefully exploring the consequences that can be foreseen under alternative problem formulations.
  • Towards best practice framing of uncertainty in scientific publications: a review of Water Resources Research abstracts. In this paper, Guillaume et al. describe how awareness of uncertainty is conveyed in WRR abstracts. They develop an uncertainty framing taxonomy that is responsive to five core questions: "Is the conclusion ready to be used?"; "What limitations are there on how the conclusion can be used?"; "How certain is the author that the conclusion is true?"; "How thoroughly has the issue been examined?"; and "Is the conclusion consistent with the reader’s prior knowledge?". Of course, as the authors acknowledge, the study of uncertainty framing is interdisciplinary, and achieving an uncertainty framing that is responsive to these questions is an art in itself.

Uncertainty, to me, is both fearsome and beautiful. I hope these two articles, or some of the other links shared, provide some useful thoughts for managing uncertainty in your own study or management of infrastructure systems.

Today, I'm pleased to present a guest entry from SEED Ph.D. student Vikram Rao.  This article, available in advance from Risk Analysis by Stephanie Chang and colleagues, is an exciting introduction to the use of expert judgment to investigate infrastructure resilience.  Traditionally, expert elicitation is used to estimate probabilities for assessing the vulnerability of a critical system to outages of feeder systems or to extreme exogenous events.  In this article, Chang and colleagues instead use expert elicitation to assess resilience quantities such as time to recover and disruption to system services over time.  I hope you enjoy this as much as I did, and thank you Vikram for your insights...

This article examines the resilience of infrastructure systems using expert judgments. This is of interest because infrastructure systems are interdependent, so disasters such as earthquakes can trigger multiple, cascading failures. The approach here is to characterize system resilience, understand the relationships between interdependent systems in the context of resilience, and identify ways to improve resilience, which is of interest to risk managers. Many infrastructure systems are considered, including water, electricity, and healthcare.

The researchers use expert judgments in a non-probabilistic approach. One goal is to elicit service disruption levels, expressed as degree of impact/degree of extent, for numerous sectors. Interdependency diagrams show the dependencies between systems and provide clues to the cascading nature of disaster events. For example, healthcare is heavily dependent on water, which tells health risk managers that it is advisable to have alternate water sources available in the event of an emergency. One thing I find interesting is that there is no agreement on the extent of infrastructure reliance on water: some studies claim that water is needed for other infrastructures to function, while others do not. So the importance of water in infrastructure resilience remains an open question.
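Interdependency relationships of this kind are naturally represented as a directed graph. Here is a minimal sketch (using hypothetical dependency data, not the elicited values from the article) of tracing which systems an outage could cascade to:

```python
from collections import deque

# Hypothetical example data: "B -> [A, ...]" means each system A depends
# on B, so an outage of B may cascade to A. Not from the article.
dependents = {
    "water": ["healthcare", "fire_response"],
    "electricity": ["water", "healthcare", "communications"],
    "communications": ["healthcare"],
}

def cascade(failed_system):
    """Return all systems potentially disrupted by an outage, via BFS."""
    impacted, queue = set(), deque([failed_system])
    while queue:
        system = queue.popleft()
        for dep in dependents.get(system, []):
            if dep not in impacted:
                impacted.add(dep)
                queue.append(dep)
    return impacted

print(sorted(cascade("electricity")))
# -> ['communications', 'fire_response', 'healthcare', 'water']
```

Even this toy version shows why water looms large in such diagrams: any system upstream of water inherits all of water's dependents.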

When discussing the results, the authors note that the representatives (experts) revise their judgments in the face of new information. Experts may realize that a system is more important than they originally believed, or that interdependencies exist that they had not considered. Since infrastructure systems are so interdependent, and functional systems are critical for human well-being, sharing information across infrastructure sectors is needed going forward.

One area where I would like to see additional research is resilience in water distribution systems, particularly the costs associated with disaster recovery and the time to restore water distribution functionality. We could use expert judgments to examine the quantitative nature of water system resilience, for example by eliciting the cumulative distribution of water functionality as a function of time (e.g., 25% of water functionality restored after 1 week, 75% after 3 weeks). This would be valuable to risk managers seeking to understand the nature of water system functionality in the wake of a disaster.
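As a sketch of what such an elicitation could feed into (the numbers below are hypothetical, not drawn from any study), a handful of elicited points can be interpolated into a continuous restoration curve:

```python
# Hypothetical elicited points: (weeks after disaster, fraction of
# water functionality restored). Illustrative values only.
elicited = [(0, 0.0), (1, 0.25), (3, 0.75), (6, 1.0)]

def restored_fraction(t):
    """Piecewise-linear interpolation of the elicited restoration curve."""
    for (t0, f0), (t1, f1) in zip(elicited, elicited[1:]):
        if t0 <= t <= t1:
            return f0 + (f1 - f0) * (t - t0) / (t1 - t0)
    return 1.0 if t > elicited[-1][0] else 0.0

print(restored_fraction(2))  # midway between weeks 1 and 3 -> 0.5
```

A risk manager could then read off, say, the expected functionality two weeks out, or invert the curve to estimate when a target service level is reached.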

At this year's European Safety and Reliability Association (ESRA) annual meeting, ESREL 2013, Dr. Francis presented a discussion paper co-authored with GW EMSE Ph.D. student Behailu Bekera on the potential for using an entropy-weighted resilience metric for prioritizing risk mitigation investments under deep uncertainty.

The goal of this metric is to build upon tools such as info-gap decision theory, robust decision making, scenario and portfolio analysis, and modeling to generate alternatives in order to develop resilience metrics that account for two ideas.  First, we felt that if an event is very extreme, it would be quite difficult to prepare for that event whether or not the event was correctly predicted.  Thus, resilience should be "discounted" by the extremeness of the event.  Second, we felt that if an event is characterized by an uncertainty distribution obtained through expert judgment, the degree to which the experts disagree should also "discount" the resilience score.  In our paper, the goal was to present the entropy-weighted metric we'd developed in our RESS article to the ESREL audience in order to engender some discussion about how to evaluate resilience under these conditions.  This work was inspired by a talk by Woody Epstein of Scandpower Japan that Dr. Francis attended at PSAM11/ESREL12 in Helsinki.
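To illustrate the second idea only (a rough sketch with made-up numbers, not the formulation in the RESS article), the Shannon entropy of an expert-elicited distribution can be normalized and used to discount a resilience score, so that greater disagreement yields a lower effective score:

```python
import math

def normalized_entropy(probs):
    """Shannon entropy of a discrete distribution, scaled to [0, 1]."""
    h = -sum(p * math.log(p) for p in probs if p > 0)
    return h / math.log(len(probs))

def discounted_resilience(score, probs):
    """Discount a resilience score by expert disagreement (entropy)."""
    return score * (1 - normalized_entropy(probs))

consensus = [0.9, 0.05, 0.05]      # experts largely agree on one outcome
split = [1 / 3, 1 / 3, 1 / 3]      # maximal disagreement across outcomes

print(discounted_resilience(0.8, consensus))  # most of the 0.8 survives
print(discounted_resilience(0.8, split))      # ~0.0: fully discounted
```

The design choice here is that a uniform distribution (maximum entropy) drives the effective resilience to zero, encoding the view that a score backed by no expert consensus should carry no weight.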

The paper and short slides now appear on the publications page of the SEED research blog.

I have been away from writing on the blog, even my personal opinions on current research topics (OK, that's what almost all of this writing is) due to travel, deadlines, and other obligations.  I do want to take an opportunity to announce that a new paper from the SEED research group co-authored by Dr. Francis and Behailu Bekera has just been accepted for publication in the journal Reliability Engineering and System Safety.  I am very excited about this, because I enjoy reading articles from this journal, and have found this research community engaging and interesting in person, as well as on paper.  I'll write a more "reflective" entry about this sometime later, but if you'd like to take a look at the paper, please find it here.  We will be presenting an earlier version of this work as a thought piece at ESREL 2013.  More on the conference paper closer to the date of the conference in October.

Today, I'm presenting a guest post from Behailu Bekera, a first-year EMSE PhD student working in the SEED Group.  He is studying the relationship between risk-based and resilience-based approaches to systems analysis.

Resilience is defined as the capability of a system with specific characteristics before, during and after a disruption to absorb the disruption, recover to an acceptable level of performance, and sustain that level for an acceptable period of time. Resilience is an emerging approach to safety. Conventional risk assessment methods are typically used to determine the negative consequences of potential undesired events, to understand the nature of the risk involved, and to reduce its level. In contrast, the resilience approach emphasizes anticipating potential disruptions, giving appropriate attention to perceived danger, and establishing response behaviors aimed at either building the capacity to withstand a disruption or recovering as quickly as possible after an impact. Anticipation refers to the ability of a system to know what to expect and prepare itself accordingly in order to effectively withstand disruptions. The ability to detect the signals of an imminent disruption is captured by the attentive property of resilience. Once the impact takes place, the system must know how to respond efficiently with the aim of a quick rebound.
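One common way to quantify this absorb-and-recover behavior (an illustrative sketch, not a formulation from Behailu's work) is to compare delivered performance against nominal performance over an observation window:

```python
def resilience(performance, nominal=1.0):
    """Ratio of delivered to nominal performance over the window.

    `performance` is a list of performance levels sampled at equal
    time steps; 1.0 means no degradation at all.
    """
    return sum(performance) / (nominal * len(performance))

# Hypothetical trajectory: disruption at t=2, full recovery by t=5.
trajectory = [1.0, 1.0, 0.25, 0.5, 0.75, 1.0, 1.0, 1.0]
print(resilience(trajectory))  # -> 0.8125
```

Under this view, a deeper absorbed shock or a slower recovery both shrink the area under the performance curve, and hence the score, which matches the definition above of absorbing, recovering, and sustaining performance.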

Safety, as we know it traditionally, is usually considered something a system or an organization possesses, as evidenced by measurements of failure probability, risk, and so on. Concerning the new approach, Hollnagel and Woods argue that safety is something an organization or a system does. Seen from a resilience point of view, safety is a characteristic of how a system performs in the face of disruptions: how it can absorb or dampen impacts, or how quickly it can reinstate itself after suffering a perturbation.

Resilience may allow for a more proactive approach for handling risk. It puts the system on a path of continuous performance evaluation to ensure safety at all times. Resilient systems will be flexible enough to accommodate different safety issues in multiple dimensions that may arise and also robust enough to maintain acceptable performance.