
For this week's TWIST (This Week in Infrastructure Systems) post, I want to do things just a bit differently and focus on a topic that is crucial for any infrastructure system: uncertainty framing.

Of course, it is very difficult to agree on how to define uncertainty, and once it's defined, it can be difficult to select robust tools for managing the types of uncertainties we see in infrastructure systems. Since infrastructures are characterized by long life cycles, large geographic and demographic scope, and substantial interconnections within and between lifeline systems, one wonders how any problems get selected for analysis at all. The web of intricacies faced by analysts and policy makers can be intractable, and the ways that the unknowns influence the likelihoods of the possible consequences make every choice high-stakes. Some professionals call these problems "wicked," and prefer to "muddle through" them, take a garbage-can approach, or simply admit that optimal solutions are probably not possible and accept the best feasible option--to our knowledge--at the time. Others call these "deep uncertainties" and even wonder whether resilience analysis is more appropriate than risk analysis for infrastructure systems.

However you choose to sort all that out, this issue is of critical importance to infrastructure enthusiasts today. In the US, we face a crisis of governance: the public trusts neither government nor experts; the center no longer holds, making it impossible to provide the legislative and political stability needed for public engagement with scientific debates; and our most important issues are fraught with uncertainties that preclude any unequivocally recommended course of action. Of course, infrastructure is impossible without both strong governance and strong science (or trans-science, if you prefer). With that in mind, two articles stood out from Water Resources Research this week:

  • Rival Framings: A Framework for Discovering how Problem Formulation Uncertainties Shape Risk Management Tradeoffs in Water Resources Systems. In this paper, Quinn et al. explore how rival problem (read: uncertainty) framings can lead to unintended consequences as a result of the inherent biases in the selected formulation. Of course, this is unavoidable for even modest problems in critical infrastructure systems, so they provide some guidance for carefully exploring the consequences that can be foreseen under alternative problem formulations.
  • Towards best practice framing of uncertainty in scientific publications: a review of Water Resources Research abstracts. In this paper, Guillaume et al. describe how awareness of uncertainty is conveyed in WRR abstracts. They develop an uncertainty framing taxonomy that is responsive to five core questions: "Is the conclusion ready to be used?"; "What limitations are there on how the conclusion can be used?"; "How certain is the author that the conclusion is true?"; "How thoroughly has the issue been examined?"; and "Is the conclusion consistent with the reader’s prior knowledge?". Of course, as the authors acknowledge, the study of uncertainty framing is interdisciplinary, and achieving an uncertainty framing that is responsive to these questions is an art in itself.

Uncertainty, to me, is both fearsome and beautiful. I hope these two articles, or some of the other links shared, provide some useful thoughts for managing uncertainty in your own study or management of infrastructure systems.

There is an exciting new opportunity to affect the field of risk analysis. The Society for Risk Analysis (SRA) Council and the SRA Specialty Group for Foundations in Risk Analysis have constructed a new glossary aimed at developing an authoritative dictionary of terms used in risk analysis. Comments are currently welcome, as the SRA Council is well aware that it may be difficult to agree on just one set of definitions. The description found on the SRA website is as follows:

The Council of the Society of Risk Analysis (SRA) has initiated a work on preparing a new SRA glossary.

A committee has been established to develop the glossary, and a draft version was presented to the SRA Council December 7, 2014. The response was very positive and a plan for how to proceed was approved. The objective is to have a final version ready for approval by the SRA Council in their June 2015 meeting. The committee welcomes comments and suggestions to the draft glossary to further improve the definition texts and incorporate alternative views and perspectives; please send them to terje.aven@uis.no. Deadline 28 February 2015.

To access the draft glossary press here.

Terje Aven
Leader of the Committee

Please be sure to provide your comments by 28 February 2015.

When teaching about risk and uncertainty analysis, one of the questions I often have is "How do my students' worldviews influence their conceptualization of risk?" I thought one place to look for an answer was the theological literature on risk. I felt that some of these scholars might have something to say about this problem, and since I'd recently come across an edited volume by Niels Henrik Gregersen titled Information and the Nature of Reality, I figured he would be a good place to start.

Among the first articles I read was one by Gregersen called "Risk and Religion: Toward a Theology of Risk Taking" [Gregersen (2003), Zygon 38(2), pp. 355-376]. I am not quite finished with the article, but it seems that he is suggesting changing the traditional approach to risk (i.e., Risk = Probability x Consequence) to explicitly indicate that risk is difficult to calculate, since it includes both our evaluation of the dangers and the compound events composed of our responses to the external events that can occur (i.e., Risk = ∫ Pr[outcome | our decisions, external event] Pr[external event] x Evaluation[outcome | our decisions, external event]... or something like this). His discussion then seems to indicate that since the risks we face are increasingly related to our decisions and not the consequences of natural events alone--that is, second-order versus first-order risks--in the long run a risk-welcoming attitude may be more virtuous than a risk-averse one.
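To make the contrast concrete, here is one hedged way of writing the two formulations in LaTeX. The symbols are my own shorthand (d for our decisions, E for an external event, o for an outcome, V for the evaluation function), not Gregersen's notation:

```latex
% Traditional formulation: risk as probability times consequence
\mathrm{Risk} = \Pr[\text{event}] \times \text{Consequence}

% One reading of Gregersen's reformulation: outcomes o arise jointly from
% external events E and our own decisions d, weighed by an evaluation V
\mathrm{Risk}(d) = \int \sum_{o} \Pr[o \mid d, E] \; V(o \mid d, E) \; \Pr[E] \, \mathrm{d}E
```

Under this reading, risk is an expected evaluation over both what the world does and what we decide, which makes the second-order, decision-dependent character explicit.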

While reading, I was struck by so many ideas that I wanted to share immediately with close friends who are familiar with my professional interest in risk. I would read a paragraph, then imagine my friend's response to it. Read another, imagine the response. This article had so much that I could easily see coming up in discussions around the dinner table or at the coffee shop. Ultimately, I felt compelled to stop and share this excerpt with you all publicly:

Risk and fate cannot be pitted against one another, because the former always takes place within the framework of the latter. Expressed in theological terms, the world is created by a benevolent God in such a manner that it invites a risk-taking attitude and rewards it in the long term. Risk taking is a nonzero-sum game. The gifts of risk taking are overall greater than the potential damages, and by risking one’s life one does not take anything away from others; the risk taker explores new territories rather than exploiting the domains of the neighbor. (p. 368)

It is possible that you, reading this now, do not share my theistic worldview. Nonetheless, we must remember that risk taking is a fundamental and worthy component of our human enterprise. I spend almost all my time studying, evaluating, and developing methods to reduce or plan for risks we want to avoid. Most of the time, these risks are imposed on others by the decisions of a third party. Sometimes, these risks are framed as questions of reliability in complex systems. But for me, they are rarely framed as venture.

For me, I think the nature of the risks we often employ professional advice to explore makes us likely to forget this. Risk can, and probably should, only be considered in the context of venture--a great gain deliberately pursued in view of an examined possibility of adverse consequences. While reading this article as part of a larger interest in understanding how worldview frames and addresses risk, I can't help feeling a bit uncomfortable about this statement: I agree with it 100%, yet I feel we don't accept its implications. I think our public interest in the risks associated with complex systems makes this challenging, and I don't have any good answers.

Today, I'm pleased to present a guest entry from SEED Ph.D. student Vikram Rao.  This article, an advance publication from Risk Analysis by Stephanie Chang and colleagues, is an exciting introduction to the use of expert judgment to investigate infrastructure resilience.  Traditionally, expert elicitation is used to estimate probabilities for assessing the vulnerability of a critical system to outages of feeder systems or the incidence of extreme exogenous events.  In this article, Chang and colleagues instead use expert elicitation to assess resilience quantities such as time to recover and disruption to system services over time.  I hope you enjoy this as much as I did, and thank you Vikram for your insights...

This article examines the resilience of infrastructure systems using expert judgments. This is of interest because disasters such as earthquakes can trigger multiple failures across interdependent infrastructure systems. The approach here is to characterize system resilience, understand the relationships between interdependent systems in the context of resilience, and identify ways to improve resilience, which is of interest to risk managers. Many infrastructure systems are considered, including water, electricity, and healthcare.

The researchers use expert judgments in a non-probabilistic approach. One goal is to elicit service disruption levels, given as degree of impact/degree of extent, for numerous sectors. Interdependency diagrams show the dependencies between systems and provide clues as to the cascading nature of disaster events. For example, healthcare is heavily dependent on water, which tells health risk managers that it is advisable to have alternate water sources available in the event of an emergency. One thing I find interesting is that there is no agreement on the extent of infrastructure reliance on water: some studies claim that water is needed for other infrastructures to function, while others do not. So the importance of water in infrastructure resilience remains an open question.
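To make the cascade idea concrete, here is a minimal sketch of an interdependency diagram as a directed graph, with a traversal that traces which systems a single failure can reach. The sectors and dependency links are hypothetical placeholders, not the elicited dependencies from Chang et al.:

```python
# Minimal sketch: an interdependency diagram as a directed graph, plus a
# breadth-first traversal tracing which systems a single failure can reach.
# The sectors and links below are hypothetical, not elicited values.
from collections import deque

# depends_on[x] = systems that x requires in order to function
depends_on = {
    "healthcare": ["water", "electricity"],
    "water": ["electricity"],
    "transportation": ["electricity"],
    "electricity": [],
}

def cascade(failed_system):
    """Return the set of systems that may be disrupted, directly or
    indirectly, when failed_system goes down."""
    # Invert the dependency map: who depends on me?
    dependents = {s: [] for s in depends_on}
    for system, needs in depends_on.items():
        for need in needs:
            dependents[need].append(system)

    affected, queue = set(), deque([failed_system])
    while queue:
        for downstream in dependents[queue.popleft()]:
            if downstream not in affected:
                affected.add(downstream)
                queue.append(downstream)
    return affected

print(cascade("electricity"))  # {'healthcare', 'water', 'transportation'}
```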

When discussing the results, the authors note that the representatives (experts) revised their judgments in the face of new information: experts realized that a system was more important than originally believed, or that interdependencies existed that they had not considered. Since infrastructure systems are so interdependent, and functioning systems are critical for human well-being, greater sharing of information among infrastructure operators is needed going forward.

One area where I would like to see additional research is resilience in water distribution systems, particularly the costs associated with disaster recovery and the time to restore water distribution functionality. We could use expert judgments to examine the quantitative nature of water system resilience, for example by eliciting the cumulative distribution of water functionality as a function of time (e.g., 25% of water functionality restored after 1 week, 75% after 3 weeks), as sketched below. This would of course be valuable to risk managers seeking to understand the nature of water system functionality in the wake of a disaster.
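Such an elicited curve is easy to operationalize. Below is a minimal sketch that turns elicited restoration points into a usable functionality curve; the two interior points come from the example above, while the endpoints and the piecewise-linear interpolation are my assumptions, not an elicitation protocol:

```python
# Minimal sketch: interpolate elicited restoration points into a usable
# curve of water functionality vs. time. The endpoints and interpolation
# choice are assumptions; only the two interior points come from the text.
import numpy as np

# (time in weeks, fraction of water functionality restored)
elicited = [(0.0, 0.0), (1.0, 0.25), (3.0, 0.75), (6.0, 1.0)]  # endpoints assumed
times, restored = (np.array(v) for v in zip(*elicited))

def functionality(t):
    """Piecewise-linear interpolation of restored functionality at time t."""
    return np.interp(t, times, restored)

for week in (0.5, 2.0, 4.0):
    print(f"week {week}: ~{functionality(week):.0%} restored")
```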

I have been away from writing on the blog--even my personal opinions on current research topics (OK, that's what almost all of this writing is)--due to travel, deadlines, and other obligations.  I do want to take the opportunity to announce that a new paper from the SEED research group, co-authored by Dr. Francis and Behailu Bekera, has just been accepted for publication in the journal Reliability Engineering and System Safety.  I am very excited about this because I enjoy reading articles from this journal, and I have found this research community engaging and interesting in person as well as on paper.  I'll write a more "reflective" entry about this sometime later, but if you'd like to take a look at the paper, please find it here.  We will be presenting an earlier version of this work as a thought piece at ESREL 2013.  More on the conference paper closer to the date of the conference in October.

this post is about measuring ignorance.

yes, that sounds weird, but in systems engineering it's a very big deal now.  as our complex engineered systems progress in their reliability, we are trying as hard as possible to make sure we can anticipate all the ways they could possibly fail.  essentially, every day we try to imagine the most outlandish environmental/geopolitical/economic/technical disasters that could happen, put them all together at the same time, and determine whether our systems can withstand that.

naturally, one expects our military to do this as a matter of course: but your humble civil engineer? not sure that's crossed too many folks' minds.  at least it hadn't until Fukushima, Deepwater Horizon, and now Hurricane Sandy occurred. now everyone is wondering how in the world we allowed our systems to perform so poorly under those circumstances.

the problem is not necessarily the safety or reliability of the systems: how often do you and i in the US plan our day around the scheduled load shedding at just about coffee hour? or purchase bottled water to shower in because the water's too dirty to touch? even on the DC Metro or MARC trains, the [relatively] frequent delays are not an important consideration in my daily planning.

the problem is generally that we couldn't possibly anticipate the things that cause our systems to deviate from intended performance.  and short of prophetic revelation, there's not a good way to do that.

there are, however, cool ways to explore possibilities at the edge of the realm of possibility.

i have in mind things like robust decision making, info-gap analysis, portfolio evaluation, and modeling to generate alternatives.  some of these tools are older than i am (modeling to generate alternatives) but are only recently finding wider application via the explosion of bio-inspired computing (e.g., genetic algorithms, particle swarm optimization, ant colony optimization, genetic programming, etc.); others are becoming established tools in the lexicon of risk and systems analysis even as we speak.

for example, info-gap analysis avoids restricting us to decision contexts in which externally valid models can be identified to predict the future.  instead, info-gap computes the robustness and opportuneness of a strategy in light of one's wildest dreams [or nightmares] about the future.  in this way, one can be surprised not only by how bad things might turn out, but also by how good they might be.
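here's a minimal sketch of the robustness half of that idea: given a nominal estimate of an uncertain quantity, find the largest horizon of uncertainty within which a strategy still meets its performance requirement.  the fractional-error uncertainty model, the demand numbers, and the capacity strategies are all toy assumptions of mine, not a canonical info-gap example:

```python
# toy info-gap robustness sketch: how far can demand deviate from our
# nominal estimate before a capacity strategy fails its requirement?
# the uncertainty model and all numbers are invented assumptions.
import numpy as np

demand_nominal = 100.0  # nominal estimate of future demand (arbitrary units)

def worst_case_demand(alpha):
    # fractional-error info-gap model: demand lies in
    # [nominal*(1-alpha), nominal*(1+alpha)]; worst case is the top end
    return demand_nominal * (1.0 + alpha)

def robustness(capacity, alpha_grid=np.linspace(0.0, 2.0, 2001)):
    """largest horizon of uncertainty alpha at which capacity still
    covers the worst-case demand (0 if it fails even at alpha = 0)."""
    feasible = [a for a in alpha_grid if capacity >= worst_case_demand(a)]
    return max(feasible) if feasible else 0.0

for capacity in (90.0, 110.0, 150.0):
    print(f"capacity {capacity:.0f}: robustness alpha-hat = {robustness(capacity):.2f}")
```

the opportuneness calculation is the mirror image: the smallest alpha at which a windfall outcome first becomes possible.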

i personally am more partial to robust decision making, as it uses mathematics and terminology i am a bit more familiar with.  robust decision making enables us to evaluate circumstances in which either we can agree on an externally valid model of the future, or we would like to entertain a range of competing interpretations and assumptions. one starts with a set of alternatives and iterates through the futures that the strategies under consideration might encounter.  after a set of futures has been identified, the areas of vulnerability are identified for each strategy being evaluated.  portfolio evaluation shares many similarities with robust decision making, in that the stakeholders project differing interpretations that may imply competing priorities for the decision context.
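and a correspondingly minimal sketch of that stress-testing loop: sample many plausible futures, evaluate each strategy in each, and flag where it fails.  the two uncertain factors, the performance model, and every number here are invented toys, not an actual RDM case study:

```python
# toy RDM-style stress test: sample futures over wide ranges (no agreed
# probability model), evaluate each capacity strategy, and summarize the
# region of the uncertainty space where it fails. all values are invented.
import numpy as np

rng = np.random.default_rng(0)
n_futures = 1000

# two deeply uncertain factors
base_demand = rng.uniform(80.0, 120.0, n_futures)  # present-day demand
growth = rng.uniform(0.0, 0.05, n_futures)         # annual growth rate
horizon = 20                                       # planning horizon (years)

strategies = {"build_small": 150.0, "build_big": 250.0}  # capacity options

for name, capacity in strategies.items():
    future_demand = base_demand * (1.0 + growth) ** horizon
    fails = future_demand > capacity
    if fails.any():
        # a crude scenario-discovery summary of the vulnerable region
        print(f"{name}: fails in {fails.mean():.0%} of futures; "
              f"failures begin around growth > {growth[fails].min():.3f}")
    else:
        print(f"{name}: succeeds in all sampled futures")
```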

while it is generally agreed across all of these techniques that one of the major weaknesses of existing risk and decision theory is its reliance on probability models to represent possible futures, i can't give up my addiction to Bayesian statistics.  at least when i decide i need rehab, there seems to be a sound selection of medications to choose from.

In this post, Behailu Bekera, a 2nd-year PhD student in the SEED group, discusses the role of robust decision making under deep uncertainty.  This post was inspired by a reading of Louis Anthony Cox's "Confronting Deep Uncertainties in Risk Analysis" in a recent issue of Risk Analysis.

There is no single good model that can be used to assess deep uncertainties; hence, our decisions about complex systems or decision contexts are typically made with insufficient knowledge about the situation. Deep uncertainties are characterized by a multiplicity of possible future events and an unknown future. In other words, when we cannot precisely anticipate undesired future events and make the necessary preparations, we are in a decision context with deep uncertainty. In this article, Cox highlights ideas from robust optimization, adaptive control, and machine learning that seem promising for dealing with deep uncertainties in risk analysis.

Using multiple models and relevant data to improve decisions, averaging forecasts, resampling data to allow robust statistical inferences despite model uncertainty, adaptive sampling and modeling, and Bayesian model averaging are some of the tools that can assist in robust risk analysis involving deep uncertainties.
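As a flavor of the simplest of these ideas, here is a minimal sketch of forecast averaging: fit several rival models, weight each by its held-out accuracy, and combine their predictions. The models, synthetic data, and inverse-MSE weighting rule are illustrative assumptions, not examples from Cox's paper:

```python
# Minimal sketch of forecast averaging across rival models, with weights
# from held-out accuracy; the models and data are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 5.0 + rng.normal(0.0, 2.0, x.size)  # synthetic observations

train, test = slice(0, 35), slice(35, 50)

# rival models: polynomial fits of different orders (competing assumptions)
models = {deg: np.polyfit(x[train], y[train], deg) for deg in (1, 2, 3)}

# weight each model by inverse mean squared error on the held-out data
weights = {deg: 1.0 / np.mean((np.polyval(c, x[test]) - y[test]) ** 2)
           for deg, c in models.items()}
total = sum(weights.values())

# weighted-average forecast at a new point
x_new = 12.0
forecast = sum((w / total) * np.polyval(models[deg], x_new)
               for deg, w in weights.items())
print(f"weighted-average forecast at x = {x_new}: {forecast:.1f}")
```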

These robust risk analysis techniques shift the focus of risk analysis from its more passive aspects, such as identifying the likelihoods of events (disruptions) and their associated [estimated] consequences, to more action-oriented questions, such as how we should act now to effectively mitigate the occurrence or consequences of events with highly undesirable effects in the future.

Robust decision making, for instance, is used by developing countries to identify potential large-scale infrastructure development projects and investigate possible vulnerabilities that require the close attention of all stakeholders.  Additionally, adaptive risk management may be used to maintain a reliable network structure that ensures service continuity despite failures. These sorts of techniques can be especially important in the areas of critical infrastructure protection and resilience.

Through these emerging methods, Dr. Cox makes important suggestions for making robust decisions in the face of deep uncertainty and incomplete or inadequate knowledge.  This will be an important paper for those looking to advance the science of robust decision-making and risk analysis.

Recently in a SEED group paper discussion, we revisited the words of Stan Kaplan in his 1997 "The Words of Risk Analysis," a transcript of a plenary lecture delivered at an Annual Meeting of the Society for Risk Analysis.  SEED Ph.D. student Vikram Rao presents some highlights from this article that I'm sure you'll enjoy as well.

I enjoyed the Kaplan article. I liked how it was written in an informal style, which helped the article flow well and made it easy to understand. It was a good introduction to risk analysis. It asked the three questions needed for a risk assessment--What can happen? How likely is it? What are the consequences?--which constitute the risk triplet. The diagrams were helpful, especially the dose-response curve and the evidence-based approach, and the diagrams explaining the decision-making process with QRA were especially useful for getting an overview of the whole process.
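Kaplan's triplet also maps naturally onto a data structure. Here is a hedged sketch of a risk register built from the three questions; the scenarios, probabilities, and consequences are invented placeholders:

```python
# A minimal risk register built on Kaplan's triplet: each entry answers
# "What can happen?", "How likely is it?", and "What are the consequences?"
# All scenarios and numbers below are invented placeholders.
from dataclasses import dataclass

@dataclass
class RiskTriplet:
    scenario: str       # what can happen?
    probability: float  # how likely is it? (per year)
    consequence: float  # what are the consequences? (e.g., $M loss)

register = [
    RiskTriplet("pipe break in a main line", 0.10, 2.0),
    RiskTriplet("regional power outage", 0.02, 15.0),
    RiskTriplet("contamination event", 0.005, 40.0),
]

# One simple summary over the register: expected annual loss
expected_loss = sum(t.probability * t.consequence for t in register)
print(f"expected annual loss: ${expected_loss:.2f}M")  # $0.70M
```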

We need to recognize that risk assessments are going to be a bigger part of our decision-making process, considering the complexity of today's systems. Systems such as aircraft, cars, and microprocessors have so many parts that their complexity is far greater than before. Mitigating risks is key to building successful complex systems: we need to be able to identify risks and have strategies for overcoming them. We can do this by eliciting expert opinions, running simulations, and increasing our knowledge base.

We also see a rise in risks that have consequences on a national and global scale, such as global warming and climate change. By recognizing the vast importance of effective risk mitigation strategies today, we can prepare ourselves well for the challenges of the future.

Today's plenary lunch included two interesting, high-level talks on several different dimensions of public health risk. While MacIntyre focused on bioterrorism, Flahault was more wide-ranging, offering a general vision for changes in public health systems.

On my international travels over the last few weeks, I have been fascinated by the diversity of approaches to risk. While you should by no means generalize from my remarks, it seems there are a couple of camps: some focus on behavior modification and regulation, while others focus on the role of individual agents as key actors in hazard exposures. In addition, engineers approach problems quite differently from basic scientists, and they in turn from social scientists and government agents. Of course there is much overlap. As a result, the foci of the technical presentations can vary quite widely.

I would say that engineers and basic scientists use scenario-based approaches such as PRA, fault trees, and influence diagrams in their studies, while more social-science-inclined professionals focus on the role of institutions in risk management and framing. Although we speak the same language at 30,000 feet, the diversity in the details is truly fascinating.

My pressing question is: how do we get folks involved in this earlier in life? How do we discuss the world of risk in a way that lets kids and young adults see the drama involved in discovering the dangers and uncertainties germane to modern and global life?

Today, Dr. Francis is giving a talk titled "Two Studies in Using Graphical Models for Infrastructure Risk Models," discussing some recent peer-reviewed conference papers presented at ICVRAM and PSAM11/ESREL12.  The abstract for today's talk is:

In this talk, I will discuss the use of Bayesian Belief Networks (BBNs) and Classification and Regression Trees (CART) for infrastructure risk modeling.  In the first case study, we focus on supporting risk models used to quantify economic risk due to hurricane damage to building stock. The increasingly complex interaction between natural hazards and human activities requires more accurate data to describe the regional exposure to potential loss from physical damage to buildings and infrastructure. While databases contain information on the distribution and features of the building stock, infrastructure, transportation, etc., it is not unusual for portions of the information to be missing from the available databases. Missing or low-quality data compromise the validity of regional loss projections. Consequently, this paper uses Bayesian Belief Networks and Classification and Regression Trees to populate the missing information in a database based on the structure of the available data. In the second case study, we use BBNs to construct a knowledge model for pipe breaks in a water zone.  BBN modeling is a critical step towards real-time distribution system management.  Development of expert systems for analyzing real-time data is not only important for pipe break prediction, but is also a first step in preventing water loss and water quality deterioration through the application of machine learning techniques to facilitate real-time distribution system monitoring and management.  Our model is based on pipe breaks and covariate data from a mid-Atlantic United States (U.S.) drinking water distribution system network. The expert model is learned using a conditional independence test method, a score-based method, and a hybrid method, then subjected to 10-fold cross-validation based on log-likelihood scores.
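For readers unfamiliar with the validation step at the end of the abstract, here is a minimal, self-contained sketch of 10-fold cross-validation on held-out log-likelihood. It uses a fixed toy structure (pipe material and age as parents of a break indicator) and synthetic data, rather than the learned structures or the real covariate data from the talk:

```python
# Self-contained sketch of the cross-validation step: estimate the CPT
# Pr[break | material, age] by smoothed counts on training folds and score
# held-out folds by log-likelihood. Structure, variables, and data are
# invented stand-ins for the talk's pipe-break model.
import math
import random
from collections import Counter

random.seed(0)
materials, ages = ["cast_iron", "pvc"], ["old", "new"]

# synthetic records: (material, age, break?) with made-up break propensities
data = []
for _ in range(500):
    m, a = random.choice(materials), random.choice(ages)
    p_break = 0.3 if (m == "cast_iron" and a == "old") else 0.05
    data.append((m, a, int(random.random() < p_break)))

def fit(train):
    """CPT for Pr[break | material, age] with Laplace smoothing."""
    breaks, totals = Counter(), Counter()
    for m, a, b in train:
        totals[(m, a)] += 1
        breaks[(m, a)] += b
    return {k: (breaks[k] + 1) / (totals[k] + 2) for k in totals}

def log_likelihood(cpt, test):
    ll = 0.0
    for m, a, b in test:
        p = cpt.get((m, a), 0.5)  # uniform fallback for unseen combinations
        ll += math.log(p if b else 1.0 - p)
    return ll

# 10-fold cross-validation on held-out log-likelihood
folds = [data[i::10] for i in range(10)]
scores = []
for i in range(10):
    train = [row for j, fold in enumerate(folds) if j != i for row in fold]
    scores.append(log_likelihood(fit(train), folds[i]))

print(f"mean held-out log-likelihood per fold: {sum(scores) / 10:.1f}")
```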

This talk is hosted by Ketra Schmitt of the Centre for Engineering in Society in the Faculty of Engineering and Computer Science.