
this post is about measuring ignorance.

yes, that sounds weird, but in systems engineering it's a very big deal now.  as our complex engineered systems become more reliable, we try as hard as possible to anticipate all the ways they can possibly fail.  essentially, every day we try to imagine the most outlandish environmental/geopolitical/economic/technical disasters that could happen, put them all together at the same time, and determine whether our systems can withstand them.

naturally, one expects our military to do this as a matter of course: but your humble civil engineer? not sure that's crossed too many folks' minds.  at least not until Fukushima, Deepwater Horizon, and now Hurricane Sandy.  now everyone is wondering how in the world we allowed our systems to perform so poorly under those circumstances.

the problem is not necessarily the safety or reliability of the systems: how often do you and i in the US plan our day around the scheduled load shedding at just about coffee hour? or purchase bottled water to shower in because the water's too dirty to touch? even on the DC Metro or MARC trains, the [relatively] frequent delays are not an important consideration in my daily planning.

the problem is generally that we couldn't possibly anticipate the things that cause our systems to deviate from intended performance.  and short of prophetic revelation, there's not a good way to do that.

there are, however, cool ways to explore the edges of the realm of the possible.

i have in mind things like robust decision making, info-gap analysis, portfolio evaluation, and modeling to generate alternatives.  some of these tools are older than i am (modeling to generate alternatives) but are only recently finding wider application via the explosion of bio-inspired computing (e.g., genetic algorithms, particle swarm optimization, ant colony optimization, genetic programming, etc.); others are becoming established tools in the lexicon of risk and systems analysis even as we speak.
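to make the modeling-to-generate-alternatives idea concrete, here's a minimal sketch (my own toy illustration, not from any particular paper): given a minimization problem and its known optimum, we search for solutions that stay near-optimal but are maximally different in decision space.  real MGA implementations would use the evolutionary algorithms mentioned above; the random search and all names here are stand-ins.

```python
import random

def mga_random_search(objective, x_best, slack=0.1, n_samples=20000, seed=42):
    """toy modeling-to-generate-alternatives: among random candidates whose
    objective value stays within `slack` of the optimum, return the one
    farthest from the known best solution in decision space."""
    rng = random.Random(seed)
    f_best = objective(x_best)
    best_alt, best_dist = None, -1.0
    for _ in range(n_samples):
        x = [rng.uniform(-2.0, 2.0) for _ in x_best]
        if objective(x) <= f_best + slack:  # still near-optimal (minimizing)
            d = sum((a - b) ** 2 for a, b in zip(x, x_best)) ** 0.5
            if d > best_dist:
                best_alt, best_dist = x, d
    return best_alt, best_dist
```

the alternative that comes back performs almost as well as the optimum but looks as different as possible — exactly the kind of "unmodeled option" MGA is meant to surface for decision makers.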

for example, info-gap analysis does not restrict us to decision contexts in which externally valid models can be identified to predict the future.  instead, it computes the robustness and opportuneness of a strategy in light of one's wildest dreams [or nightmares] about the future.  in this way, one can be surprised not only by how bad things might turn out, but also by how good they might be.
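as a toy illustration (my sketch, not a faithful rendering of the full info-gap formalism), take a scalar uncertain parameter with nominal value u_nominal and nested uncertainty intervals [u_nominal − α, u_nominal + α].  robustness is the largest horizon α at which the worst case still meets a critical requirement; opportuneness is the smallest horizon at which the best case reaches a windfall.  the grid search below assumes the reward varies monotonically enough that checking the interval endpoints suffices:

```python
def robustness(reward, u_nominal, r_critical, alpha_max=10.0, steps=1000):
    """largest uncertainty horizon alpha at which the worst-case reward over
    [u_nominal - alpha, u_nominal + alpha] still meets r_critical."""
    best = 0.0
    for i in range(1, steps + 1):
        alpha = alpha_max * i / steps
        worst = min(reward(u_nominal - alpha), reward(u_nominal + alpha))
        if worst >= r_critical:
            best = alpha
        else:
            break
    return best

def opportuneness(reward, u_nominal, r_windfall, alpha_max=10.0, steps=1000):
    """smallest uncertainty horizon beta at which the best-case reward
    reaches the windfall level r_windfall."""
    for i in range(steps + 1):
        beta = alpha_max * i / steps
        best_case = max(reward(u_nominal - beta), reward(u_nominal + beta))
        if best_case >= r_windfall:
            return beta
    return float("inf")
```

a strategy with a large robustness horizon can tolerate a lot of surprise before failing its requirement; a small opportuneness horizon means only a little surprise is needed for things to turn out wonderfully.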

i personally am more partial to robust decision making, as it uses mathematics and terminology i am a bit more familiar with.  robust decision making lets us evaluate circumstances in which we either agree on an externally valid model of the future or want to entertain a range of competing interpretations and assumptions.  one starts with a set of alternatives and iterates through the futures that the strategies under consideration might face; once a set of futures has been identified, the areas of future vulnerability are identified for each strategy being evaluated.  portfolio evaluation shares many similarities with robust decision making, in that stakeholders project differing interpretations that may imply competing priorities for the decision context.
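the regret bookkeeping at the core of robust decision making can be sketched in a few lines (my own toy code; real RDM studies layer scenario discovery and much richer models on top): evaluate every strategy in every future, compute each strategy's regret relative to the best performer in that future, and prefer the strategy whose worst-case regret is smallest.

```python
def rdm_regret_table(strategies, futures, performance):
    """evaluate each strategy in every future, then compute regret: the gap
    between that strategy's performance and the best performance achievable
    in the same future (larger performance = better)."""
    perf = {(s, f): performance(s, f) for s in strategies for f in futures}
    regret = {}
    for f in futures:
        best = max(perf[(s, f)] for s in strategies)
        for s in strategies:
            regret[(s, f)] = best - perf[(s, f)]
    return regret

def most_robust(strategies, futures, regret):
    """minimax-regret choice: the strategy whose worst-case regret is smallest."""
    return min(strategies, key=lambda s: max(regret[(s, f)] for f in futures))
```

a strategy that wins big in calm futures but collapses in a storm loses here to one that does acceptably everywhere — which is the whole point of seeking robustness rather than optimality.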

while in all of these techniques it is generally agreed that one of the major weaknesses of existing risk and decision theory is its reliance on probability models of possible futures, i can't give up my addiction to Bayesian statistics.  at least when i decide i need rehab, there seems to be a sound selection of medications to choose from.

In this post, Behailu Bekera, a 2nd-year PhD student in the SEED group, discusses the role of robust decision making under deep uncertainties.  This post was inspired by a reading of Louis Anthony Cox's "Confronting Deep Uncertainties in Risk Analysis" in a recent issue of Risk Analysis.

There is no single good model for assessing deep uncertainties, so our decisions about complex systems are typically made with insufficient knowledge of the situation.  Deep uncertainties are characterized by a multiplicity of possible events and an unknown future: a context in which we cannot precisely anticipate undesired future events, yet must still prepare for them, is a decision context with deep uncertainty.  In this article, Tony highlights ideas from robust optimization, adaptive control, and machine learning that seem promising for dealing with deep uncertainties in risk analysis.

Some of the tools that can assist in robust risk analysis involving deep uncertainties include using multiple models and relevant data to improve decisions, averaging forecasts, resampling data to allow robust statistical inferences despite model uncertainty, adaptive sampling and modeling, and Bayesian model averaging for statistical estimation.
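As a small illustration of the Bayesian model averaging idea (a generic sketch, not the specific procedure in Cox's paper; the models and likelihood below are invented for the example), each candidate model's forecast is weighted by its posterior probability, p(M|D) ∝ p(D|M)·p(M), and the weighted forecasts are averaged:

```python
import math

def bma_forecast(models, priors, data, likelihood, x_new):
    """Bayesian model averaging sketch: weight each model's forecast by its
    posterior probability p(M|D) proportional to p(D|M) * p(M), then average."""
    evidence = [likelihood(m, data) * p for m, p in zip(models, priors)]
    total = sum(evidence)
    weights = [e / total for e in evidence]
    forecast = sum(w * m(x_new) for w, m in zip(weights, models))
    return forecast, weights
```

A model that fits the data poorly ends up with a small posterior weight, so its forecast contributes little — no single model ever has to be declared "the" model, which is precisely the appeal under model uncertainty.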

Robust risk analysis techniques shift the focus of risk analysis from its passive aspects, such as identifying likely events (disruptions) and their associated [estimated] consequences, to more action-oriented questions, such as how we should act now to effectively mitigate the occurrence or consequences of events with highly undesirable effects in the future.

Robust decision making, for instance, is used by developing countries to identify potential large-scale infrastructure development projects and to investigate possible vulnerabilities that require the careful attention of all stakeholders.  Additionally, adaptive risk management may be used to maintain a reliable network structure that ensures service continuity despite failures.  These sorts of techniques can be considerably important in the areas of critical infrastructure protection and resilience.

Through these emerging methods, Dr. Cox makes important suggestions for making robust decisions in the face of deep uncertainty and incomplete knowledge.  This will be an important paper for those looking to advance the science of robust decision-making and risk analysis.