NIST recently published a new report titled "Further Development of a Conceptual Framework for Assessing Resilience at the Community Scale." I am happy to say that I was a co-author on this report, along with Alexis Kwasinski, Joseph Trainor, Cynthia Chen, and Francis Lavelle. It is my pleasure to share the abstract with you below:

The National Institute of Standards and Technology (NIST) is sponsoring the Community Resilience Assessment Methodology (CRAM) project. The CRAM project team is working in parallel with several other NIST initiatives, including: the Community Resilience Planning Guide for Buildings and Infrastructure Systems (https://www.nist.gov/el/resilience/community-resilience-planning-guides), the Center for Risk-Based Community Resilience Planning (http://resilience.colostate.edu/), and the Community Resilience Panel for Buildings and Infrastructure Systems (https://www.crpanel.org/). The objective of the CRAM project is to develop a foundation for assessing resilience at the community scale. For the purposes of this project, community resilience is defined as “the ability to prepare for and adapt to changing conditions and to withstand and recover rapidly from disruptions” (PPD-21 2013), and a community is defined as “a place designated by geographical boundaries that functions under the jurisdiction of a governance structure, such as a town, city, or county” (NIST 2015). This report continues to develop the concept of community dimensions and services and expands it to the dimensions of sustenance, housing and shelter, relationships, and education.

For this week's TWIST (This week in infrastructure systems) post, I want to do things just a bit differently and focus on a topic that is crucial for any infrastructure system: uncertainty framing.

Of course, it is very difficult to agree on how to define uncertainty, and once it is defined, it can be difficult to select robust tools for managing the types of uncertainties we see in infrastructure systems. Since infrastructures are characterized by long life cycles, large geographic and demographic scope, and substantial interconnections within and between lifeline systems, one wonders how any problems are selected for analysis at all. The web of intricacies faced by analysts and policy makers can be intractable, and the ways that the unknowns influence the likelihoods of the possible consequences make every choice high-stakes. Some professionals call these problems "wicked," and prefer to "muddle through" them, take a garbage can approach, or simply admit that optimal solutions are probably not possible and accept the best feasible option--to our knowledge--at the time. Others call these "deep uncertainties" and even wonder whether resilience analysis is more appropriate than risk analysis for infrastructure systems.
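
To make "managing deep uncertainty" a bit more concrete, here is a minimal sketch, in Python, of one family of robust tools: regret-based screening across an ensemble of plausible futures rather than optimization against a single best-estimate forecast. Everything in it (the three candidate options, the toy performance model, and all of the numbers) is a hypothetical assumption for illustration, not a model from the literature.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ensemble of futures for a lifeline system: demand growth and
# hazard severity are both deeply uncertain, so we sample them broadly.
n_scenarios = 1000
demand_growth = rng.uniform(0.0, 0.05, n_scenarios)  # annual demand growth rate
hazard_scale = rng.uniform(0.5, 2.0, n_scenarios)    # relative hazard severity

def performance(option, growth, hazard):
    """Toy performance model: each option trades capacity against hardening."""
    capacity, hardening = option
    return capacity * (1.0 - 10.0 * growth) + hardening / hazard

# Three made-up investment options, described by (capacity, hardening).
options = {
    "expand_capacity": (1.0, 0.2),
    "harden_assets":   (0.6, 0.8),
    "balanced":        (0.8, 0.5),
}

# Score every option in every sampled future.
scores = {name: performance(opt, demand_growth, hazard_scale)
          for name, opt in options.items()}

# Regret in a scenario = gap to the best achievable score in that scenario;
# the "robust" option is the one with the smallest worst-case regret.
best_per_scenario = np.max(np.vstack(list(scores.values())), axis=0)
worst_regret = {name: float(np.max(best_per_scenario - s))
                for name, s in scores.items()}

print(worst_regret, "->", min(worst_regret, key=worst_regret.get))
```

The point is not the numbers but the shift in framing: instead of asking "which option is optimal?", the analysis asks "which option performs acceptably across the futures we cannot rule out?"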

However you choose to sort all that out, this issue is of critical importance to infrastructure enthusiasts today. In the US, we face a crisis of governance: the public trusts neither government nor experts; the center no longer holds, making it impossible to provide the legislative and political stability needed for public engagement with scientific debates; and our most important issues are fraught with uncertainties that make it impossible to recommend an unequivocal course of action. Of course, infrastructure is impossible without both strong governance and strong science (or trans-science, if you prefer). With that in mind, two articles stood out from Water Resources Research this week:

  • Rival Framings: A Framework for Discovering how Problem Formulation Uncertainties Shape Risk Management Tradeoffs in Water Resources Systems. In this paper, Quinn et al. explore how rival problem (read: uncertainty) framings can lead to unintended consequences as a result of inherent biases in the selected formulation. Of course, this is unavoidable for even modest problems in critical infrastructure systems, so they provide some guidance for carefully exploring the consequences that can be foreseen under alternative problem formulations; a toy sketch of the idea follows this list.
  • Towards best practice framing of uncertainty in scientific publications: a review of Water Resources Research abstracts. In this paper, Guillaume et al. describe how awareness of uncertainty is communicated within WRR abstracts and papers. They develop an uncertainty framing taxonomy that is responsive to five core questions: "Is the conclusion ready to be used?"; "What limitations are there on how the conclusion can be used?"; "How certain is the author that the conclusion is true?"; "How thoroughly has the issue been examined?"; and, "Is the conclusion consistent with the reader’s prior knowledge?". Of course, as the authors acknowledge, the study of uncertainty framing is interdisciplinary, and achieving an uncertainty framing that is responsive to these questions is an art in itself. A second sketch after this list shows one way to use the five questions as a simple reading checklist.
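
As promised in the first item, here is a minimal sketch of the rival-framings idea, with entirely hypothetical policies, objective values, and weights (this is not Quinn et al.'s model): the same two candidate reservoir policies, scored on the same raw objectives, swap ranks depending on how the decision problem is formulated.

```python
# Hypothetical illustration of rival problem framings. The policies, objective
# values, and weights below are made up; only the phenomenon is the point:
# the preferred policy depends on the formulation, not just the data.

policies = {
    "policy_A": {"reliability": 0.98, "flood_damage": 4.0, "shortage_cost": 1.0},
    "policy_B": {"reliability": 0.92, "flood_damage": 1.5, "shortage_cost": 2.5},
}

def framing_1(p):
    """Framing 1: prioritize supply reliability, lightly penalize damages."""
    return p["reliability"] - 0.01 * (p["flood_damage"] + p["shortage_cost"])

def framing_2(p):
    """Framing 2: prioritize avoided damages, give reliability a modest weight."""
    return 0.5 * p["reliability"] - (p["flood_damage"] + p["shortage_cost"])

for name, framing in [("framing_1", framing_1), ("framing_2", framing_2)]:
    ranked = sorted(policies, key=lambda k: framing(policies[k]), reverse=True)
    print(name, "prefers", ranked[0])  # framing_1 -> policy_A, framing_2 -> policy_B
```

And, for the second item, the five questions from Guillaume et al. can be treated as a reading checklist. The sketch below records them in a small data structure; the field names and example answers are my own illustrative assumptions, not the authors' taxonomy verbatim.

```python
from dataclasses import dataclass

@dataclass
class UncertaintyFraming:
    """Checklist built from the five core questions in Guillaume et al.
    The example answers below are hypothetical."""
    ready_to_use: str            # Is the conclusion ready to be used?
    limitations: str             # What limitations are there on how it can be used?
    author_confidence: str       # How certain is the author that the conclusion is true?
    thoroughness: str            # How thoroughly has the issue been examined?
    consistency_with_prior: str  # Is it consistent with the reader's prior knowledge?

example = UncertaintyFraming(
    ready_to_use="Conditionally; screening-level planning only",
    limitations="Calibrated to a single basin; transferability untested",
    author_confidence="Moderate; hedged with 'suggests' and 'may'",
    thoroughness="Sensitivity analysis on two of five key parameters",
    consistency_with_prior="Consistent with earlier regional studies",
)
print(example)
```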

Uncertainty, to me, is both fearsome and beautiful. I hope these two articles, or some of the other links shared, provide some useful thoughts for managing uncertainty in your own study or management of infrastructure systems.

The BP Deepwater Horizon oil spill in the Gulf of Mexico this past year brought to light one of the most unfortunate aspects of the socio-technical systems that define our society. Because of the complexity and technical sophistication of our most critical infrastructures and crucial goods and services, the parties responsible for making regulatory decisions often do not possess the data required to make the risk mitigation and management decisions that offer the most public protection, especially in the context of disaster response and risk management. This becomes even more of a problem when the environment in which these decisions are promulgated is characterized by a lack of trust between the regulator, the regulated, and third-party beneficiaries.

In an environment where trust exists between the regulated and the regulator, opportunities for mutual collaboration toward broader social goals may be more prevalent. These opportunities may also be more likely to be identified, formulated, and implemented in ways that promote more trust and improve overall regulatory and economic efficiency. But when trust is broken, the adversarial nature of the regulatory relationship can bring gridlock.

We are very familiar with the image of gridlock in a transportation network from our time stuck in rush-hour traffic in many of our North American cities, and 2011 has made us more and more acquainted with partisan gridlock in Congress, but what about regulatory gridlock? I am still thinking this one through, but I am borrowing from the idea of economic gridlock developed by Daniel Heller to construct these ideas. In my opinion, regulatory gridlock occurs when, in an adversarial arrangement, the intended consequences of a complex technical system (CTS) are well known and integrated, while the undesirable consequences of a CTS’s deployment are unpredictable and fragmentary. The adversarial relationship makes it nearly impossible to facilitate effective communication between the owners of a CTS that has failed and the stakeholders who are affected. In addition, the adversarial relationship activates a feedback loop between stakeholders' perceived transparency of the CTS innovation cycle and their willingness to accept non-zero risk. As this feedback loop erodes perceived transparency and reduces the willingness to accept risk, risk mitigation becomes less economically effective, and the overall costs to society of CTS management and innovation increase.
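
To make that feedback loop more tangible, here is a toy simulation sketch. The functional forms, coefficients, and initial values are all my own assumptions, chosen only to illustrate the downward spiral described above; this is not an established or calibrated model.

```python
# Toy dynamics of the regulatory-gridlock feedback loop described above.
# All parameters and functional forms are illustrative assumptions.

steps = 20
transparency = 0.6      # stakeholders' perceived transparency of the CTS owner (0-1)
risk_acceptance = 0.6   # stakeholders' willingness to accept non-zero risk (0-1)
adversarial = 0.8       # strength of the adversarial relationship (0-1)

for t in range(steps):
    # Adversarial relations steadily erode both quantities, while each one
    # also pulls the other toward itself (the mutual-reinforcement loop).
    transparency += 0.1 * (risk_acceptance - transparency) - 0.05 * adversarial
    risk_acceptance += 0.1 * (transparency - risk_acceptance) - 0.05 * adversarial
    transparency = min(max(transparency, 0.0), 1.0)
    risk_acceptance = min(max(risk_acceptance, 0.0), 1.0)

    # As acceptance falls, each unit of mitigation spending buys less accepted
    # risk reduction, so the effective cost to society of managing the CTS rises.
    mitigation_effectiveness = 0.5 + 0.5 * risk_acceptance
    societal_cost = 1.0 / mitigation_effectiveness
    print(t, round(transparency, 2), round(risk_acceptance, 2), round(societal_cost, 2))
```

Run it and the numbers head one way: perceived transparency and risk acceptance decay together while the cost index climbs, which is exactly the trap the paragraph above describes.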

In 2012, as economic and political pressure to make government more efficient and to promote economic recovery increases, will the need to navigate this potential gridlock grow? How will we address this challenge while ensuring that the potential for disasters doesn’t divert our focus from the important work of improving our economic and social welfare through technological innovation within our lifeline infrastructures?