
This week, I wanted to put together some thoughts on the National Flood Insurance Program (NFIP) gathered from across the web. Actually, this is the only story I've wanted to write about, but I haven't sat down for my #TWIST note in about a month.

In fact, when we're talking about infrastructure and infrastructure resilience, I'm not sure there has been any story worth looking into other than hurricane impacts on communities and infrastructure. Hurricanes Harvey and Irma are crucial because, if nothing else, they demonstrate that community resilience has everything to do with the capacity of communities and local government leaders to respond in the moment and adapt to future possibilities. While it is important to build infrastructure that can withstand a variety of challenges, there are two things we must consider when planning for resilience. First, infrastructure is almost impossible to adapt after it is built. One can harden existing infrastructure, but infrastructure is, almost by definition, un-adaptive. Second, the radically decentralized way in which infrastructure is owned and built, especially in the United States (and I include buildings in infrastructure, which many researchers do not), makes it nearly impossible to forecast the types of loads that individual systems will be called on to bear.

Well, I don't want to go too far in that direction. However, I did want to share some stories about the NFIP because we are going to need to call on this program more frequently and more deeply in the future. What are the major issues? Is the program vulnerable? Are folks who rely on the program vulnerable? What kind of losses will it be called on to insure in the future? Hopefully, a few of the articles/resources below can shed some light on the state of the NFIP as we enter into new climate realities.

  • Irma, Harvey, Jose, Katia: The Costliest Year Ever? Bloomberg asks whether Harvey et al. will be among the costliest disasters ever. A snapshot from their article shows that, globally, American hurricanes account for five of the 10 most costly events in terms of insured losses. Where will this year's hurricane season rank?

    The 10 most costly global disasters in terms of insured losses (in billions). Source: https://www.bloomberg.com/graphics/2017-costliest-insured-losses/
  • Hurricane Sandy Victims: Here’s What ‘Aid’ Irma and Harvey Homeowners Should Expect. While it is critical to re-authorize NFIP and help ensure that families receive the aid they need, it is unclear whether NFIP in its current form can deliver that assistance. Writing in Fortune about the efforts of a group called Stop FEMA Now to raise awareness of what it sees as the major shortcomings of NFIP, Kirsten Korosec writes:

Stop FEMA Now is a non-profit organization that launched after flood insurance premiums spiked as a result of the Biggert-Waters Act of 2012, inaccurate or incomplete FEMA flood maps, and what it describes as questionable insurance risk and premium calculations by actuaries, according to the group.

  • The NAIC has published a very interesting report showing that, in the average year, NFIP is self-supporting. While in most years it pays out less in claims than it receives in premiums, catastrophic losses are well beyond its capacity to pay, and NFIP must rely on borrowing. Consider Figure 1 from their report:
    Figure 1: Difference between NFIP premiums and claims per year. Source: http://www.naic.org/documents/cipr_study_1704_flood_risk.pdf

    Do you see what they say in those two paragraphs after the figure?! First, note that NFIP must have its borrowing authority reauthorized by Congress before Sep. 30 (it has been extended to Dec. 8), and that it is already $25 billion in debt. Second, note that the NFIP has not priced its policies at "market rates," making NFIP unable to cover losses from major catastrophes. Even with these artificially low rates, vulnerable parties do not purchase the insurance!

  • Finally, J. Robert Hunter writes in The Hill that the original NFIP legislation contained long-range planning provisions. Nonetheless, communities are not enforcing the land-use provisions contained in the law:

When I ran the NFIP in the 1970s, I saw a far-sighted idea that Congress put into action. Congress brilliantly embedded long-range planning into the program: in exchange for subsidies for flood insurance on then existing homes and businesses, communities would enact and enforce land use measures to steer construction away from high-risk areas and elevate all structures above the 100-year flood level. Only pre-1970s structures would be subsidized.

Clearly, from the snippets I've placed here for you, NFIP is in trouble. This is the story. How much longer can we afford to ignore the state of NFIP as a major tool for supporting community resilience?

Today, a major milestone in chemicals reform was reached on Capitol Hill. The Senate voted to overhaul the Toxic Substances Control Act (TSCA) of 1976. The original act was written in the '70s, when environmental problems were outrageous and fixes were obvious, the interpretation of risk information was different, the acceptability of testing practices was not questioned, and the process for evaluating chemicals was seen as legitimate.

Needless to say, none of these things are true in 2016.

In fact, overhaul of TSCA was seen as critical as early as the late 1980s. There are tens of thousands of chemicals in commerce, yet the evaluation process is so slow that only a handful are evaluated each year. Yes, you read that correctly. In an age where we now have computational toxicology, structure-activity relationships, and other approaches to predictive toxicology that can streamline the evaluation process and use fewer animals to do it at less expense, we evaluate only a handful of chemicals each year.

Part of this is due to the process involved. In the United States, we protect private property stringently. As a result, we tend to have high barriers to intervention in commercial affairs. One way this plays out in chemicals regulation is that the burden of proof for requiring chemical information collection and evaluation falls on the government, which can extend the timeline and increase the cost of chemical evaluation.

Moreover, we are now aware that chemicals can interact unpredictably in human and environmental systems (read: bodies or organisms). In addition, where in the '70s the general public believed there was such a thing as an exposure level at which no health effects occur, it now acknowledges that there is no exposure level below which health effects are guaranteed not to occur.

The point of all this is that, as complex as chemical evaluation was in the '70s, it is more complex now. At the same time, we have much more powerful tools. We need the impetus to investigate the use and interpretation of these tools. We need to bring TSCA into the 21st century.

There has been a lot of attention given to this issue this week. On Twitter, search the hashtag #FixTSCA or visit www.fixtsca.org to see a broad array of voices on this topic. The New York Times has hailed Congressional action on TSCA. And on the Diane Rehm Show this morning, the dean of the GW Milken Institute School of Public Health, Lynn Goldman, who has been involved in the fight to reform TSCA for over 20 years, helped to explain the significance of this important achievement [podcast here].

When teaching about risk and uncertainty analysis, one of the questions I often have is "How do my students' worldviews influence their conceptualization of risk?" I thought one place to look for an answer is the theological literature on risk. I felt that some of these scholars might have something to say about this problem, and I'd recently come across an edited volume by Niels Henrik Gregersen titled Information and the Nature of Reality, so I figured I might be able to start with him, based on what I'd seen in that volume.

Among the first pieces I read was an article by Gregersen called "Risk and Religion: Toward a Theology of Risk Taking" [Gregersen (2003), Zygon 38(2), pp. 355-376]. I am not quite finished with the article, but it seems that he is suggesting a change to the traditional approach to risk (i.e., Risk = Probability x Consequence) to make explicit that risk is difficult to calculate because it includes both our evaluation of the dangers and the compound events composed of our responses to the external events that can occur (i.e., something like Risk = int(Pr[outcome | our decisions, external event] Pr[external event]) x Evaluation[outcome | our decisions, external event]). His discussion then seems to indicate that, since the risks we face are increasingly related to our decisions and not to the consequences of natural events alone (that is, second-order rather than first-order risks), in the long run a risk-welcoming attitude may be more virtuous than a risk-averse one.
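
To make that contrast a bit more concrete, here is a rough way to write down what I take him to be saying. The notation is mine, not Gregersen's, so read it as an illustration rather than his formulation:

    R_{\text{classical}} = \Pr[\text{event}] \times C[\text{event}]

    R \approx \sum_{e} \Pr[e] \sum_{o} \Pr[o \mid d, e] \, V[o \mid d, e]

Here e ranges over external events, d stands for our decisions, o for the outcomes produced jointly by both, and V for our evaluation of each outcome. The shift is that the outcome distribution and its valuation are conditioned on our own responses, not on nature alone, which is what makes the second-order risks he describes so hard to calculate.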

While reading, I was struck by so many ideas that I wanted to share immediately with close friends who are familiar with my professional interest in risk. I would read a paragraph, then imagine my friend's response to it. Read another, imagine the response. This article had so much that I could easily see coming up in discussions around the dinner table or at the coffee shop. Ultimately, I felt compelled to stop and share this excerpt with you all publicly:

Risk and fate cannot be pitted against one another, because the former always takes place within the framework of the latter. Expressed in theological terms, the world is created by a benevolent God in such a manner that it invites a risk-taking attitude and rewards it in the long term. Risk taking is a nonzero-sum game. The gifts of risk taking are overall greater than the potential damages, and by risking one’s life one does not take anything away from others; the risk taker explores new territories rather than exploiting the domains of the neighbor. (p. 368)

It is possible that you, reading this now, do not share my theistic worldview. Nonetheless, we must remember that risk taking is a fundamental and worthy component of our human enterprise. I spend almost all my time studying, evaluating, and developing methods to reduce or plan for risks we want to avoid. Most of the time, these risks are imposed on others by the decisions of a third party. Sometimes, these risks are framed as questions of reliability in complex systems. But, for me, they are rarely framed as venture.

I think the nature of the risks we often employ professional advice to explore makes us likely to forget this. Risk can, and probably should, only be considered in the context of venture: a great gain deliberately pursued in view of an examined possibility of adverse consequences. Reading this article as part of a larger interest in understanding how worldview frames and addresses risk, I can't help feeling a bit uncomfortable about this statement. I agree with it 100%. Yet I feel we don't accept its implications. I think our public interest in risks associated with complex systems makes this challenging, and I don't have any good answers.

At this year's European Safety and Reliability Association (ESRA) annual meeting, ESREL 2013, Dr. Francis presented a discussion paper co-authored with GW EMSE Ph.D. student Behailu Bekera on the potential for using an entropy-weighted resilience metric for prioritizing risk mitigation investments under deep uncertainty.

The goal of this metric is to build upon tools such as info-gap decision theory, robust decision making, scenario and portfolio analysis, and modeling to generate alternatives in order to develop resilience metrics that account for a couple of ideas. First, we felt that if an event is very extreme, it would be quite difficult to prepare for that event whether or not the event was correctly predicted. Thus, resilience should be "discounted" by the extremeness of the event. Second, we felt that if an event is characterized by an uncertainty distribution obtained through expert judgment, the degree to which the experts disagree should also "discount" the resilience score. In our paper, the goal was to present the entropy-weighted metric we'd developed in our RESS article to the ESREL audience in order to engender some discussion about how to evaluate resilience under these conditions. This work was inspired by a talk by Woody Epstein of Scandpower Japan that Dr. Francis attended at PSAM11/ESREL12 in Helsinki.
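
For readers who want a feel for how such a double "discount" might work, here is a minimal sketch in Python. To be clear, this is an illustration of the general idea only and not the formula from the RESS article; the function names, the [0, 1] scaling, and the multiplicative form are assumptions made for this example.

    import numpy as np

    def normalized_entropy(probs):
        # Shannon entropy of an expert-judged probability distribution,
        # scaled to [0, 1] so that 1 means maximal disagreement (a uniform spread).
        p = np.asarray(probs, dtype=float)
        p = p / p.sum()
        nonzero = p[p > 0]
        return float(-np.sum(nonzero * np.log(nonzero)) / np.log(len(p)))

    def discounted_resilience(base_resilience, event_extremeness, expert_probs):
        # base_resilience   : resilience score in [0, 1] before any discounting
        # event_extremeness : in [0, 1]; 1 is the most extreme event considered
        # expert_probs      : expert-judged probabilities over the event scenarios
        # Both the extremeness of the event and the disagreement among the experts
        # (measured by normalized entropy) pull the score downward.
        extremeness_discount = 1.0 - event_extremeness
        disagreement_discount = 1.0 - normalized_entropy(expert_probs)
        return base_resilience * extremeness_discount * disagreement_discount

    # Example: a fairly resilient system facing a moderately extreme event,
    # with the experts split over which scenario is most likely.
    print(discounted_resilience(0.8, 0.4, [0.5, 0.3, 0.2]))

The multiplicative form simply makes the two points above visible: a mild, well-understood event leaves the base score essentially untouched, while an extreme event or strong disagreement among the experts drives the score toward zero.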

The paper and short slides now appear on the publications page of the SEED research blog.

Recently in a SEED group paper discussion, we re-visited the words of Stan Kaplan in his 1997 "The Words of Risk Analysis."  This is a transcript of a plenary lecture delivered to an Annual Meeting of the Society for Risk Analysis.  SEED Ph.D. student Vikram Rao presents some highlights from this article that I'm sure you'll enjoy as well.

I enjoyed the Kaplan article. I liked how it was written in an informal style, which helped the article flow well and made it easy to understand. It was a good introduction to risk analysis. It asked the three questions needed for a risk assessment: What can happen? How likely is it? What are the consequences? These constitute the risk triplet. The diagrams were helpful, especially the dose-response curve and the evidence-based approach, and the diagrams explaining the decision-making process with QRA were especially useful for getting an overview of the whole process.

We need to recognize that risk assessments are going to be a bigger part of our decision-making processes given the complexity of today's systems. Systems such as aircraft, cars, and microprocessors have so many parts that their complexity is greater than ever before. Mitigating risks is key to having successful complex systems. We need to be able to identify risks and have strategies for overcoming them. We can do this by eliciting expert opinions, running simulations, and increasing our knowledge base.

We also see a rise in risks with consequences on a national and global scale, such as global warming and climate change. By recognizing that effective risk-mitigation strategies are vitally important today, we can prepare ourselves well for the challenges of the future.
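
For readers who want the risk triplet Vikram mentions written out compactly, Kaplan and Garrick's set-of-triplets definition, which the 1997 lecture revisits, is usually written along these lines:

    R = \{ \langle s_i, p_i, x_i \rangle \}, \quad i = 1, 2, \ldots, N

where each s_i is a scenario answering "What can happen?", p_i is the likelihood of that scenario, and x_i is its consequence.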

Today's plenary lunch included two interesting, high-level talks on several different dimensions of public health risk. While MacIntyre focused on bioterrorism, Flahault was more wide-ranging, with a general vision for changes in public health systems.

On my international travels over the last few weeks, I have been fascinated by the diversity of approaches to risk. While you should by no means generalize from my remarks, it seems like there are a couple of camps: some focus on behavior modification and regulation, while others focus on the role of individual agents as key actors in hazard exposure. In addition, engineers approach problems quite differently from basic scientists, and they in turn from social scientists and government agents. Of course there is much overlap. As a result, the foci of the technical presentations can vary quite widely.

I would say that engineers and basic scientists use scenario-based approaches such as PRA, fault trees, and influence diagrams in their studies, while more social-science-inclined professionals focus on the role of institutions in risk management and framing. Although we speak the same language at 30,000 feet, the diversity in the details is truly fascinating.

My pressing question is: How do we get folks involved in this earlier in life? How do we discuss the world of risk in a way that kids and young adults see the drama involved in finding out the dangers and uncertainties germane to modern and global life?

This past week the Annual Meeting of the Society for Risk Analysis took place in Charleston, SC. Dr. Francis and collaborators George Gray, John Carruthers, and Robert Lee presented a paper titled "Preferences related to urban sustainability under risk, uncertainty, and dynamics." The abstract is included here:

Numerous older cities in the US are experiencing a state of decline, due to shrinking populations, economic hardship, and many other factors. Large areas of these cities are comprised of contaminated and vacant land. We explore the decision context around land redevelopment approaches focused upon reducing risk, improving quality of life, and fostering sustainability. Characterizing the preferences and objectives of diverse stakeholders in a multi-attribute framework may improve decisions and planning. However, traditional decision analytic approaches tend to be ‘static’, and do not capture the temporal and spatial dynamics of this problem. We propose a framework that integrates stated and revealed preferences in a dynamic modeling environment designed to capture key attributes of urban sustainability identified by stakeholders. The utility of this model will be demonstrated through an observational experiment. Key attributes and preferences will be elicited from a population of stakeholders in a Web environment. After eliciting these preferences, the participants will then engage in a dynamic modeling exercise in which they are able to interactively explore land use decisions considering the complexities of urban dynamics; the numerous tradeoffs, risks, and uncertainties; the resource constraints; and so on. We call this model DMASE (for Dynamic/Multi-Attribute/Spatially-Explicit). Preferences over the key attributes will then be elicited again. We hypothesize that the key attributes and preferences will change appreciably based upon interaction with the DMASE model. Additionally, the model can be modified in an iterative fashion to capture the decision context and preferences of the participants in a more meaningful way. This work will lead to a decision support tool that will allow stakeholders and decision-makers in declining cities to make more informed decisions about changes in the complex urban environment.