This week, I wanted to put together some thoughts on the National Flood Insurance Program (NFIP) gathered from across the web. Actually, this is the only story I've wanted to write about, but I haven't sat down for my #TWIST note in about a month.

In fact, when we're talking about infrastructure and infrastructure resilience, I'm not sure there has been any story worth looking into other than hurricane impacts on communities and infrastructure. Hurricanes Harvey and Irma are crucial because, if nothing else, they demonstrate that community resilience has everything to do with communities' and local government leaders' capacity to respond in the moment and adapt to future possibilities. While it is important to build infrastructure that can withstand a variety of challenges, there are two things we must consider when planning for resilience. First, infrastructure is almost impossible to adapt after it is built; one can harden existing infrastructure, but infrastructure is, almost by definition, un-adaptive. Second, the radically decentralized way in which infrastructure is owned and built--especially in the United States (and I include buildings in infrastructure, which many researchers do not)--makes it nearly impossible to forecast the types of loads individual systems will be called on to carry.

Well, I don't want to go too far in that direction. However, I did want to share some stories about the NFIP, because we are going to need to call on this program more often, and more heavily, in the future. What are the major issues? Is the program vulnerable? Are the folks who rely on it vulnerable? What kinds of losses will it be called on to insure in the future? Hopefully, a few of the articles and resources below can shed some light on the state of the NFIP as we enter new climate realities.

  • Irma, Harvey, Jose, Katia: The Costliest Year Ever? Bloomberg asks whether Harvey and its siblings will be among the costliest disasters ever. A snapshot from the article demonstrates that, globally, American hurricanes are responsible for five of the top 10 most costly events--in terms of insured losses. Where will this year's hurricane season rank?

    The 10 most costly global disasters in terms of insured losses (in billions). Source: https://www.bloomberg.com/graphics/2017-costliest-insured-losses/
  • Hurricane Sandy Victims: Here’s What ‘Aid’ Irma and Harvey Homeowners Should Expect. While it is critical to re-authorize NFIP and help ensure that families receive the aid they need, it is unclear whether NFIP in its current form can deliver that assistance. Writing in Fortune about the efforts of a group called Stop FEMA Now to raise awareness of what it sees as the major shortcomings of NFIP, Kirsten Korosec writes:

Stop FEMA Now is a non-profit organization that launched after flood insurance premiums spiked as a result of the Biggert-Waters Act of 2012, inaccurate or incomplete FEMA flood maps, and what it describes as questionable insurance risk and premium calculations by actuaries, according to the group.

  • The NAIC has published a very interesting report showing that, in the average year, NFIP is self-supporting: in most years it pays out less in claims than it collects in premiums. Catastrophes, however, are well beyond its capacity to pay, and NFIP must rely on borrowing. Consider Figure 1 from the report:
    Difference between NFIP premiums and claims per year. Source: http://www.naic.org/documents/cipr_study_1704_flood_risk.pdf

    Do you see what they say in the two paragraphs after the figure?! First, note that NFIP must have its borrowing authority reauthorized by Congress before Sep. 30 (a deadline since extended to Dec. 8), and that the program is already $25 billion in debt. Second, note that NFIP has not priced its policies at "market rates," leaving it unable to cover losses from major catastrophes. Even at these artificially low rates, vulnerable parties do not purchase the insurance! (A toy numerical sketch of this average-year-surplus, catastrophe-year-deficit dynamic appears just after this list.)

  • Finally, J. Robert Hunter, writing in The Hill, notes that long-range planning was built into the original NFIP legislation. Nonetheless, communities are not enforcing the land-use provisions contained in the law:

When I ran the NFIP in the 1970s, I saw a far-sighted idea that Congress put into action. Congress brilliantly embedded long-range planning into the program: in exchange for subsidies for flood insurance on then existing homes and businesses, communities would enact and enforce land use measures to steer construction away from high-risk areas and elevate all structures above the 100-year flood level. Only pre-1970s structures would be subsidized.
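
To make the dynamic behind the NAIC figure concrete, here is a toy back-of-the-envelope sketch. The dollar figures are entirely hypothetical and are not NFIP data; the point is only that a program running a small surplus in a typical year can still accumulate a large debt after a single catastrophe year.

```python
# Toy illustration with hypothetical numbers (not actual NFIP figures).
# A program that collects a bit more in premiums than it pays in claims
# in a typical year can still end up deeply in debt after one catastrophe.

typical_premiums = 3.5       # $ billions collected per year (hypothetical)
typical_claims = 3.0         # $ billions paid out in a typical year (hypothetical)
catastrophe_claims = 18.0    # $ billions paid out in a catastrophe year (hypothetical)

balance = 0.0
for year in range(1, 11):
    claims = catastrophe_claims if year == 6 else typical_claims
    net = typical_premiums - claims
    balance += net
    print(f"Year {year:2d}: net {net:+6.1f}B, cumulative balance {balance:+6.1f}B")

# Nine "average" years of modest surpluses never come close to covering the
# deficit from the one catastrophe year, which is why the program leans on
# borrowing authority rather than premium income after major storms.
```

Under these assumptions, the cumulative balance ends the decade around -$10 billion despite a surplus in nine of ten years; price premiums below actuarial rates, as the NAIC report describes, and the hole only deepens.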

Clearly, from the snippets I've placed here for you, NFIP is in trouble. This is the story. How much longer can we afford to ignore the state of NFIP as a major tool for supporting community resilience?

This week: Ukraine as Russia's test lab for cyberwar, conversation theory powering educational innovation through MOOCs, P3s for transportation investment, and revised economic benefits of the WOTUS rule:

  • How a Country Became Russia's Test Lab for Cyber War in Wired Magazine. This article presents a very interesting account of how Russia has been perfecting its techniques and tactics by wreaking cyber-havoc on Ukraine's infrastructure systems. Are we ready to protect ourselves? Many are not so sure. The same week, ASCE's infrastructure SmartBrief referenced a Black Hat survey showing that experts are somewhat concerned about our vulnerability to these attacks. Consider this excerpt from the article, describing a Ukrainian cyber-security researcher's first-hand view of one of these attacks: "Noting the precise time and the date, almost exactly a year since the December 2015 grid attack, Yasinsky felt sure that this was no normal blackout... For the past 14 months, Yasinsky had found himself at the center of an enveloping crisis. A growing roster of Ukrainian companies and government agencies had come to him to analyze a plague of cyberattacks that were hitting them in rapid, remorseless succession. A single group of hackers seemed to be behind all of it. Now he couldn’t suppress the sense that those same phantoms, whose fingerprints he had traced for more than a year, had reached back, out through the internet’s ether, into his home."
  • Conversation Powers Learning at Massive Scale in IEEE Spectrum. Massive open online courses (MOOCs) have been challenging traditional education infrastructures in unique ways. But are they pedagogically sound? Do they work with the ways people learn? Are they effective in light of what we know about human development and human learning? MOOCs and other online delivery modes form the backbone of emerging "personalized learning" platforms, and promise the potential of tailoring every aspect of the student experience to individual paces and abilities. The authors of this article acknowledge that while personalized learning holds great promise (as indicated by substantial investments in R&D), "It can be done, with difficulty, for well-structured and well-established topics, such as algebra and computer programming. But it really can’t be done for subjects that don’t form neat chunks, such as economics or psychology, nor for still-evolving areas, such as cybersecurity." IEEE Spectrum describes how FutureLearn has based its work on Pask's conversation theory, which views human learning as fundamentally discursive. There are hints of similar thinking in Paulo Freire's Pedagogy of the Oppressed and Etienne Wenger's communities of practice. FutureLearn's experience may provide some insight into developing MOOCs that are not just, as the authors say, "lectures at a distance."
  • Gas Tax Hikes vs. Public Private Partnerships (P3s) in Forbes. One of my pet peeves is listening to Americans complain about our infrastructure systems while ultimately rejecting any scheme required to pay for them. This is true with respect to health care, education, science/R&D, and of course, lifelines like electric power, water, sanitation, and communications. One way we have tried to get around this political ambivalence as a nation is by considering private involvement in lifeline public infrastructures. One of the classic venues of this debate has been how to pay for transportation infrastructure upgrades: tax increases, tolls, or privatization. While taxes have been the traditional approach, this article provides some food for thought: "There is--public private partnerships for roads, which often take the form of toll roads administered by private companies. Private sector investment in infrastructure is desirable because it takes taxpayers off the hook for construction, operation and maintenance of transportation assets and ensures that those who don’t use them aren’t paying for them."
  • Trump Slashes WOTUS Economic Benefits in E&E News. Ariel Wittenberg reports that "U.S. EPA and the Army Corps of Engineers are disputing their own economic analysis of the 2015 Clean Water Rule, now saying most benefits they previously ascribed to the Obama-era regulation can no longer be quantified." The Waters of the United States rule was intended to clarify which waters fall under federal jurisdiction under the Clean Water Act. The Obama administration promulgated the rule in 2015, but it has been the subject of controversy and confusion, to say the very least. As the EPA currently acknowledges, "This is important because all Clean Water Act programs—including tribal and state certification programs, pollution permits, and oil spill prevention and planning programs—apply only to 'Waters of the United States.'" The 2015 rule is almost certainly going to change: it has not yet been implemented due to court stays, and the Trump administration has directed the EPA and the Army Corps "to review the existing Clean Water Rule for consistency with [the priorities of economic growth, minimizing regulatory uncertainty, and showing due regard for the Constitutional roles of the States and Congress] and publish for notice and comment a proposed rule rescinding or revising the rule, as appropriate and consistent with the law. Further, the Order directs the agencies to consider interpreting the term 'navigable waters,' as defined in 33 U.S.C. 1362(7), in a manner consistent with the opinion of Justice Antonin Scalia in Rapanos v. United States, 547 U.S. 715 (2006)."

Today, a major milestone in chemicals reform was reached on Capitol Hill. The Senate voted to overhaul the Toxic Substances Control Act (TSCA) of 1976. The original act was written in the '70s, when environmental problems were glaring and fixes were obvious, when risk information was interpreted differently, when the acceptability of testing practices went unquestioned, and when the process for evaluating chemicals was seen as legitimate.

Needless to say, none of these things are true in 2016.

In fact, an overhaul of TSCA was seen as critical as early as the late 1980s. There are tens of thousands of chemicals in commerce, yet the evaluation process is so slow that only a handful are evaluated each year. Yes, you read that correctly. In an age when we have computational toxicology, structure-activity relationships, and other predictive-toxicology approaches that can streamline the evaluation process, use fewer animals, and cost less, we evaluate only a handful of chemicals each year.

Part of this is due to the process involved. In the United States, we protect private property stringently, and as a result we tend to have high barriers to intervention in commercial affairs. In chemicals regulation, one way this plays out is by placing the burden of proof for the need to collect and evaluate chemical information on the government, which can extend timelines and increase the cost of chemical evaluation.

Moreover, we are now aware that chemicals can interact unpredictably in human and environmental systems (read: bodies or organisms). In addition, where in the '70s the general public believed there was an exposure level below which no health effects would occur, the public now acknowledges that no exposure level guarantees zero health effects.

The point of all this is that, as complex as chemical evaluation was in the '70s, it is more complex now. At the same time, we have much more powerful tools. We need the impetus to investigate the use and interpretation of those tools. We need to bring TSCA into the 21st century.

There has been a lot of attention given to this issue this week. On Twitter, search the hashtag #FixTSCA or visit www.fixtsca.org to see a broad array of voices on the topic. The New York Times has hailed Congressional action on TSCA. And on the Diane Rehm Show this morning, the dean of the GW Milken Institute School of Public Health, Lynn Goldman, who has been involved in the fight to reform TSCA for over 20 years, helped explain the significance of this important achievement [podcast here].