At this year's European Safety and Reliability Association (ESRA) annual meeting, ESREL 2013, Dr. Francis presented a discussion paper co-authored with GW EMSE Ph.D. student Behailu Bekera on the potential for using an entropy-weighted resilience metric for prioritizing risk mitigation investments under deep uncertainty.

The goal of this metric is to build upon tools such as info-gap decision theory, robust decision making, scenario and portfolio analysis, and modeling to generate alternatives in order to develop resilience metrics that account for two ideas.  First, if an event is very extreme, it would be quite difficult to prepare for it whether or not the event was correctly predicted; resilience should therefore be "discounted" by the extremeness of the event.  Second, if an event is characterized by an uncertainty distribution obtained through expert judgment, the degree to which the experts disagree should also "discount" the resilience score.  The goal of the paper was to present the entropy-weighted metric we developed in our RESS article to the ESREL audience in order to engender discussion about how to evaluate resilience under these conditions.  This work was inspired by a talk by Woody Epstein of Scandpower Japan that Dr. Francis attended at PSAM11/ESREL12 in Helsinki.
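For readers curious what such a discounting might look like numerically, here is a minimal sketch in Python. It is not the formulation from the RESS article; the function names, the product form of the discount, and the example numbers are all illustrative assumptions.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # ignore zero-probability bins
    return -np.sum(p * np.log(p))

def discounted_resilience(base_resilience, expert_probs, extremeness):
    """Illustrative entropy-weighted resilience score (not the RESS formulation).

    base_resilience : system resilience in [0, 1] for the event, before discounting
    expert_probs    : expert-elicited probability distribution over event scenarios
    extremeness     : severity of the event scaled to [0, 1] (1 = most extreme)

    Both the entropy of the expert distribution (disagreement) and the
    extremeness of the event reduce the credited resilience.
    """
    h = shannon_entropy(expert_probs)
    h_max = np.log(len(expert_probs))          # entropy of a uniform distribution
    disagreement = h / h_max if h_max > 0 else 0.0
    return base_resilience * (1.0 - disagreement) * (1.0 - extremeness)

# Experts largely agree (low entropy), moderately extreme event
print(discounted_resilience(0.9, [0.7, 0.2, 0.1], extremeness=0.3))
# Experts split evenly (maximum entropy) -> resilience fully discounted
print(discounted_resilience(0.9, [1/3, 1/3, 1/3], extremeness=0.3))
```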

The paper and short slides now appear on the publications page of the SEED research blog.

In view of recent events concerning Edward Snowden and some of my more recent thinking about chemicals policy in the United States, I've been trying to understand how we've reached such an uncomfortable situation in protecting Americans' Constitutional rights to privacy.  While I don't know much about privacy law, the Patriot Act, or the Constitution, I do think some ideas in chemicals regulation can help explain what's going on with the privacy debate.

I have been reading and chewing on some passages in Brickman, Jasanoff, and Ilgen's Controlling Chemicals: The Politics of Regulation in Europe and the United States, and the differences between European approaches and American approaches to chemical regulation are so striking I couldn't help but speculate whether those factors contribute to the privacy challenges we are now facing.

Since the book was written in 1985, take these thoughts with a grain of salt. However, the observation by Brickman et al. that got me thinking goes something like this: because of America's laissez-faire approach to business, unusually broad access to legal and administrative proceedings, and commitment to protecting individuals' rights, American chemicals policy has a complexity and cost unmatched in other industrialized democracies. (I try to keep these posts short, so I can't break that down further.)

The element of that thought that makes Snowden, Manning, and other transparency controversies so ironic is the commitment we make to transparency and to the protection of individual interests. Perhaps the reason we are so upset and appalled by the violations of our Constitution (or at least by NSA staff admitting they lied to Congress) is that we are so used to an openness and privileged access to our government afforded to few of the world's other citizens. Just a thought.

So, environmental policy wonks, don't get so frustrated in the face of gridlock. Apparently, that's one of the many costs of "freedom". 😉

 

I have been away from writing on the blog, even about my personal opinions on current research topics (OK, that's what almost all of this writing is), due to travel, deadlines, and other obligations.  I do want to take the opportunity to announce that a new paper from the SEED research group, co-authored by Dr. Francis and Behailu Bekera, has just been accepted for publication in the journal Reliability Engineering and System Safety.  I am very excited about this, because I enjoy reading articles from this journal and have found this research community engaging and interesting in person as well as on paper.  I'll write a more "reflective" entry about this sometime later, but if you'd like to take a look at the paper, please find it here.  We will be presenting an earlier version of this work as a thought piece at ESREL 2013.  More on the conference paper closer to the date of the conference in October.

As spring break approaches, even in the wake of the 5-10" of snow expected in the Baltimore/DC region, I am feeling that a slightly more personal message is warranted.  I have been thinking about the many ways my students are being assailed on all sides by midterms, papers, presentations, and projects, and what this all means for their learning and long-term personal development.  [That is what education is about, right?]  I feel the key to this is storytelling.  All of us have learned our most important lessons through stories, but then we come to university and the stories stop.  Some of this is appropriate, I guess, as part of the point of university is to figure out how to tell your own story.  But when we try to teach fundamental truths about the way things work, these ideas should be framed in stories.  So often, we fail to do so.  I'm not sure what the penalty will be for us as a society, but I have some hint of why this alienates so many trainees.  Consider this excerpt from my personal blog this week:

Consider the nonfiction you have read recently. Very likely, the author was appealing to your reason with facts you could objectively verify. Although your interpretation was free for you to shape, you were probably looking at things as an outsider or a judge. Now, think about a fiction book you have read. Although the author may or may not have been doing the same thing (appealing to your reason), you were probably much more likely to see yourself as a character in the story. At the very least, you could empathize with the characters and take on their perspectives as they developed. As a result, what happened in the story feels as if it happened to you as well.

Perhaps this also happens to some extent in biography. But the point I’m trying to make is that the fiction method of teaching, if you will, is much more effective because fiction is processed by the heart first, while nonfiction is processed by the mind. Thus, you will have forgotten the story well before the lesson stops working in your soul. Remembering important truths communicated as stories is much simpler because you can remember the feeling, whereas facts require you to master the prose.

And this is the challenge in teaching or studying engineering, mathematics, and science.  Most people who engage with these ideas as beginners can't find themselves in the story.  They don't view the equations they're memorizing or struggling with as the conversation that they are.

Nothing is working in their soul.

And the problems they are asked to solve aren't compelling because those problems don't affect them.  In fiction, everything that happens to the characters happens to you; in engineering training, what happens at best happens to an object, and at worst to some abstract variable appearing in some equation the student didn't create.

So, why am I writing this? Why am I bringing up fiction on my research/teaching page?  Because we need to find out how to make everything in the university a story when students and researchers first interact with it.  Our trainees will "remember" their instruction, because they can "remember the feeling." And since everything that happens will have happened to them, they will be in a much better place to move the conversation forward.

this post is about measuring ignorance.

yes, that sounds weird, but in systems engineering it's a very big deal now.  as our complex engineered systems progress in their reliability, we are trying as hard as possible to make sure that we can anticipate all the ways they can possibly fail.  essentially, every day we try to imagine the most outlandish environmental/geopolitical/economic/technical disasters that could happen, put them all together at the same time, and determine whether our systems can withstand that.

naturally, one expects our military to do this as a matter of course: but your humble civil engineer? not sure that's crossed too many folks' minds.  at least not until Fukushima, Deepwater Horizon, and now Hurricane Sandy occurred. now, everyone is wondering how in the world we allowed our systems to perform so poorly under those circumstances.

the problem is not necessarily the safety or reliability of the systems: how often do you and i in the US plan our day around the scheduled load shedding at just about coffee hour? or purchase bottled water to shower in because the water's too dirty to touch? even on the DC Metro or MARC trains, the [relatively] frequent delays are not an important consideration in my daily planning.

the problem is generally that we couldn't possibly anticipate the things that cause our systems to deviate from intended performance.  and short of prophetic revelation, there's not a good way to do that.

there are, however, cool ways to explore possibilities at the edge of the realm of possibility.

i have in mind things like robust decision making, info-gap analysis, portfolio evaluation, and modeling to generate alternatives.  some of these tools are older than i am (modeling to generate alternatives) but are only recently finding wider application via the explosion of bio-inspired computing (e.g., genetic algorithms, particle swarm optimization, ant colony optimization, genetic programming, etc.), while others are becoming established tools in the lexicon of risk and systems analysis even as we speak.

for example, info-gap analysis avoids restricting us to decision contexts in which externally valid models can be identified to predict the future.  instead, info-gap computes the robustness and opportuneness of a strategy in light of one's wildest dreams [or nightmares] about the future.  in this way, one can be surprised not only by how bad things might turn out, but also by how good they might be.
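as a rough numerical illustration of the robustness half of that idea, here is a small sketch; the toy reward model, the capacity decision, and the critical reward level are invented for the example, not drawn from any particular info-gap study.

```python
import numpy as np

def worst_case_reward(decision, u_nominal, alpha):
    """Worst-case reward over the uncertainty set |u - u_nominal| <= alpha.

    Toy model: reward is a fixed benefit minus the cost of installed capacity
    (the decision) minus a penalty when uncertain demand u exceeds capacity.
    """
    u_values = np.linspace(u_nominal - alpha, u_nominal + alpha, 201)
    rewards = 10.0 - 0.5 * decision - 5.0 * np.maximum(u_values - decision, 0.0)
    return rewards.min()

def info_gap_robustness(decision, u_nominal, r_critical, alpha_grid):
    """Largest uncertainty horizon alpha whose worst case still meets r_critical."""
    feasible = [a for a in alpha_grid
                if worst_case_reward(decision, u_nominal, a) >= r_critical]
    return max(feasible) if feasible else 0.0

# larger capacity sacrifices nominal reward but survives wilder futures
alphas = np.linspace(0.0, 5.0, 101)
for capacity in (2.0, 3.0, 4.0):
    rob = info_gap_robustness(capacity, u_nominal=2.0, r_critical=6.0, alpha_grid=alphas)
    print(f"capacity {capacity}: robustness = {rob:.2f}")
```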

i personally am more partial to robust decision making, as it uses mathematics and terminology i am a bit more familiar with.  robust decision making enables us to evaluate circumstances in which either we can agree on an externally valid model of the future or we would like to discuss a range of competing interpretations and assumptions. one starts with a set of alternatives and iterates through the futures that the strategies under consideration might suggest.  after a set of futures has been generated, the regions of the future space in which each strategy is vulnerable are identified.  portfolio evaluation shares many similarities with robust decision making, in that the stakeholders project differing interpretations that may imply competing priorities for the decision context.
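a compressed sketch of that loop might look like the following; the two uncertain factors, the payoff model, and the regret threshold are all toy assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# hypothetical futures: each row is (demand growth, unit cost), both uncertain
futures = rng.uniform(low=[0.0, 1.0], high=[3.0, 4.0], size=(1000, 2))

def payoff(strategy_capacity, future):
    demand_growth, unit_cost = future
    shortfall = max(demand_growth - strategy_capacity, 0.0)
    return -unit_cost * strategy_capacity - 8.0 * shortfall   # build cost + shortfall penalty

strategies = {"small": 1.0, "medium": 2.0, "large": 3.0}

# regret of a strategy in a future = best achievable payoff there minus its payoff
payoffs = {name: np.array([payoff(c, f) for f in futures]) for name, c in strategies.items()}
best = np.max(np.column_stack(list(payoffs.values())), axis=1)
regret = {name: best - p for name, p in payoffs.items()}

# flag the futures in which each strategy is vulnerable (regret above a threshold)
for name in strategies:
    vulnerable = futures[regret[name] > 2.0]
    print(f"{name}: max regret {regret[name].max():.1f}, "
          f"vulnerable in {len(vulnerable)} of {len(futures)} futures")
```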

while practitioners of all of these techniques generally agree that one of the major weaknesses of existing risk and decision theory is its reliance on probability models to represent possible futures, i can't give up my addiction to Bayesian statistics.  at least when i decide i need rehab, there seems to be a sound selection of medications to choose from.

[this post also appears on my personal blog... Happy New Year! ]

Timer's on...

I enjoy teaching. It is the most enjoyable part of my job as a professor.

I also hate teaching. It is the most difficult part of my job as a professor.

I love teaching because it is a conversation. I have the opportunity to learn from experienced professionals how the theory developed in my field is being applied in practice. I have a chance to understand how people learn in a practical way. I have the opportunity to connect with young minds and other young people seeking guidance for implementing their ideas.

I hate teaching because I have to assign grades. These grades are often a barrier to the conversation I love to have. These grades often reward behavior that can be counterproductive to discovery. I hate teaching because I don't always understand my students' motives. Exercising authority in this context can lead to adversarial relationships that engender unhealthy emotions and stress.

Teaching is a challenge that is always on my mind. It is an arena I always look forward to stepping into. And, it is a crucible that brings up the dross to be removed, and sharpens insights I didn't know I had. Teaching is my real teacher.

In this new year, let us not just be thankful for our teachers, but let us also be thankful for opportunities to teach.

Godspeed and blessing in 2013.

In this post, Behailu Bekera, a 2nd-year PhD student in the SEED group discusses the role of robust decision making under deep uncertainties.  This post was inspired by a reading of Louis Anthony Cox's "Confronting Deep Uncertainties in Risk Analysis" in a recent issue of Risk Analysis.

There is no single good model that can be used to assess deep uncertainties, so our decisions about complex systems are typically made with insufficient knowledge about the situation. Deep uncertainties are characterized by a multiplicity of possible events and an unknown future; in such contexts, we cannot precisely anticipate undesired events or complete the necessary preparations in advance. In this article, Cox highlights ideas from robust optimization, adaptive control, and machine learning that seem promising for dealing with deep uncertainties in risk analysis.

Using multiple models and relevant data to improve decisions, averaging forecasts, resampling methods that allow robust statistical inferences despite model uncertainty, adaptive sampling and modeling, and Bayesian model averaging for statistical estimation are some of the tools that can assist in robust risk analysis involving deep uncertainties.
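As a toy illustration of the model-averaging idea, the sketch below fits several candidate models and combines their forecasts using BIC-based weights as a rough stand-in for posterior model probabilities. The data and candidate models are invented for the example and are not from Cox's paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: the true relationship is quadratic, but the model is uncertain.
x = np.linspace(0, 4, 60)
y = 1.0 + 0.5 * x + 0.3 * x**2 + rng.normal(scale=0.5, size=x.size)

def fit_polynomial(x, y, degree):
    """Least-squares polynomial fit; returns predictions and an approximate BIC."""
    coeffs = np.polyfit(x, y, degree)
    pred = np.polyval(coeffs, x)
    n, k = y.size, degree + 1
    rss = np.sum((y - pred) ** 2)
    bic = n * np.log(rss / n) + k * np.log(n)   # Gaussian-error BIC up to a constant
    return pred, bic

preds, bics = zip(*(fit_polynomial(x, y, d) for d in (1, 2, 3)))
bics = np.array(bics)

# Approximate posterior model probabilities from BIC differences.
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()

# Model-averaged forecast rather than a single "best" model's forecast.
averaged_prediction = sum(w * p for w, p in zip(weights, preds))
print("model weights (degree 1, 2, 3):", np.round(weights, 3))
print("averaged prediction at x = 4:", round(float(averaged_prediction[-1]), 2))
```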

Robust risk analysis techniques shift the focus of risk analysis from passive questions, such as identifying likely events (disruptions) and their associated [estimated] consequences, to more action-oriented questions: how should we act now to effectively mitigate the occurrence or consequences of events with highly undesirable effects in the future?

Robust decision making, for instance, is used by developing countries to identify potential large-scale infrastructure development projects and investigate possible vulnerabilities that require the careful attention of all stakeholders.  Additionally, adaptive risk management may be used to maintain a reliable network structure and ensure service continuity despite failures. These sorts of techniques can be considerably important in the areas of critical infrastructure protection and resilience.

Through these emerging methods, Dr. Cox makes important suggestions for making robust decisions in the face of deep uncertainty, despite our incomplete or inadequate knowledge.  This will be an important paper for those looking to advance the science of robust decision making and risk analysis.

Recently in a SEED group paper discussion, we revisited the words of Stan Kaplan in his 1997 "The Words of Risk Analysis," a transcript of a plenary lecture delivered at an Annual Meeting of the Society for Risk Analysis.  SEED Ph.D. student Vikram Rao presents some highlights from this article that I'm sure you'll enjoy as well.

I enjoyed the Kaplan article. I liked how it was written in an informal style, which helped the article flow well and made it easy to understand. It was a good introduction to risk analysis. It posed the three questions needed for a risk assessment – What can happen?, How likely is it?, and What are the consequences? – which constitute the risk triplet. The diagrams were helpful, especially the dose-response curve and the evidence-based approach, and the diagrams explaining the decision-making process with QRA were especially useful for getting an overview of the whole process.

We need to recognize that risk assessments are going to be a bigger part of our decision-making process, considering the complexity of today's systems. Systems such as aircraft, cars, and microprocessors have so many parts that their complexity is greater than ever before. Mitigating risks is key to having successful complex systems. We need to be able to identify risks and have strategies for overcoming them. We can do this by eliciting expert opinions, doing simulations, and increasing our knowledge base.

We also see the rise of risks with consequences on a national and global scale, such as global warming and climate change. By recognizing that effective risk mitigation strategies are vastly important today, we can prepare ourselves well for the challenges of the future.

Today's plenary lunch included two interesting, high-level talks on several different dimensions of public health risk. While MacIntyre focused on bioterrorism, Flahault was more wide-ranging, with a general vision for changes in public health systems.

On my international travels over the last few weeks, I have been fascinated by the diversity of approaches to risk. While you should by no means generalize from my remarks, it seems there are a couple of camps: some focus on behavior modification and regulation, while others focus on the role of individual agents in hazard exposure. In addition, engineers approach problems quite differently from basic scientists, who in turn differ from social scientists and government officials. Of course there is much overlap. As a result, the foci of the technical presentations can vary quite widely.

I would say that engineers and basic scientists use scenario-based approaches such as PRA, fault trees, and influence diagrams in their studies, while more social-science-inclined professionals focus on the role of institutions in risk management and framing. Although we speak the same language at 30,000 feet, the diversity in the details is truly fascinating.

My pressing question is: how do we get folks involved in this earlier in life? How do we discuss the world of risk in a way that lets kids and young adults see the drama involved in discovering the dangers and uncertainties germane to modern, global life?

Today, Dr. Francis is giving a talk titled "Two Studies in Using Graphical Models for Infrastructure Risk Models," discussing some recent peer-reviewed conference papers given at ICVRAM and PSAM11/ESREL12.  The abstract for today's talk is:

In this talk, I will discuss the use of Bayesian Belief Networks (BBNs) and Classification and Regression Trees (CART) for infrastructure risk modeling.  In the first case study, we focus on supporting risk models used to quantify economic risk due to damage to building stock attributable to hurricanes. The increasingly complex interaction between natural hazards and human activities requires more accurate data to describe the regional exposure to potential loss from physical damage to buildings and infrastructure. While databases contain information on the distribution and features of the building stock, infrastructure, transportation, etc., it is not unusual that portions of the information are missing from the available databases. Missing or low quality data compromise the validity of regional loss projections. Consequently, this paper uses Bayesian Belief Networks and Classification and Regression Trees to populate the missing information inside a database based on the structure of the available data. In the second case study, we use Bayesian Belief Networks (BBNs) to construct a knowledge model for pipe breaks in a water zone.  BBN modeling is a critical step towards real-time distribution system management.  Development of expert systems for analyzing real-time data is not only important for pipe break prediction, but is also a first step in preventing water loss and water quality deterioration through the application of machine learning techniques to facilitate real-time distribution system monitoring and management.  Our model is based on pipe breaks and covariate data from a mid-Atlantic United States (U.S.) drinking water distribution system network. The expert model is learned using a conditional independence test method, a score-based method, and a hybrid method, then subjected to 10-fold cross validation based on log-likelihood scores.

This talk is hosted by Ketra Schmitt of the Center for Engineering in Society at the Faculty of Engineering and Computer Science.
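To give a flavor of the first case study's imputation idea, here is a minimal sketch of using CART to fill in a missing attribute in a building-stock table. The table, column names, and model settings are hypothetical illustrations, not the data or models from the papers.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Hypothetical building-stock table; 'roof_type' is partially missing.
df = pd.DataFrame({
    "year_built":  [1950, 1962, 1978, 1985, 1992, 2001, 2008, 2015],
    "num_stories": [1, 2, 1, 2, 3, 2, 4, 3],
    "roof_type":   ["gable", "gable", "hip", None, "hip", None, "flat", "flat"],
})

features = ["year_built", "num_stories"]
known = df[df["roof_type"].notna()]
missing = df[df["roof_type"].isna()]

# Train a CART classifier on the complete records...
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(known[features], known["roof_type"])

# ...and use it to fill in the missing attribute for the incomplete records.
df.loc[missing.index, "roof_type"] = tree.predict(missing[features])
print(df)
```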