The near meltdown of the US financial system this week came as a surprise to most of us -- experts, legislators, and citizens alike. That isn't to say that the components of the disaster were unknown: the subprime crisis, the earlier financial undoings of Bear Stearns and Fannie Mae this year, and the sudden collapse of Lehman Brothers last week. But what has come as a surprise is the severity of the warnings from the Federal Reserve and the Treasury that the entire financial system is only a few steps from seizure and collapse. This is a catastrophic system failure -- one that few of us would have thought possible six months ago.
Think of a few other surprises of the past thirty years -- the collapse of the Soviet Union, the Iranian Revolution, or the emergence of China as a roaring engine of market-based growth. In each case the event was a discontinuous break from the trajectory of the past, and it surprised experts and citizens alike. (The photo above captures one such surprise: Yeltsin standing on a tank in 1991.)
So what is a surprise? It is an event that shouldn't have happened, given our best understanding of how things work. It is an event that deviates widely from our most informed expectations, given our best beliefs about the causal environment in which it takes place. A surprise, in short, is a mismatch between our expectations about the world's behavior and the events that actually take place. Many of our expectations are based on the idea of continuity: tomorrow will be pretty similar to today; a delta change in the background will create at most an epsilon change in the outcome. A surprise is a circumstance that appears to represent a discontinuity in a historical series.
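To make that continuity intuition slightly more explicit, here is the standard epsilon-delta condition the sentence above gestures at, written as a rough sketch. The function f and the variables are illustrative notation of my own -- treating background conditions as an input and the outcome as an output -- not anything formalized in the post.

```latex
% Illustrative notation only: f maps background conditions x to outcomes f(x).
% "No surprises" corresponds to continuity at the current state x_0: for any
% tolerance \epsilon on the outcome, a small enough change \delta in the
% background keeps the outcome within that tolerance.
\[
  \forall \epsilon > 0 \;\; \exists \delta > 0 \;\text{ such that }\;
  |x - x_0| < \delta \;\Longrightarrow\; |f(x) - f(x_0)| < \epsilon .
\]
% A surprise, on this reading, is a point where the condition fails: an
% arbitrarily small change in background conditions produces a large jump
% in the outcome, i.e., a discontinuity.
```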
It would be a major surprise if the sun suddenly stopped shining, because we understand the physics of fusion that sustains the sun's energy production. It would be a major surprise to discover a population of animals in which acquired traits are passed across generations, given our understanding of the mechanisms of evolution. And it would be a major surprise if a presidential election were decided by a unanimous vote for one candidate, given our understanding of how the voting process works. The natural world doesn't present us with a large number of surprises; but history and social life are full of them.
The occurrence of major surprises in history and social life is an important reminder that our understanding of the complex processes that are underway in the social world is radically incomplete and inexact. We cannot fully anticipate the behavior of the subsystems that we study -- financial systems, political regimes, ensembles of collective behavior -- and we especially cannot fully anticipate the interactions that arise when processes and systems intersect. Often we cannot even offer reliable approximations of what the effects are likely to be of a given intervention. This has a major implication: we need to be very modest in the predictions we make about the social world, and we need to be cautious about the efforts at social engineering that we engage in. The likelihood of unforeseen and uncalculated consequences is great.
And in fact commentators are now raising exactly these concerns about the $700 billion rescue plan currently being designed by the Bush administration to save the financial system. "Will it work?" is the headline; "What unforeseen consequences will it produce?" is the subtext; and "Who will benefit?" is the natural follow-up question.
It is difficult to reconcile this caution about the limits of our rational expectations about the future, grounded as they are in social science knowledge, with the need for action and policy change in times of crisis. If we cannot rely on our expectations about what effects an intervention is likely to have, then we cannot have confidence in the actions and policies that we choose. And yet we must act; if war is looming, if famine is breaking out, if the banking system is teetering, a government needs to adopt policies that are well designed to minimize the bad consequences. It is necessary to make decisions about action on the basis of incomplete information and insufficient theory. So it is a major challenge for the theory of public policy to incorporate the limits of our knowledge about consequences into the design of the policy process itself. One approach is to design for "soft landings" -- strategies that are likely to do the least harm if they function differently than expected. Another is to emulate a strategy that safety engineers employ when designing complex, dangerous systems: to de-link the subsystems to the extent possible, in order to minimize the likelihood of unforeseeable interactions. (Nancy Leveson describes some of these strategies in Safeware: System Safety and Computers.) And there are probably other heuristics that could be imagined as well.
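For concreteness, here is a loose sketch in Python of the de-linking idea from safety engineering, under assumptions of my own: the subsystem names, the Message type, and the queue-based boundary are hypothetical illustrations, not anything described in Safeware or in the policy discussion above. The point is only that subsystems which share no internal state and communicate through a narrow, validated channel offer fewer pathways for unforeseen interactions.

```python
"""A minimal sketch (illustrative, not from the post) of the de-linking
heuristic: two subsystems interact only through a narrow, validated message
channel, so a fault in one cannot propagate through hidden shared state."""

from dataclasses import dataclass
from queue import Queue


@dataclass(frozen=True)
class Message:
    """The only thing allowed to cross the boundary between subsystems."""
    kind: str
    value: float


class LendingSubsystem:
    """Produces messages; knows nothing about how they are consumed."""

    def __init__(self, channel: Queue):
        self._channel = channel

    def issue_loan(self, amount: float) -> None:
        # Internal state stays internal; only a small, typed message crosses
        # the boundary.
        self._channel.put(Message(kind="loan_issued", value=amount))


class RiskSubsystem:
    """Consumes messages; validates each one before acting on it."""

    def __init__(self, channel: Queue):
        self._channel = channel
        self.total_exposure = 0.0

    def process_pending(self) -> None:
        while not self._channel.empty():
            msg = self._channel.get()
            # Explicit validation at the boundary: malformed or unexpected
            # messages are rejected rather than silently absorbed.
            if msg.kind == "loan_issued" and msg.value > 0:
                self.total_exposure += msg.value


if __name__ == "__main__":
    channel: Queue = Queue()
    lending = LendingSubsystem(channel)
    risk = RiskSubsystem(channel)

    lending.issue_loan(250_000.0)
    risk.process_pending()
    print(f"Tracked exposure: {risk.total_exposure:.2f}")
```

Whether anything like this decomposition can be carried over from engineered systems to financial regulation is, of course, exactly the open question.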