Predictable Surprises by Bazerman and Watkins
Grade: B (filled with great anecdotes)
Were the earth-shattering events of September 11, 2001, predictable, or were they a surprise? What about Enron's collapse into bankruptcy and scandal? Max H. Bazerman and Michael D. Watkins argue that they were actually "predictable surprises": disastrous examples of the failure to recognize potential tragedies and actively work to prevent them. Disturbingly, this dangerous phenomenon has its roots in universal human and organizational tendencies that leave no individual or company immune.
In this book, Bazerman and Watkins, leading experts in managerial decision making, show that many disasters are preceded by clear warning signals that leaders either miss, or purposely ignore. They explain the cognitive, organizational, and political biases that make predictable surprises so common in business and society, and outline six danger signals that suggest a predictable surprise may be imminent. They also provide a systematic framework that leaders can use to recognize and prioritize brewing disasters and mobilize their organizations to prevent them.
Filled with vivid accounts of predictable surprises across the public and private sectors, this book highlights a phenomenon that holds grave consequences, and challenges leaders to find the courage to act before it's too late.
According to Bazerman and Watkins, the five reasons we are most likely to be "surprised" are:
- We tend to have positive illusions that lead us to conclude that a problem doesn't exist or isn't severe enough to merit action.
- We tend to interpret events in an egocentric manner. That is, when considering the fairness of proposed solutions to a looming crisis, we allocate credit and blame in ways that are self-serving.
- We overly discount the future, reducing our courage to act now to prevent some disaster that we believe to be quite distant.
- We tend to maintain the status quo, refusing to accept a small harm now even when it would bring about a greater good. We are reluctant to accept that dramatic change will occur if we fail to address a mounting problem, so rather than confronting unpalatable choices, we avoid action altogether.
- Most of us don't want to invest in preventing a problem that we have not personally experienced or witnessed through vivid data. Thus, far too often, we only fix problems after we ourselves experience significant harm or after we can clearly imagine ourselves, or those close to us, in peril.