Why we won’t learn from Virginia Tech: The problem with hindsight bias
It’s that certainty, that knowing, that changes things, kills ambiguity, doubt and the probability of any other ending to this series of events. It’s knowing what finally happened that changes our memory of events. It’s a phenomenon that affects us constantly, continuously and unconsciously; it’s invisible.
This phenomenon has a name: hindsight bias. It’s also known as “creeping determinism,” a term that better describes its relentless and unperceived effect. It’s the direct result of knowing the ending of a story or event that was totally ambiguous when it began. It turns out hindsight is not only 20/20; it is also a false eye chart of the past!
Uncertainty and ambiguity are the constant companions of our conscious processes, whether we are watching a movie or responding to a crime-in-progress. Who’s the good guy and who’s the bad guy? What twist will the plot take and how will the story end?
Have you noticed that watching a thriller is never as scary or exciting the second time you watch it? You know the ending! Our conscious mind deals with movies and real life the same way: by using the past (our memory) to predict the future. What makes a movie scary is imposing unexpected (or in some cases expected) events on the audience. In real life, we try to predict events based on past events, which is still the best way to guess what will happen.
Novel or unfamiliar circumstances force us to use associated memories, as close to the present context as possible, to make predictions. Unfamiliar situations create greater anxiety and more likelihood of error. Therefore, our conscious mind searches for the familiar, sometimes finding the right action, sometimes the wrong one.
Still, we generate a plan. Plans have been called “remembered futures,” and we always joke about having a plan so we have something to abandon in a crisis! Plans often fail us since we are merely predicting and cannot know what will happen. After an event has passed, it’s easy to see what we should have seen coming. Hindsight bias makes the “should” seem so obvious when, in fact, it was not! Whether you’re the officer whose decision is being reviewed or the one doing the reviewing, understanding the phenomenon of creeping determinism is essential to understanding what occurred and to learning from the event.
One police action certain to be reviewed is a shooting. Assume you respond to a disturbance call and end up shooting subject A. You will immediately write or verbally report your interpretation of what occurred.
However, something has happened to your memory of the event compared with your actual experience of it. You’ve forgotten or reinterpreted all of the ambiguities you faced at the scene, creating a smooth flow of events that seems to lead inevitably to the shooting.
That is how you will probably write your report: as if all of the cues and all of the signs pointed toward a shooting and you were inexorably led to that end. You’ll forget the ambiguity created by subjects B, C, D and E, and their actions. You’ll forget your concerns with departmental policies. Certain doubts will be forgotten: doubt about your backup and how concerned your supervisor is with time spent on calls and all the other thoughts going through your mind as you walked into what was shortly to become a crime scene. All things not relevant to the final outcome will be dropped from the recollection of the experience.
Having served on shooting review boards, let me tell you one of the first questions asked: “Could the officer have avoided the shooting?” The correct answer is “maybe, but certainly the suspect could have!” This question assumes a timeline or plot line similar to watching a movie for the second time, without doubt or uncertainty. It assumes that life is a linear equation with a modest set of variables, and that simply changing X will lead to Y with certainty.
This ignores the dynamic nature of the real world, particularly actions involving human interaction. Streetwise administrators and investigators intuitively adjust for the effect of creeping determinism and know that shootings, like the tango, take two. But justifying or condemning the decisions of an officer isn’t the only function of review boards. They serve as a feedback mechanism for organizational learning. The decisions and actions of officers must be reviewed in the context of the organizational environment. Does the department need to grow or adapt? Is there a gap between policy and supervision, training and the street, or expectation and ability?
This same problem exists in case studies of officers killed. Did we learn the right lesson from the review? It’s so easy to judge and find flaws, especially when we know the end of the story and think we have found the one variable to account for the failing of an officer (for we always assume to die is to fail). These single-variable analyses leave us learning-disabled, since we fail to take into account the entire context of the situation. The officer did X, and we teach Y, which might have saved his life. No problem: he failed, we’re okay!
Perhaps. But we must try to recreate the context of the incident, including the ambiguities and perspectives faced by the officer. This is what you must do if you are involved in a high-profile incident, either as a participant or an investigator. Does the organization practice what it preaches and teaches? What environmental variables were present? What was known and unknown by the officer upon arriving? What were the human variables in terms of number, actions, etc.?
Maybe this is a good time to retrace the Virginia Tech shootings. What predictors would anyone have had to expect a shooting two hours after a double homicide that looked like the work of an enraged lover? How many disturbed young men do we have in this country who stalk without killing? The real lesson here is that we have so many unarmed police officers on campuses all over this nation. The learning point is to arm them, but that will not be recognized, thanks to hindsight bias.
Experts will all claim they could have easily predicted this tragedy and that the police failed by not closing a campus the size of a city, and blame will be the focus of the mental energies of the people reviewing this terrible event! Mental health experts will claim they could easily have seen this villain’s symptoms and stopped him long before he struck (as if we have some magical power to detain folks we think might be a threat based on their isolation from others). It reminds us of the science fiction movie Minority Report, where people are arrested and convicted because they would have committed a crime in the future. If we only had more laws, more sensitivity classes, more counselors… just not more law enforcement officers with guns.
Here’s the lesson: On your calls, remember that all people are ambiguous and problematic; they must be attended to at some level. You do not know the ending of the next call, and nothing can be assumed that would cause you to let your guard and your awareness down. When you sit down to write the story of the incident you just experienced, you must account for creeping determinism. Remember, your memory will have eliminated the ambiguity and uncertainty. You need to recreate it in your mind and in your report. Review each decision you made along the way.
Ask yourself what other choices were available and what the context of the situation told you about the consequences of any one decision. What led you to anticipate that consequence for that action, and why would you seek or avoid it? Recall the actions of all the actors (good and bad) and record them as you perceived them the first time, without clear knowledge of who is good, bad or simply a witness. This mental naiveté may help trainers, investigators and administrators analyze incidents more effectively. It could make learning more dramatic for students and organizations. This mental approach allows for adjusting training and policies that may adversely affect the performance of our personnel. It also explains why decisions were made that often appear irrational in hindsight.
Officers entering every situation are weighing agency and supervisory issues, liability issues, safety issues and peer and observer pressures, knowing that this may be a completely novel and unique incident with risks they’ve never seen before. Police work is a delicate balancing of risk factors, and the effect of creeping determinism is to make the result seem predetermined, imminent and avoidable. Unfortunately, life isn’t so cut and dried.
All decisions contain risk. In law enforcement we make decisions attempting to mitigate or control those risks. Training, policies and experience all help or hinder us in any given situation. All of our life is lived in the present. The now is novel and uncertain to some degree, yet we still try to create certainty.
Unfortunately, certainty exists only in the past, in our memories. In critical incidents, the key is to understand what may have led to the officer’s decision-making under the stress of the incident and learn from that. We should revere their sacrifice and be willing to examine everything that weighs on their decision making. This should include agency factors that might need correction. Humans err, organizations err, and memory often interferes with the identification of the root causes of those mistakes. False certainty that something was obviously going to happen hinders our learning. We cannot live in a free society and eliminate risks of violence like we saw in Virginia. However, the pseudo-predictions of all the talking heads, made with full knowledge of how these events ended, will prevent society and law enforcement from learning the real and important lessons of this event.
By understanding hindsight bias, we can attempt to recreate the context and ambiguity of the crisis we are analyzing. We will then be able to identify more effectively the factors that led to the decisions and subsequent actions. In doing this, we learn more, organizations adapt quicker, and officers receive better training, policies and supervision. This ultimately gives us greater predictability in crises and improves our odds of winning high-risk situations.
References

Fischhoff, B. (1975). Hindsight ≠ foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1, 288-299.

Norman, D. A. (1988). The Psychology of Everyday Things. New York: Basic Books.

Wildavsky, A. (1988). Searching for Safety. New Brunswick: Transaction Publishers.