The American filmmaker Ken Burns received a Peabody Award in 2013 for his powerful documentary The Central Park Five. The film tells the story of a horrible crime committed in 1989. A New York Times critic described the film as follows: “A notorious crime—the rape of a jogger in Central Park in 1989—is revisited in this painful, angry, scrupulously reported story of race, injustice and media frenzy.”
The term “Central Park Five” was given to the group of young men who were arrested, tried and convicted for the violent assault and rape of a female jogger in New York City’s Central Park. The case was sensationalized in the media. The New York Times described the attack as “one of the most publicized crimes of the 1980’s,” in which five teenaged Black men roamed through the park attacking people. The jogger was a young white woman working in New York as an investment banker. She was beaten so badly that she was not expected to live. After their trial, the young men were sent to prison. Justice was served!
Or, so it was thought at the time…
The handling of the case against the Central Park Five is a classic illustration of the disastrous effects of cognitive biases on decision making. Years later, a man named Matias Reyes stepped forward and confessed to the crime. Burns’ film shows that the police should have connected Reyes to the crime, since DNA evidence identified him as the sole contributor of semen found in and on the rape victim. The film offers a haunting depiction of the various press reports, which treated the guilt of the young Black men as settled fact. That coverage made for sensational reporting and likely sold more newspapers, yet it was inaccurate. Justice was clearly not served: these five young men spent years in prison for a crime they did not commit. The City of New York settled the case with them for $41 million in 2014. It is easy to speculate about how economic and racial biases contributed to so many people getting this case so wrong.
Consequences of Cognitive Bias
While those factors may well have been present, another form of bias, cognitive bias, was clearly at work. Cognitive bias is the term used to describe an unintended consequence of the brain’s routine functioning. According to Daniel Kahneman, the brain uses two systems. System One triggers our automated responses to routine situations. It allows us to recognize circumstances, interpret them based on past experience, and respond rapidly based on what worked before. System Two is the slower, more deliberate function of the brain, which we commonly refer to as “thinking.” While System One is part of thinking, it is so automated that we are unaware it is happening. Our daily lives are made much easier by System One, and the clear majority of the time it functions just fine. For example, think about all the activities you performed this morning: waking and getting up, getting dressed, getting coffee, heading off to work or wherever you spent the morning. Most were accomplished by automated, System One thinking. It works well, until it does not.
Cognitive biases describe those situations in which System One thinking misreads the circumstances. The brain perceives the event as similar to a past event, and that perception dictates the response. Since the perception is wrong, so too is the response. Usually this is of little consequence and can even be humorous. Unfortunately, there are times when such misreads, and the responses that follow, have serious negative consequences. Daniel Kahneman was awarded the Nobel Prize in Economic Sciences in 2002 for applying the study of cognitive biases to economics, a field now called behavioral economics. His work describes how biases lead to bad economic decisions. Cognitive biases are also found behind flawed business strategies and capital projects. Further, most industrial accidents and disasters have cognitive biases as at least a partial cause.
The Cure is High Reliability Organizing
High Reliability Organizing (HRO) principles describe the methods created to counteract, or at least lessen, the risks posed by cognitive biases. Some industries, such as aviation, the US military and nuclear power, have successfully used these methods to create high reliability organizations. Healthcare in the US is now actively adopting some form of these same organizing principles. HRO principles change how organizations design conversations, orient people to pivotal roles, and plan for interrupting the biases that naturally occur.
A classic example is an FOD walk on a US Navy aircraft carrier. FOD stands for Foreign Object Damage. The FOD walk involves people from the ship who do not work on the flight deck. These individuals walk together down the flight deck looking for anything odd or unusual that could possibly damage an aircraft. The reason for involving outsiders is that the people who routinely work on the flight deck have grown accustomed to small anomalies, e.g., a loose oil can or a frayed wire, and can no longer see them. If such objects go unnoticed, they pose serious risks to those working on the flight deck and jeopardize safe flight operations.
High Reliability Organizations are designed to investigate the unusual with a tenacious commitment to preventing failure. That commitment shapes the mindset. If you think about it, this is starkly different from the mindset in most organizations, where routine is the order of the day, based on the expectation that “we’re good” and things should work as planned.
It is hard to know from watching one documentary film, but it does sound as if the New York Police Department and District Attorney’s office were NOT functioning as HROs in 1989. Had these organizations been following HRO principles, it is likely they would not have jumped to conclusions, rushed to make press announcements, or missed the many clues that challenged their theory of the case. Five innocent young men would not have spent years in prison for a crime they did not commit.
Think about your organization and its leadership. Is it designed to deliver high reliability and resilience? Can it avoid rushing to faulty judgments and then ignoring evidence that contradicts those conclusions? If you are not sure of the answer, then be prepared to face the harsh consequences.
Do you want to learn more about transformation change leadership? Download our whitepaper: ‘Transformation Change Leaders: The Biggest Missing Ingredient in Business Today’
In it, you will learn:
- What is driving the “gap” that exists in Boards of Directors and leadership teams
- The 6 main components of transformation change leadership
- What is causing the shortage of supply