When people on the cardiac team (OR staff, ICU nurses, PAs, etc.) fail to speak up, it is almost always a signal of an underlying problem. The most effective way for any team to achieve its goals is to gain broad participation from its members. The main way any individual member participates is by speaking up. This input can cover a broad array of topics. In the OR, it generally involves raising ideas that might improve a patient’s outcome. Outside the OR (e.g., in a debriefing session), it usually involves team members admitting to errors they made so that performance improves for the next patient. One of the basic tenets of a high reliability organization is that it seeks input from those with the most expertise, not just those highest in rank. If any person in the OR has an idea that could improve the outcome, that idea should be heard and acted upon. Failing to do so, and having a bad outcome as a result, meets the definition of preventable harm.
Aristotle was the first to recognize that the power of teams derives from the collective wisdom of their members, regardless of their expertise. Experts make decisions quite differently from those with less experience. Someone with vast prior experience to draw on tends to consider new situations in light of past cases, leaning heavily on pattern recognition and on how similar problems were solved before. That has the advantage of speeding up decisions, which can provide a crucial edge in responding to the types of emergencies that define the success of a cardiac surgical team. However, there are times when a slower, more methodical strategy of thinking things through is more helpful. Even then, experts often fail to work through a formal differential diagnosis that might allow them to better appreciate the alternatives. In other words, the experts aren’t always the best at figuring out what is really going on. The ability of different team members to think about issues more systematically illustrates why it can be so helpful to foster participation from the entire team.
Teams veer off course rapidly when those with helpful solutions don’t share their ideas. Surgeons under pressure to make a quick decision can suffer from a number of cognitive biases that reduce the quality of those decisions. Often the other team members see those biases, and their adverse impact, much more clearly than the surgeon does. At extreme levels of dysfunction, a team dynamic called groupthink can settle in: a mindset in which people sense that things aren’t heading in the right direction but fail to speak up. People on teams locked onto the wrong course often stay silent either because they second-guess themselves (someone must have brought this idea up before) or because they don’t want to be seen as a troublemaker (I’ll just go along to get along). More importantly, failing to speak up about an idea that could have helped a patient takes a personal toll on team members. After an adverse event, even the most battle-hardened staff can feel guilt, a sense of culpability, anxiety, alienation, and detachment from their patients and colleagues. When this happens, the event has created a “second victim”. We all went into medicine with the purest of intentions and the loftiest of goals. When wounds to the second victim penetrate deep, they leave irreversible scars. When pure intentions collide with a scarred reality, a cognitive dissonance is created within us that can only be resolved by distorting our goals. We either quit medicine or (even worse) morph into someone who continues to work in the field but is a bit more cynical about patient safety. Through the well-described process known as normalization of deviance, that cynical mindset soon becomes the new norm, and the next preventable harm hurts us a little less (at least on the surface).
If speaking up is the antidote that prevents both patients and team members from becoming victims of an adverse event, then why is it so difficult to gain the full participation of staff? I believe there are three reasons. The first is that many teams do not create an environment where people feel free to speak up. The HR literature calls this “psychological safety”. A member of any team takes an intellectual risk by speaking up during times of difficulty. It is easier to do this in a culture where people feel free to share their true opinions and have a strong sense of curiosity about how to improve outcomes. This culture can take months or years to develop and can be wiped out by a single disruptive event. The collateral injury is damage to the team’s resilience and psychological defenses, which leaves members worried that their only “reward” for speaking up will be retaliation.
The second reason is that speaking up doesn’t change anything – there is no follow-through on whether the idea was implemented and, if not, why not. When someone on the team brings up what they think is a good idea and no one acts on it, it discourages similar efforts in the future. Even if there is no personal risk, or if the person is able to overcome their fear of that risk, why bother to speak up if nothing ever happens?
The first two reasons have the same cure: effective team meetings led by someone who manages the group’s interactions and consistently turns good ideas into meaningful change. But even when this is the case, there remains a third reason that tends to resist even the most effective meetings: a reluctance to speak up when you aren’t really sure. Psychologically safe environments are a double-edged sword. They spread the accountability for a high-risk surgery around in a way that can make some members quite uncomfortable. The dilemma team members face is how to work through the mental calculus of “damned if you do, damned if you don’t”. A team member can feel guilt not only for staying silent but also for speaking up, having their idea implemented, and watching it hurt the patient.
Teams that perform cardiac surgery are not alone in struggling with this third reason not to speak up. Any industry that deals in high-risk activities – e.g., crews that operate a nuclear power plant, control the activities on the deck of an aircraft carrier, or communicate within the cockpit of a commercial jet – has team members who are uncertain about their role in a crisis. The difference for all high-risk fields other than medicine is that the team members’ own lives are on the line. Concerns about feeling guilty after speaking up fade fast when staying silent creates the possibility that you might die. Maslow’s hierarchy of needs ranks staying alive far higher than avoiding guilt. No matter how badly a cardiac surgical case goes or how poorly the team communicates, I’ve never seen a team member die in the OR. So this third issue has a much stronger effect in medicine than in other fields.
Lifting the threat of our own death frees us to ruminate on the question of guilt. The mental calculus of “damned if you do, damned if you don’t” can be more complicated than it might first appear. It is human nature to categorize any bad event in one of two ways. It can be something that takes away from your current level – in other words, a “loss”, or error of commission. The second is a failed opportunity to gain something above your current level – a “forgone gain”, or error of omission. Having to hold yourself accountable for an idea that didn’t work out (a loss) creates more guilt than not speaking up in a way that could have helped (a forgone gain).
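This loss/forgone-gain asymmetry has a formal counterpart in the value function of prospect theory. The short sketch below is purely illustrative: the parameter values (alpha = beta = 0.88, lam = 2.25) are the estimates commonly attributed to Tversky and Kahneman, not figures from my own team’s experience.

```python
# Illustrative sketch of the prospect-theory value function, which formalizes
# why a realized loss (error of commission) weighs more heavily than a
# forgone gain (error of omission) of the same objective size.
# Parameters are the commonly cited Tversky & Kahneman estimates; they are
# assumptions for illustration only.

def subjective_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Perceived value of outcome x relative to the status quo (x = 0)."""
    if x >= 0:
        return x ** alpha            # gains are discounted only mildly
    return -lam * ((-x) ** beta)     # losses are amplified by lam > 1

# The same stakes (10 units), framed two ways:
forgone_gain = subjective_value(10)     # stayed silent, missed a gain
realized_loss = subjective_value(-10)   # spoke up, idea backfired

# The loss looms larger, so silence can feel "safer" even when it is not.
print(abs(realized_loss) > abs(forgone_gain))  # True
```

Under these assumed parameters, the pain of the backfired idea is more than double the sting of the missed opportunity, which is one way to see why the mental calculus so often tips toward silence.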
There are tricks for getting around this third reason for not speaking up. Most important is to directly call on individual staff members who might have helpful input. That takes away the decision about whether to speak up and “gives permission” to team members to be heard. Every surgical case has natural pause points – e.g., the timeout and signout at the beginning and end of the case, and the debriefing session. These are the moments when less experienced team members may have major contributions that could change the outcome of the case. If those contributions are encouraged and recognized, they are more likely to recur. Finally, ask the team to regularly reaffirm their commitment to their patients by speaking up. This can increase their accountability to that commitment in a way analogous to taking an oath before testifying in court. The purpose of that oath is not only to warn the witness of the consequences of lying but also to hold over their head their own sacred assurance to speak only the truth. It is an a priori way of at least temporarily strengthening our morals, one that has a well-established influence on actually telling the truth. Similarly, I review the ground rules of our team debriefing sessions and make a statement at the timeout of each case that aims to encourage team input.
The idea that framing events as a loss versus a forgone gain changes someone’s willingness to take risks is a well-established principle of social science – one that won the Nobel Prize in 2002. My hunch is that it is the most common reason ideas are held back on my cardiac team. The first step in correcting this problem is to be honest about its role in our own behavior.