When we think about how doctors make decisions, it can be easy for us to fall into the long-standing trap of believing that they approach decision-making like robots: they take all the data into account, weigh everything up carefully, and compute the best choice – every time, for every patient.
What we often forget in perpetuating this myth is a key fact: doctors are people, too. Their brains have the same hard-wiring as the rest of us non-doctors, and so are prone to the same decision-making traps: cognitive biases, the mental shortcuts our brains use to navigate the world efficiently. These biases impact everything from diagnosis to treatment decisions to treatment management, all of which are important factors in patient health outcomes.
This is critical to understand as health systems and other stakeholders work to improve patient care. In this article, we will review the biases that are most common in research into doctors’ decision-making, and provide recommendations for how to combat those biases.
Note that this is not an exhaustive list – the same handful of biases have been studied extensively, while others that may also be relevant to clinical decision-making have not yet been investigated to the same degree. The actionability of this type of research can also be difficult to gauge, as there is rarely a direct link from bias to patient outcomes. Finally, keep in mind that these biases are not always negative – after all, they developed as adaptive strategies – so there may be cases in practice where the goal is in fact to reinforce a cognitive bias.
Caveats aside, let’s look at 6 biases for which there is significant empirical evidence in doctor decision-making.
1. Anchoring bias

Anchoring causes us to focus too much on the first piece of information we receive when making decisions. In medicine, this can result in incorrect diagnoses due to doctors over-weighting a first hunch about the cause of illness. It can also influence perceptions of medications – their evidence for safety and effectiveness may change over time, but doctors may anchor on initial findings and continue to either use or avoid them without updating their perceptions based on new information.
2. Availability heuristic
The availability heuristic makes us think that things that come to mind more easily are more likely to occur. For example, someone may be much more afraid of flying than driving because they can quickly imagine instances of deadly plane crashes, but these events are in fact many times less likely to occur than a car crash. However, due to its emotional and dramatic nature, the plane crash comes to mind more easily and so seems more probable.
For doctors, the availability heuristic influences both diagnosis and prescribing decisions by making physicians either more or less likely to choose something they recently encountered, regardless of whether it is appropriate for the current situation. One study showed that after doctors had an atrial fibrillation (AFib) patient suffer a bad reaction to warfarin, a very commonly prescribed medication, the odds of them prescribing warfarin to subsequent AFib patients were 21% lower, even though it remained an appropriate and indicated treatment for those patients.
3. Bandwagon effect
The bandwagon effect is a result of humans being social creatures. With this bias, the more we see other people doing something, and the larger the group of people doing it, the more we want to do it ourselves. It can show up in medicine when another healthcare professional has already labeled the patient with a certain diagnosis, or attributed certain social issues to them, and these labels or attributions are not questioned. This can lead to missed or incorrect diagnoses.
4. Confirmation bias
Confirmation bias causes us to look for and listen to information that reinforces what we already believe, and to ignore information that contradicts our beliefs. This can be part of the reason why changing someone’s mind by arguing rationally or presenting facts is so difficult!
For doctors, this can lead to incorrect diagnoses because once they have their hunch, they look to prove it, rather than continuing to search for other options. For treatment decisions, it can reinforce potentially sub-par habits due to doctors continually looking for evidence that the treatment choice was a good one, and never actively searching for evidence that the treatment was not optimal.
5. Overconfidence bias
Overconfidence bias leads us to believe that we are more skilled and capable than we actually are. This may show up for doctors as an unwillingness to admit that they don’t know or aren’t sure of something, which is often reinforced by the culture in medicine. This can lead to incorrect diagnosis or treatment decisions that go unquestioned or unchecked.
6. Status quo bias
Status quo bias makes it so that we feel most comfortable when we simply continue what we were doing before, leveraging the immense power of inertia. This is the driving force behind habitual behaviors, including everything from the treatments doctors prefer to prescribe to their adherence to hygiene protocols.
Status quo bias is a common example of a cognitive bias that interventions do not merely fight against but actually leverage – simply by designing the decision so that the preferred option is the easiest one, requiring the least effort (or even none at all!).
How to Combat Cognitive Biases in Medicine
While there is still much research to do to determine the best ways to “de-bias” medical decision-making, there is some evidence for a handful of techniques that may help. Because cognitive biases operate unconsciously, simply training doctors to be more aware of them is typically insufficient to change behavior, though it can be a helpful first step toward recognizing the need for other interventions to optimize decision-making. More structural and immediate interventions have been found to be more effective overall.
Metacognition – This means to “think about your thinking.” Examples include deliberately slowing down; estimating how confident you are, to catch yourself being overconfident; considering the opposite possibility or other alternatives, especially to combat anchoring; and explaining your reasoning, both to ensure it is sound and to give yourself another chance to slow down.
Modifying EMR and other standard systems – Studies have shown that changing default settings in EMR systems can have a strong impact on physician behavior. For example, one study found that the most effective way to keep opioid prescriptions to the lowest appropriate number of pills was to make that number the default option – even more effective than offering no default and requiring doctors to enter a number themselves.
Checklists & decision aids – Checklists have been shown to be extremely helpful in situations with set procedures that must be completed every time, such as in surgery. Decision aids help to optimize more complex treatment decision-making so that it can be most appropriate for the patient, or even to help reduce unnecessary tests and procedures (see the Choosing Wisely website for an example).
Debiasing games – Various virtual games have been developed to teach non-biased thinking in many fields, including medicine, and have been shown to be both enjoyable and effective.
Dialectical bootstrapping – When estimating quantitative values, for example a patient’s weight or renal function, dialectical bootstrapping can help. This method asks you to assume your first quantitative answer was wrong and guess again – the average of the two is usually much more accurate than the first answer alone.
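As a minimal sketch of the averaging step described above – the function name and the weight-estimation numbers are made up for illustration, not drawn from any clinical source:

```python
def dialectical_estimate(first_guess: float, second_guess: float) -> float:
    """Combine two independent guesses of the same quantity.

    Dialectical bootstrapping: make a first estimate, then assume it was
    wrong and make a second, independent estimate. Averaging the two
    tends to cancel out some of the error in each.
    """
    return (first_guess + second_guess) / 2

# Hypothetical example: estimating a patient's weight in kilograms.
first = 78.0   # initial gut estimate
second = 86.0  # second estimate, made after assuming the first was off
print(dialectical_estimate(first, second))  # → 82.0
```

The averaging only helps when the second guess is genuinely independent of the first – hence the instruction to actively assume the first answer was wrong before guessing again.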
Finally, remember that, as noted at the beginning, doctors are people too – and therefore prone to cognitive error. Reframe these errors as opportunities to learn and grow, rather than as failures.
For more information about decision-making in health, please contact Suzanne Kirkendall.