Guide to Heuristics

This guide will give you more information about cognitive dispositions to respond (CDRs) that can lead to diagnostic errors.

Aggregate bias – when physicians believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to individual patients (especially their own), they are invoking the aggregate fallacy. The belief that their patients are atypical or somehow exceptional may lead to errors of commission, such as ordering x-rays or other tests when guidelines indicate that none are required.

Anchoring – the tendency to perceptually lock onto salient features in the patient’s initial presentation too early in the diagnostic process and fail to adjust this initial impression in light of later information. This may be severely compounded by the confirmation bias.

Ascertainment bias – occurs when a physician’s thinking is shaped by prior expectation; stereotyping and gender bias are both good examples.

Availability heuristic – the disposition to judge things as being more likely if they readily come to mind. Thus, recent experience with a disease may inflate the likelihood of its being diagnosed. Conversely, if a disease has not been seen for a long time, it may be underdiagnosed.

Base-rate neglect – the tendency to ignore the true prevalence of a disease, either inflating or reducing its base rate and distorting Bayesian reasoning. In some cases, clinicians may inflate the likelihood of disease, such as in the strategy of ruling out the worst-case scenario, to avoid missing a rare but significant diagnosis.

Bayes’ theorem – a theorem expressed mathematically as: pre-test odds × likelihood ratio = post-test odds. The key clinical reasoning point of Bayes’ theorem is that the pre-test odds, or the equivalent pre-test probability (probability = odds/(odds + 1)), can dramatically affect the post-test odds. Thus, pre-test odds or probability should be considered before deciding whether to order a test, even if the test has a very strong positive or negative likelihood ratio.
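To make the arithmetic concrete, here is a minimal Python sketch of the odds form of Bayes’ theorem. The pre-test probabilities and likelihood ratio are illustrative assumptions, not figures from this guide.

```python
def probability_to_odds(p):
    """Convert a probability (0 < p < 1) to odds."""
    return p / (1 - p)

def odds_to_probability(odds):
    """Convert odds back to a probability: p = odds / (odds + 1)."""
    return odds / (odds + 1)

def post_test_probability(pre_test_prob, likelihood_ratio):
    """Odds form of Bayes' theorem: pre-test odds x LR = post-test odds."""
    return odds_to_probability(probability_to_odds(pre_test_prob) * likelihood_ratio)

# Illustrative (assumed) numbers: the same strongly positive test (LR+ = 10)
# gives very different post-test probabilities at different pre-test probabilities.
for pre_test in (0.01, 0.30):
    post_test = post_test_probability(pre_test, 10)
    print(f"pre-test {pre_test:.0%} -> post-test {post_test:.0%}")
```

With these assumed numbers, the same strongly positive test takes a 1% pre-test probability only to about 9%, but takes a 30% pre-test probability to about 81%: the base rate dominates what the test can tell you, which is also why base-rate neglect (above) distorts the conclusion.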

Commission bias – results from the obligation toward beneficence and the belief that harm to the patient can only be prevented by active intervention. It is the tendency toward action rather than inaction. It is more likely in overconfident physicians and less common than omission bias.

Confirmation bias – the tendency to look for confirming evidence to support a diagnosis rather than look for disconfirming evidence to refute it, despite the latter often being more persuasive and definitive.

Diagnosis momentum – refers to the fact that once diagnostic labels are attached to patients, they tend to become stickier and stickier. Through intermediaries (patients, paramedics, nurses, physicians), what might have started as just a possibility gathers increasing momentum until it becomes definite, and all other possibilities are excluded.

Feedback sanction – the idea that making a diagnostic error may carry no immediate consequences, as considerable time usually elapses before the error is discovered, if it is ever discovered. Also, poor system feedback processes prevent important information about decisions from getting back to the decision maker.

Framing effect – the concept that how diagnosticians see things may be strongly influenced by the way in which the problem is framed. For example, physicians’ perceptions of risk to the patient may be strongly influenced by whether the same outcome is expressed in terms of survival or in terms of mortality. In terms of diagnosis, physicians should be aware of how patients and other medical professionals frame the potential outcomes and contingencies of clinical problems.

Fundamental attribution error – the tendency to be judgmental and blame patients for their illnesses rather than examine the circumstances that might have been responsible. In particular, psychiatric patients, minorities, and other marginalized groups tend to be targeted by this tendency.

Gambler’s fallacy – the belief that if a coin is tossed ten times and comes up heads each time, the 11th toss has a greater chance of being tails. An example would be a physician who sees a series of patients with chest pain in clinic or the emergency department, diagnoses all of them with an acute coronary syndrome, and then assumes the sequence cannot continue, lowering the perceived likelihood of that diagnosis in the next patient. Thus, the pre-test probability that a patient will have a particular diagnosis, as perceived by their doctor, might be wrongly influenced by preceding but independent events.
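Because the fallacy concerns independent events, it can be checked directly by simulation. The sketch below uses an assumed run length of five heads (rather than ten, which would require far more samples) and shows that the toss following a run of heads is still heads about half the time.

```python
import random

random.seed(0)  # reproducible illustration

RUN_LENGTH = 5        # assumed run length; ten heads in a row is much rarer
TRIALS = 1_000_000    # number of simulated coin tosses

heads_after_run = 0
runs_seen = 0
streak = 0  # current count of consecutive heads
for _ in range(TRIALS):
    heads = random.random() < 0.5
    if streak >= RUN_LENGTH:        # this toss follows a run of heads
        runs_seen += 1
        heads_after_run += heads
    streak = streak + 1 if heads else 0

print(f"P(heads | previous {RUN_LENGTH} tosses were heads) "
      f"= {heads_after_run / runs_seen:.3f}")  # ~0.500: the coin has no memory
```

The same independence holds for successive, unrelated patients: each presentation carries its own pre-test probability, regardless of what the preceding patients turned out to have.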

Gender bias – the tendency to believe that gender is a determining factor in the probability of diagnosis of a particular disease when no such pathophysiological basis exists. Generally, it results in overdiagnosis in the favored gender and underdiagnosis in the neglected gender.

Hindsight bias – a bias related to the fact that knowing an outcome may profoundly influence the perception of past events. This prevents a realistic appraisal of what actually occurred. In the context of diagnostic error, it may compromise learning through either an underestimation (illusion of failure) or overestimation (illusion of control) of the decision maker’s abilities.

Multiple alternatives bias – a multiplicity of options on a differential diagnosis may lead to significant conflict and uncertainty. The process may be simplified by reverting to a smaller subset, but this may result in inadequate consideration of other possibilities. One such strategy is the three-diagnosis differential, choosing three of the most likely diagnoses to focus on at a time. Although this approach has some heuristic value, it reduces the chances that certain, possibly correct, diagnoses will be made.

Omission bias – a tendency toward inaction that is rooted in the principle of non-maleficence. In hindsight, events that have occurred through the natural progression of a disease may be considered more acceptable than those that can be attributed directly to the action of the physician. Therefore, reinforcement is often associated with not doing anything, but this may be very detrimental to the patient. Omission biases typically outnumber commission biases.

Order effects – information transfer follows a U-shaped function: we tend to remember the beginning (primacy effect) and the end (recency effect) better than the middle. The primacy effect may be augmented by anchoring. In transitions of care, when information is transferred from one set of providers to another, all information should be weighed as evenly as possible, regardless of the order in which it was presented.

Outcome bias – the tendency to opt for diagnostic decisions that will lead to good outcomes, rather than those associated with bad outcomes, thereby avoiding negativity associated with the latter. It is a form of value bias, because physicians may express a preference for what they hope will happen rather than for what they really believe might happen. This may result in serious diagnoses, especially those with a poor prognosis, being minimized.

Overconfidence bias – a universal tendency to believe we know more than we do. Overconfidence reflects a tendency to act on intuitions and experiential knowledge rather than scientific evidence. The bias may be augmented by both anchoring and availability, and catastrophic outcomes may result when there is a prevailing commission bias.

Playing the odds (frequency gambling) – the tendency in ambiguous presentations to opt for a common diagnosis over a rare one. It may be compounded by the fact that the signs and symptoms of many common and benign diseases are mimicked by more serious and rare ones. The strategy may be unintentional or deliberate and is diametrically opposed to the rule out worst-case scenario strategy (see base-rate neglect).

Posterior probability error – occurs when a physician’s estimate of the likelihood of disease is unduly influenced by the patient’s past history. Unlike the gambler’s fallacy, the physician expects the sequence to continue. For example, if a patient presents to the office five times with a headache that is correctly diagnosed as migraine on each visit, the physician might be tempted to diagnose migraine on the sixth visit without fully considering other causes. Common diagnoses for a particular patient often do remain common for that patient, but leaning on posterior probability lowers the chance that a headache of another cause will be recognized.
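The pull of a strong prior can be shown with the same odds-form update described under Bayes’ theorem above. Every number here is an assumption chosen for illustration, not a published likelihood ratio.

```python
def update(prob, likelihood_ratio):
    """One Bayesian update in odds form (see Bayes' theorem above)."""
    odds = prob / (1 - prob) * likelihood_ratio
    return odds / (odds + 1)

p_migraine = 0.60       # assumed probability at the first headache visit
LR_TYPICAL_VISIT = 3    # assumed LR of a presentation matching prior migraines

# Five visits that each fit the established pattern drive the prior very high.
for visit in range(1, 6):
    p_migraine = update(p_migraine, LR_TYPICAL_VISIT)
    print(f"after visit {visit}: P(migraine) = {p_migraine:.3f}")

# On the sixth visit, even a feature arguing against migraine (assumed LR = 0.2)
# leaves the probability high. That much is rational; the error is treating the
# high prior as certainty and skipping the history and the update altogether.
print(f"sixth visit, atypical feature: P(migraine) = {update(p_migraine, 0.2):.3f}")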

Premature closure – the tendency to prematurely end the diagnostic decision-making process, accepting an illness script before it has been fully verified. This can lead to missed diagnoses.

Psych-out error – psychiatric patients appear to be particularly vulnerable to the heuristics and biases described in this list and to other errors in their management, some of which may exacerbate their condition. They appear especially vulnerable to fundamental attribution error. In particular, comorbid medical conditions may be overlooked or minimized. A variant of psych-out error occurs when serious medical conditions (e.g., hypoxia, delirium, metabolic abnormalities, CNS infections, head injury) are misdiagnosed as psychiatric conditions.

Representativeness restraint – the representativeness heuristic drives the diagnostician toward looking for prototypical manifestations of disease. However, restraining decision-making along these pattern-recognition lines leads to atypical variants being missed.

Search satisfying – reflects the universal tendency to call off a search once something is found. Comorbidities, second foreign bodies, other fractures, and coingestants in poisoning are examples of findings that could be missed.

Sutton’s slip – takes its name from the Brooklyn bank robber Willie Sutton, who is alleged to have said he robbed banks “because that’s where the money is!” The diagnostic strategy of going for the obvious is referred to as Sutton’s law. The slip occurs when possibilities other than the obvious are not given sufficient consideration.

Sunk costs – the more clinicians invest in a particular diagnosis, the less likely they may be to release it and consider alternatives. This is an entrapment form of CDR more associated with investment and financial considerations. However, for the diagnostician, the investment is time and mental energy and, for some, ego may be a precious investment. Confirmation bias may be a manifestation of such an unwillingness to let go of a failing diagnosis.

Triage cueing – the triage process occurs throughout the health care system, from the self-triage of patients to the selection of a specialist by the referring physician. In the emergency department, triage is a formal process that results in patients being sent in particular directions, which cues their subsequent management. Many CDRs are initiated at triage.

Unpacking principle – failure to elicit (unpack) all relevant information in establishing a differential diagnosis may result in significant possibilities being missed. If patients are allowed to limit their history giving, or physicians otherwise limit their history taking, unspecified possibilities may be discounted.

Vertical line failure – routine, repetitive tasks often lead to thinking in silos—predictable, orthodox styles that emphasize economy, efficacy, and utility. Though often rewarded, the approach carries the inherent penalty of inflexibility. In contrast, lateral thinking styles create opportunities for diagnosing the unexpected, rare, or esoteric.

Visceral bias – the influence of affective sources of error on decision-making has been widely underestimated. Visceral arousal leads to poor decisions. Countertransference, both negative and positive feelings toward patients, may result in diagnoses being missed. Some attribution phenomena (fundamental attribution error) may have their origin in countertransference.

Yin-Yang out – when patients have been subjected to exhaustive and unavailing diagnostic investigations, they are said to have been worked up the Yin-Yang. The Yin-Yang out is the tendency to believe that nothing further can be done to find a diagnosis: even if a definitive diagnosis may exist for the patient, the physician ceases further diagnostic effort. While a definitive diagnosis sometimes cannot be reached, adopting this attitude prematurely is not in the patient’s best interest.

License

2023-2024 M26 Introduction to Clinical Reasoning Syllabus Copyright © by Scott Epstein, MD and Robert Trowbridge, MD. All Rights Reserved.
