
Cognitive Bias in Emergency Medicine Decision-Making

ACEM Fellowship LO ACEMF-PDM-3-TS4-4.2

ACEM Fellowship Learning Objective: Common Cognitive Biases Affecting ED Clinical Decisions


Overview: Why Cognitive Bias Matters in Emergency Medicine

The emergency department is arguably the highest-risk environment for cognitive error in all of medicine. Clinicians work under extreme time pressure, with incomplete information, high patient volumes, frequent interruptions, emotional loading, and significant diagnostic uncertainty. These conditions do not merely permit cognitive errors - they actively cultivate them.

Clinical decision-making occurs via two broadly recognised cognitive systems:

| System | Characteristics | Strengths | Vulnerabilities |
|---|---|---|---|
| Type 1 (Intuitive) | Fast, automatic, pattern-recognition based, largely unconscious | Efficient, low cognitive load, effective for experienced clinicians with familiar presentations | Prone to heuristic shortcuts, anchoring, premature closure |
| Type 2 (Analytical) | Slow, deliberate, logical, effortful | Systematic, less error-prone in novel situations | Impaired by fatigue, distraction, cognitive overload, time pressure |

In emergency medicine, most decisions default to Type 1 processing - the experienced clinician pattern-matches a presentation to a familiar schema and acts. Most of the time this works well. The danger lies in the predictable, systematic ways this process fails. These failures are cognitive biases: reproducible, non-random deviations from rational decision-making.

Understanding these biases is not merely academic. Every cognitive error is - at least in theory - identifiable and potentially preventable. The ACEM candidate must be able to name, define, recognise, and mitigate the most clinically significant biases.


Major Cognitive Biases in Emergency Medicine

Anchoring Bias

Definition: The tendency to fix on an initial impression (the "anchor") and fail to sufficiently adjust that assessment as new information becomes available.

ED example: A 58-year-old man arrives with chest pain. The triage nurse hands over "musculoskeletal chest pain - came in after lifting at work." The treating clinician anchors to this label, minimises the workup, and misses an NSTEMI.

Why it's dangerous: The anchor is often set early - during triage, nursing handover, or the ambulance report - before the clinician has done their own assessment. Initial framing powerfully distorts subsequent interpretation.

Debiasing strategy: Consciously generate at least one serious alternative diagnosis before committing to a working diagnosis. Ask: "What else could this be?"


Premature Closure

Definition: Accepting a diagnosis before it has been adequately confirmed; stopping the diagnostic process too early. Widely regarded as the single most common cognitive error in emergency medicine.

ED example: A patient with altered consciousness is diagnosed with alcohol intoxication. The clinician stops searching. Subdural haematoma or hypoglycaemia is missed.

Key insight: Premature closure frequently follows anchoring - the anchor sets the diagnosis, and premature closure ends the search. They are cognitively linked.

Debiasing strategy: Maintain diagnostic uncertainty until a coherent, internally consistent clinical picture emerges - including the history, examination, investigations, and trajectory over time.


Availability Bias

Definition: Judging the probability of a diagnosis based on how easily examples of it come to mind, rather than on its actual base rate.

ED example: A clinician who recently managed a patient with pulmonary embolism subsequently over-investigates every patient with dyspnoea for PE. Conversely, a clinician with no recent aortic dissection cases may systematically under-consider it.
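The gap between "how memorable" and "how likely" can be made concrete with a quick Bayesian calculation: when the base rate is low, even a sensitive test leaves the post-test probability low. The prevalence and test characteristics below are illustrative assumptions for the sketch, not validated figures.

```python
def ppv(prevalence: float, sensitivity: float, specificity: float) -> float:
    """Positive predictive value via Bayes' theorem."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Illustrative numbers only: assume a 2% PE prevalence in undifferentiated
# dyspnoea and a D-dimer-like test with 95% sensitivity, 60% specificity.
print(round(ppv(0.02, 0.95, 0.60), 3))  # → 0.046
```

Despite a "positive" result, the posterior probability is under 5% - the base rate, not the vividness of a recently remembered case, dominates the arithmetic.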

Subtypes:

| Subtype | Description |
|---|---|
| Recent availability | Bias driven by recent personal experience with a condition |
| Salience availability | Vivid or emotionally striking cases are recalled more readily |
| Media availability | High-profile cases in the media inflate perceived prevalence |

Debiasing strategy: Apply pre-test probability tools and clinical decision rules rather than relying on intuitive frequency estimates. Separate "how memorable?" from "how likely?"
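One such pre-test probability tool, the two-tier Wells score for PE, reduces to a simple weighted checklist. The criterion names and weights below follow the commonly cited version of the rule, but this is an illustrative sketch: verify against a current validated reference before any clinical use.

```python
# Two-tier Wells score for PE (illustrative sketch; verify weights and
# cut-offs against a current validated reference before clinical use).
WELLS_PE = [
    ("clinical_signs_of_dvt", 3.0),
    ("pe_most_likely_diagnosis", 3.0),
    ("heart_rate_over_100", 1.5),
    ("immobilisation_or_recent_surgery", 1.5),
    ("previous_dvt_or_pe", 1.5),
    ("haemoptysis", 1.0),
    ("active_malignancy", 1.0),
]

def wells_pe(findings: dict) -> tuple[float, str]:
    """Sum the weights of present criteria and apply the two-tier cut-off."""
    score = sum(points for name, points in WELLS_PE if findings.get(name))
    return score, ("PE likely" if score > 4 else "PE unlikely")

# Tachycardic patient in whom PE is the leading diagnosis:
print(wells_pe({"pe_most_likely_diagnosis": True, "heart_rate_over_100": True}))
# → (4.5, 'PE likely')
```

The point of externalising the rule like this is precisely the debiasing strategy above: the score is driven by explicit criteria, not by how easily a recent PE comes to mind.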


Framing Effect

Definition: The way in which information is presented (framed) alters decision-making, independent of the actual content of that information.

ED example: "This is a 45-year-old woman with anxiety and palpitations" creates a very different cognitive frame than "this is a 45-year-old woman with new-onset palpitations and exertional presyncope." The clinical facts may be identical; the physician's subsequent behaviour will differ substantially.

Why it matters in the ED: Framing is embedded in every triage note, ambulance handover, and nursing assessment. The emergency clinician is almost never the first person to interact with and label the patient.

Debiasing strategy: Actively reset the frame. Treat every patient as if encountering them for the first time; re-read the triage note after, not before, your own assessment.


Search Satisficing (Satisfaction of Search)

Definition: The tendency to stop searching for pathology once something has been found - assuming that the identified abnormality explains the entire presentation.

ED example: A trauma patient with a right-sided rib fracture has a chest X-ray reported as "rib fracture, no pneumothorax." The haemothorax on the contralateral side is not noticed because the search ended after finding the expected abnormality.

Classic scenarios:

| Scenario | First finding | Missed finding |
|---|---|---|
| Trauma | One long-bone fracture | Second fracture (or C-spine injury) |
| Toxicology | Salicylate ingestion confirmed | Missed co-ingestion |
| Radiology | Obvious fracture on X-ray | Concurrent dislocation or pathological lesion |
| Sepsis | UTI identified as source | Concurrent bacteraemia or alternative source |

Debiasing strategy: Systematically complete the full assessment before closing - read the entire film, re-examine the full patient, consider whether all abnormalities are explained by the identified diagnosis.


Representativeness Bias

Definition: Judging the probability of a diagnosis based on how closely the patient resembles the "prototype" of that condition, while ignoring base rates and atypical presentations.

ED example: Aortic dissection "should" present with tearing interscapular pain in a tall hypertensive man. The 65-year-old woman presenting with syncope and mild back discomfort does not fit the prototype - and is missed.

Closely related - Atypical Presentation Blindness: This is a particularly important variant in emergency medicine, where MI, PE, stroke, and aortic pathology all have well-recognised atypical presentations that are frequently missed precisely because they don't match the clinician's internal template.

High-risk groups for misses driven by representativeness bias:

- Women presenting with ACS
- Elderly patients with sepsis without fever
- Diabetic patients with painless MI
- Immunocompromised patients with atypical infection


Commission Bias (Action Bias)

Definition: The tendency to favour action over inaction, particularly in high-acuity situations.

ED example: Intubating a patient with moderate respiratory distress before attempting non-invasive ventilation, or administering thrombolytics for suspected PE before obtaining CTPA confirmation, driven by a sense of urgency.

Why it matters: Emergency medicine has a significant cultural bias toward action ("do something"). This is appropriate in true time-critical emergencies but can cause harm when applied indiscriminately. Unnecessary interventions carry real risks - procedural complications, medication adverse effects, premature commitment to a diagnosis.


Omission Bias

Definition: The mirror of commission bias - favouring inaction because the harm of acting seems more salient than the harm of not acting.

ED example: Failing to thrombolyse a patient with massive PE because the clinician is paralysed by fear of bleeding complications, despite the haemodynamic instability clearly tipping the risk-benefit calculation toward treatment.


Affective Bias (Affect Heuristic)

Definition: Emotional state - of the clinician or evoked by the patient - directly alters clinical decision-making in ways that are not rationally justified.

| Patient factor | Affective distortion |
|---|---|
| VIP patient | Over-investigation, over-treatment, deference to patient preferences even when clinically inappropriate |
| Homeless / intoxicated patient | Anchoring to social diagnosis, reduced diagnostic rigour, inadequate analgesia |
| Patient resembles family member | Emotional flooding, over-identification, impaired objectivity |
| Patient is hostile or aggressive | Cognitive distancing, reduced empathy, minimisation of complaint |
| Clinician's personal emotional state | Fatigue, moral distress, prior adverse event - all degrade analytical processing |

Debiasing strategy: Maintain metacognitive awareness of emotional reactions to patients. Ask: "Am I thinking differently about this patient because of how I feel about them rather than what their clinical picture shows?"


Yin-Yang Out Bias

Definition: The erroneous belief that a patient who has already undergone extensive investigation must have "nothing serious" - because if they did, it would have been found already.

ED example: A patient with ten prior ED visits for abdominal pain, all discharged without diagnosis, presents again. The clinician assumes this is "the same thing as always" - and misses the mesenteric ischaemia.

Why it's dangerous in the ED: Frequent attenders are the patients in whom this error most often occurs. The clinical imperative is to ask: "Given this patient's history, what new or evolving diagnosis could present this way?"


Diagnosis Momentum

Definition: The tendency for a diagnosis to gain unquestioned authority as it is passed from clinician to clinician, becoming increasingly fixed regardless of the evidence supporting it.

ED example: A patient labelled "fibromyalgia flare" in the GP letter, repeated in the triage note, repeated in nursing assessment, is presented to the emergency clinician as "known fibromyalgia." The possibility that this presentation represents something new and serious is not entertained.

Debiasing strategy: Treat labels as hypotheses, not facts. Ask: "On what basis was this diagnosis made, and does it account for everything I'm seeing today?"


Sunk Cost Bias

Definition: Continuing down a diagnostic or management pathway because of investment already made in that path, rather than reassessing from first principles.

ED example: After three hours of workup for presumed viral illness, new clinical information suggests meningitis. The clinician is reluctant to pivot - they have already "committed" to the benign diagnosis.


Systemic Factors That Amplify Cognitive Bias in the ED

| Factor | Effect on cognitive processing |
|---|---|
| Fatigue | Reduces Type 2 capacity; increases reliance on Type 1 heuristics |
| High patient load / time pressure | Accelerates premature closure; promotes anchoring |
| Interruptions | Disrupts analytical reasoning; promotes error in complex tasks |
| Noise and environmental chaos | Depletes cognitive resources; impairs sustained analytical thinking |
| Serial decision-making | Decision fatigue increases error rates across a shift |
| Team hierarchy | Junior clinicians may defer to senior diagnoses (diagnosis momentum) |
| Electronic records with pre-filled templates | Anchoring from auto-populated fields; search satisficing in pre-ticked lists |

Debiasing Strategies: Individual and System Level

Individual Strategies

| Strategy | Mechanism |
|---|---|
| Metacognition | Consciously reflecting on one's own thought process - "What type of thinking am I using right now?" |
| Diagnostic timeout | Deliberate pause before disposition to ask: "Does my diagnosis explain everything? What am I missing?" |
| Consider the alternative | Explicitly generate 2-3 alternative diagnoses before committing |
| Forced reframing | Consciously discard the received framing and reconstruct the clinical picture from raw data |
| Re-examination at reassessment | Use the reassessment as a second diagnostic opportunity rather than purely a safety check |

System-Level Strategies

| Strategy | Example |
|---|---|
| Cognitive aids and decision support tools | Validated clinical decision rules (HEART, PERC, Wells), embedded in workflow |
| Structured handover (e.g., ISBAR) | Reduces framing effects in clinical communication |
| Checklists and safety nets | Formalise the complete assessment before discharge |
| Morbidity and mortality review | Identifies recurrent patterns of cognitive error at departmental level |
| Simulation-based training | Builds metacognitive skills in a safe environment |

ACEM Fellowship Implications

Written Paper Considerations

Cognitive bias is highly examinable in both the written and OSCE components of the ACEM Fellowship. In the written paper, questions may ask you to:

- Identify the type of cognitive bias operating in a described clinical scenario
- Discuss system and individual strategies to mitigate cognitive error
- Relate cognitive bias to patient safety, adverse events, and near-misses in the ED

Be prepared to define each bias precisely and to give a specific, clinically credible emergency medicine example. Vague examples, or examples drawn from anaesthetic or surgical practice, will not demonstrate sufficient currency in emergency practice.

OSCE and Clinical Applications

In OSCE stations involving a diagnostic reasoning challenge, collapsed or undifferentiated presentations, or cases with a "misleading" framing (e.g., a psychiatric patient with organic pathology, or a patient with known substance dependence presenting in pain), the examiner is specifically assessing whether the candidate:

- Recognises when prior labelling may be distorting their assessment
- Actively generates a broad differential rather than accepting the offered frame
- Demonstrates a structured, systematic approach even under time pressure
- Can articulate why a particular error might occur, not just that it occurred

Disposition and Safety-Netting

Cognitive bias does not end at diagnosis - it extends to disposition. Common errors include:

- Anchoring on admission avoidance in a system under pressure, leading to inappropriate discharge
- Availability bias inflating or deflating perceived risk at the time of the discharge decision
- Affective bias toward patients who are perceived as demanding or difficult

Safe discharge requires the same metacognitive vigilance as initial diagnosis. A structured reassessment, explicit safety-netting instructions, and a clear documented decision rationale are both clinical and medicolegal protections.

Second Victim and Institutional Context

Cognitive errors that result in adverse outcomes can produce profound professional and psychological distress - the "second victim" phenomenon. Emergency departments should maintain formal structures for:

- Post-event peer debriefing
- Case review to identify whether systemic cognitive error contributed
- Supportive pathways (peer support, psychology, employee assistance) for affected clinicians

Recognition of cognitive bias is a patient safety imperative and a professional competency expected at the fellowship level.
