See how emergency medicine residents used DDx to prepare for their oral board exams — and what the data showed.
Overview
Morristown Medical Center's Emergency Medicine Residency Program evaluated DDx, an AI-driven case learning platform by Sketchy, as a supplemental tool for oral board preparation. Over 13 weeks, 30 EM residents (PGY-1 to PGY-3) completed assigned DDx cases alongside traditional oral board sessions. Pre- and post-intervention surveys assessed engagement, confidence, comfort, cognitive load, usability, and study habits — providing one of the most structured evaluations of DDx in a residency setting to date.
The challenge
The residency program sought a scalable case-based learning solution that would allow emergency medicine residents to practice clinical reasoning and diagnostic decision-making through realistic patient encounters while supplementing traditional oral board preparation.
Emergency medicine oral board exams require residents to think out loud, manage uncertainty, and demonstrate systematic reasoning under scrutiny. Traditional preparation (case books, study partners, self-directed review) is inconsistent, time-consuming, and rarely simulates the real exam experience well. It also requires multiple faculty examiners and, in many programs, access to simulation labs, making it resource-intensive to deliver at scale. Program directors needed a scalable, accessible solution that could supplement in-person sessions while consistently building structured clinical reasoning.
The solution
DDx offered meaningful accessibility and resource savings compared with traditional oral board preparation. Because residents could practice asynchronously and at scale, the platform reduced reliance on faculty time and specialized facilities, ensured more consistent preparation opportunities, and eased scheduling for the residency program. Residents also reported notably lower stress with DDx than in traditional oral board review sessions.
Residents were assigned 2 DDx cases per week for 13 weeks (26 cases assigned in total) and completed them asynchronously alongside 3 in-person oral board sessions. Pre- and post-surveys measured engagement, confidence, comfort, cognitive load, and usability, and performance was analyzed by case completion volume and residency year to identify dose-response effects.
The results
Confidence in managing undifferentiated cases improved by +0.6 on a 5-point scale — a moderate, educationally meaningful effect. Residents who completed 15 or more cases showed the greatest benefits, with those completing 15-20 cases showing the largest confidence gains (+1.0 on the 5-point scale). 87% of residents felt that AI-based educational tools would be a useful supplement to their training, and 80% rated DDx as engaging or very engaging. PGY-1 residents showed the largest average confidence gains across all residency years.
Testimonials
The Morristown pilot provided meaningful evidence for DDx's role as a scalable and accessible complement to traditional oral board preparation. While DDx reinforces clinical reasoning effectively, the study also identified opportunities to further enhance oral board readiness — including the development of residency-level cases designed to incorporate higher cognitive load, time pressure, and examiner-style formatting. Those cases are now in development, and tailoring DDx content to this level of training is expected to yield even greater improvements in both clinical reasoning and oral board preparation outcomes.