AI may be reshaping healthcare, but few people sit at the intersection of frontline medicine and product design as clearly as Dr. Kalie Dove-Maguire. An emergency physician turned health-tech leader, Kalie now leads product and strategy at Evidently — a company building transparent, trustworthy clinical decision support tools designed with one core principle in mind: AI should support clinical judgment, not override it.
In this new episode of How I Doctor, Offcall co-founder Dr. Graham Walker sits down with Kalie for an unfiltered conversation about how to build AI that physicians actually want in their workflow — not just another layer of noise, flags, or friction.
For Kalie, the turning point was a transplant case in residency where critical pulmonary hypertension data was overlooked simply because it wasn’t in the “Impression” section of a report. That moment — a needle in a haystack with devastating consequences — made her realize that the problem wasn’t judgment or effort. It was workflow.
Years later, she saw the opposite problem: workflows built by engineers with little sense of the cognitive demands or safety requirements of clinical care. That disconnect pushed her to focus on tools that “show their work,” cite evidence, and preserve the clinician’s ability to decide what is true.
Kalie and Graham also dig into one of the most misunderstood topics in healthcare today: the difference between predictive models, LLMs, and decision support. As Kalie puts it, saying “I don’t like AI” is like saying “I don’t like applications.”
Kalie has spent nearly a decade gradually shifting from 80% clinical work to 80% product work. She and Graham talk about navigating non-competes, keeping part-time shifts to maintain identity and confidence, and why clinicians should never fear closing a door.
This episode isn’t just about AI. It’s about judgment and the lived experience of building tools that respect how medicine really works. It’s a blueprint for how clinicians can shape the future of decision support from the inside — and why the best AI products begin with the realities of patient care, not software architecture.
Thank you to our wonderful sponsors for supporting the podcast:
Abridge - AI for clinical conversations: https://www.abridge.com/
Evidently - Leading AI-powered clinical data intelligence: https://evidently.com/
Clinical judgment isn’t a vibe, a hunch, or a stylistic choice. It’s a clinician’s ongoing ability to detect inconsistencies, challenge assumptions, and recognize when something doesn’t add up. Kalie argues that judgment is fundamentally about not getting fooled. Not by the patient’s story, not by incomplete data, and not by system-generated suggestions. AI can assist with surfacing information, but the final decision remains human.
Evidently’s philosophy is simple: trust is earned, not given. Kalie describes building technology that always cites its sources, points back to original evidence, and never forces a diagnosis. When AI identifies conflicting data, it should surface both and then allow the clinician to decide what’s true rather than making the choice for them. Tools that respect judgment build adoption; tools that obscure reasoning erode it.
Predictive models, natural language tools, and LLM-based summarization all do fundamentally different things. Treating them as a single bucket leads to fear, misunderstanding, and misplaced expectations. Kalie breaks down how different models serve different workflow needs, and why the most powerful systems orchestrate these components rather than blending them into a mysterious black box.
Kalie’s decade-long journey from full-time EM to health tech shows that clinicians already have the pattern-recognition, problem-solving, and communication skills needed in product roles. The key is understanding the problem deeply, filling knowledge gaps intentionally, and maintaining transparency with colleagues and employers. Her advice is simple and surprisingly universal: take risks and protect your integrity.
On/Offcall is the weekly dose of information and inspiration that every physician needs.
To make sure you don’t miss an episode of How I Doctor, subscribe to the show wherever you listen to podcasts. You can also read the full transcript of the episode below.
Offcall exists to help restore balance in medicine and improve the wealth and wellbeing of physicians. Sign up for Offcall here to bring about more physician compensation transparency and join our physician movement.
Kalie Dove-Maguire: I think judgment is like being able to call BS on things. We do this on our colleagues, we do this on the patient, everybody. Our entire thing is like, do not get fooled. Don't get fooled by the system, the test, the patient, none of these things. Our goal is to seek the truth at the expense almost of everything else. It's that ability to call BS or flag when there's something that's abnormal, I think that's really what the clinical judgment is. So our technology goes in, reads the whole medical record and then packages it and summarizes it. When we present this for the first time, some people always ask us, well, is it making a diagnosis? And the answer is no. And so then they're like, well, what happens if there's conflicting diagnoses? Great question. It's going to tell you that both are present in the chart and then that's up to your judgment to decide what is true or false.