The following is a summary of “Evaluating the Quality of Narrative Feedback for Entrustable Professional Activities in a Surgery Residency Program,” published in the December 2024 issue of Surgery by Fernandes et al.
Competency-based medical education requires continuous formative assessments and feedback on learners’ performance.
Researchers conducted a retrospective study to evaluate the quality of narrative feedback in a surgery residency program.
They analyzed 3,900 entrustable professional activity (EPA) assessments from the Surgical Foundations curriculum at Queen’s University (2017–2022). Two raters independently assessed the quality of narrative feedback using the Quality of Assessment of Learning (QuAL) score (0–5).
The results showed that 57% (2,229/3,900) of the assessments contained narrative feedback, with a mean QuAL score of 2.16 ± 1.49. Of these, 72.4% (1,614/2,229) detailed the resident’s performance, 42.7% (951/2,229) included suggestions for improvement, and 22.4% (499/2,229) connected those suggestions to the evidence. Although the correlation with time reached statistical significance, it was negligible, indicating no meaningful change in feedback quality (r = 0.067, P = 0.002). Feedback quality was lower for assessments completed by attending physicians (2.04 ± 1.48) than by medical students (3.13 ± 1.12, P < 0.001) or clinical fellows (2.47 ± 1.54, P < 0.001), for assessments completed more than 1 month after the encounter (1.85 ± 1.48 vs 2.23 ± 1.49, P < 0.001), and for residents not entrusted to perform the EPA (2.13 ± 1.45 vs 2.35 ± 1.66, P = 0.008). There was no significant difference in feedback quality between direct and indirect observations (2.18 ± 1.47 vs 2.06 ± 1.54, P = 0.153).
They concluded that narrative feedback quality in surgery residency was fair, with no meaningful improvement over 5 years.