TY - JOUR
T1 - Promoting Responsible Electronic Documentation
T2 - Validity Evidence for a Checklist to Assess Progress Notes in the Electronic Health Record
AU - Bierman, Jennifer A.
AU - Hufmeyer, Kathryn Kinner
AU - Liss, David T.
AU - Weaver, A. Charlotta
AU - Heiman, Heather L.
N1 - Publisher Copyright:
© 2017 Taylor & Francis Group, LLC.
PY - 2017/10/2
Y1 - 2017/10/2
N2 - Construct: We aimed to develop an instrument to measure the quality of inpatient electronic health record– (EHR–) generated progress notes without requiring raters to review the detailed chart or know the patient. Background: Notes written in EHRs have generated criticism for being unnecessarily long and redundant, perpetuating inaccuracy and obscuring providers' clinical reasoning. Available assessment tools either focus on outpatient progress notes or require chart review by raters to develop familiarity with the patient. Approach: We used medical literature, local expert review, and attending focus groups to develop and refine an instrument to evaluate inpatient progress notes. We measured interrater reliability and scored the selected-response elements of the checklist for a sample of 100 progress notes written by PGY-1 trainees on the general medicine service. Results: We developed an instrument with 18 selected-response items and four open-ended items to measure the quality of inpatient progress notes written in the EHR. The mean Cohen's kappa coefficient demonstrated good agreement at .67. The mean note score was 66.9% of maximum possible points (SD = 10.6, range = 34.4%–93.3%). Conclusions: We present validity evidence in the domains of content, internal structure, and response process for a new checklist for rating inpatient progress notes. The scored checklist can be completed in approximately 7 minutes by a rater who is not familiar with the patient and can be done without extensive chart review. We further demonstrate that trainee notes show substantial room for improvement.
AB - Construct: We aimed to develop an instrument to measure the quality of inpatient electronic health record– (EHR–) generated progress notes without requiring raters to review the detailed chart or know the patient. Background: Notes written in EHRs have generated criticism for being unnecessarily long and redundant, perpetuating inaccuracy and obscuring providers' clinical reasoning. Available assessment tools either focus on outpatient progress notes or require chart review by raters to develop familiarity with the patient. Approach: We used medical literature, local expert review, and attending focus groups to develop and refine an instrument to evaluate inpatient progress notes. We measured interrater reliability and scored the selected-response elements of the checklist for a sample of 100 progress notes written by PGY-1 trainees on the general medicine service. Results: We developed an instrument with 18 selected-response items and four open-ended items to measure the quality of inpatient progress notes written in the EHR. The mean Cohen's kappa coefficient demonstrated good agreement at .67. The mean note score was 66.9% of maximum possible points (SD = 10.6, range = 34.4%–93.3%). Conclusions: We present validity evidence in the domains of content, internal structure, and response process for a new checklist for rating inpatient progress notes. The scored checklist can be completed in approximately 7 minutes by a rater who is not familiar with the patient and can be done without extensive chart review. We further demonstrate that trainee notes show substantial room for improvement.
KW - EHR
KW - assessment
KW - progress notes
KW - trainees
UR - http://www.scopus.com/inward/record.url?scp=85019203054&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85019203054&partnerID=8YFLogxK
U2 - 10.1080/10401334.2017.1303385
DO - 10.1080/10401334.2017.1303385
M3 - Article
C2 - 28497983
AN - SCOPUS:85019203054
SN - 1040-1334
VL - 29
SP - 420
EP - 432
JO - Teaching and Learning in Medicine
JF - Teaching and Learning in Medicine
IS - 4
ER -