Promoting Responsible Electronic Documentation

Validity Evidence for a Checklist to Assess Progress Notes in the Electronic Health Record

Jennifer A Bierman*, Kathryn Kinner Hufmeyer, David Liss, Charlotta Weaver, Heather L Heiman

*Corresponding author for this work

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

Construct: We aimed to develop an instrument to measure the quality of inpatient electronic health record (EHR)–generated progress notes without requiring raters to review the detailed chart or know the patient. Background: Notes written in EHRs have generated criticism for being unnecessarily long and redundant, perpetuating inaccuracy and obscuring providers' clinical reasoning. Available assessment tools either focus on outpatient progress notes or require chart review by raters to develop familiarity with the patient. Approach: We used medical literature, local expert review, and attending focus groups to develop and refine an instrument to evaluate inpatient progress notes. We measured interrater reliability and scored the selected-response elements of the checklist for a sample of 100 progress notes written by PGY-1 trainees on the general medicine service. Results: We developed an instrument with 18 selected-response items and four open-ended items to measure the quality of inpatient progress notes written in the EHR. The mean Cohen's kappa coefficient demonstrated good agreement at .67. The mean note score was 66.9% of maximum possible points (SD = 10.6, range = 34.4%–93.3%). Conclusions: We present validity evidence in the domains of content, internal structure, and response process for a new checklist for rating inpatient progress notes. The scored checklist can be completed in approximately 7 minutes by a rater who is not familiar with the patient and can be done without extensive chart review. We further demonstrate that trainee notes show substantial room for improvement.

Original language: English (US)
Pages (from-to): 420-432
Number of pages: 13
Journal: Teaching and Learning in Medicine
Volume: 29
Issue number: 4
DOI: 10.1080/10401334.2017.1303385
State: Published - Oct 2, 2017

Keywords

  • EHR
  • assessment
  • progress notes
  • trainees

ASJC Scopus subject areas

  • Education

Cite this

@article{d4e57844480941f38ec7df49b77b5132,
title = "Promoting Responsible Electronic Documentation: Validity Evidence for a Checklist to Assess Progress Notes in the Electronic Health Record",
abstract = "Construct: We aimed to develop an instrument to measure the quality of inpatient electronic health record (EHR)–generated progress notes without requiring raters to review the detailed chart or know the patient. Background: Notes written in EHRs have generated criticism for being unnecessarily long and redundant, perpetuating inaccuracy and obscuring providers' clinical reasoning. Available assessment tools either focus on outpatient progress notes or require chart review by raters to develop familiarity with the patient. Approach: We used medical literature, local expert review, and attending focus groups to develop and refine an instrument to evaluate inpatient progress notes. We measured interrater reliability and scored the selected-response elements of the checklist for a sample of 100 progress notes written by PGY-1 trainees on the general medicine service. Results: We developed an instrument with 18 selected-response items and four open-ended items to measure the quality of inpatient progress notes written in the EHR. The mean Cohen's kappa coefficient demonstrated good agreement at .67. The mean note score was 66.9{\%} of maximum possible points (SD = 10.6, range = 34.4{\%}–93.3{\%}). Conclusions: We present validity evidence in the domains of content, internal structure, and response process for a new checklist for rating inpatient progress notes. The scored checklist can be completed in approximately 7 minutes by a rater who is not familiar with the patient and can be done without extensive chart review. We further demonstrate that trainee notes show substantial room for improvement.",
keywords = "EHR, assessment, progress notes, trainees",
author = "Bierman, {Jennifer A} and Hufmeyer, {Kathryn Kinner} and David Liss and Charlotta Weaver and Heiman, {Heather L}",
year = "2017",
month = "10",
day = "2",
doi = "10.1080/10401334.2017.1303385",
language = "English (US)",
volume = "29",
pages = "420--432",
journal = "Teaching and Learning in Medicine",
issn = "1040-1334",
publisher = "Routledge",
number = "4",

}

TY - JOUR

T1 - Promoting Responsible Electronic Documentation

T2 - Validity Evidence for a Checklist to Assess Progress Notes in the Electronic Health Record

AU - Bierman, Jennifer A

AU - Hufmeyer, Kathryn Kinner

AU - Liss, David

AU - Weaver, Charlotta

AU - Heiman, Heather L

PY - 2017/10/2

Y1 - 2017/10/2

AB - Construct: We aimed to develop an instrument to measure the quality of inpatient electronic health record (EHR)–generated progress notes without requiring raters to review the detailed chart or know the patient. Background: Notes written in EHRs have generated criticism for being unnecessarily long and redundant, perpetuating inaccuracy and obscuring providers' clinical reasoning. Available assessment tools either focus on outpatient progress notes or require chart review by raters to develop familiarity with the patient. Approach: We used medical literature, local expert review, and attending focus groups to develop and refine an instrument to evaluate inpatient progress notes. We measured interrater reliability and scored the selected-response elements of the checklist for a sample of 100 progress notes written by PGY-1 trainees on the general medicine service. Results: We developed an instrument with 18 selected-response items and four open-ended items to measure the quality of inpatient progress notes written in the EHR. The mean Cohen's kappa coefficient demonstrated good agreement at .67. The mean note score was 66.9% of maximum possible points (SD = 10.6, range = 34.4%–93.3%). Conclusions: We present validity evidence in the domains of content, internal structure, and response process for a new checklist for rating inpatient progress notes. The scored checklist can be completed in approximately 7 minutes by a rater who is not familiar with the patient and can be done without extensive chart review. We further demonstrate that trainee notes show substantial room for improvement.

KW - EHR

KW - assessment

KW - progress notes

KW - trainees

UR - http://www.scopus.com/inward/record.url?scp=85019203054&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85019203054&partnerID=8YFLogxK

U2 - 10.1080/10401334.2017.1303385

DO - 10.1080/10401334.2017.1303385

M3 - Article

VL - 29

SP - 420

EP - 432

JO - Teaching and Learning in Medicine

JF - Teaching and Learning in Medicine

SN - 1040-1334

IS - 4

ER -