TY - JOUR
T1 - Use of the Rand Structured Implicit Review Instrument for quality of care assessment
AU - Lefevre, F.
AU - Feinglass, J.
AU - Yarnold, P. R.
AU - Martin, G. J.
AU - Webster, J.
N1 - Funding Information:
From *the Division of General Internal Medicine, †the Center for Health Services and Policy Research, and §the Buehler Center on Aging, Northwestern University Medical School, Chicago, Illinois, and ‡the Department of Psychology, University of Illinois at Chicago. Supported in part by a grant from the Northwestern Memorial Foundation, Chicago, Illinois. Presented in part at the Midwestern Meeting of the Society of General Internal Medicine, Chicago, IL, November 1991, and published in abstract form in Clinical Research 39:746A, 1991. Correspondence: Frank Lefevre, MD, Division of General Internal Medicine, Northwestern University Medical School, 750 North Lake Shore Drive, Suite 626, Chicago, IL 60611.
PY - 1993
Y1 - 1993
N2 - The Rand Structured Implicit Review Instrument is a 27-item instrument that rates process quality of care for patients with five common illnesses. This study reports on the use of this instrument for hospitalized patients with long lengths of stay. A total of 120 medical records were reviewed by multiple physician reviewers for patients discharged with congestive heart failure, acute myocardial infarction, and pneumonia. Mean inter-rater reliability was assessed for a subsample of six records by kappa score. A multiple regression analysis was used to estimate the relationship between process ratings for the quality of documentation, assessment, monitoring, and therapy and overall quality of care scores, controlled for physician judgments about patients' prognosis and selected patient characteristics. Each reviewer also evaluated the instrument. Mean kappa for trichotomized ratings of quality of care was 0.50. The majority of all quality of care ratings were in the good or very good range (77.5%). The full regression model, including process subscale quality ratings, prognostic items, and patient characteristics, accounted for 38% of the total variance in the quality of care ratings. Items measuring the quality of assessment (p < 0.0001), therapy (p < 0.02) and monitoring (p < 0.01) were significant. Physicians accepted the use of such a form moderately well. The Rand quality of care form shows consistency in rating overall quality of care and individual dimensions of quality. Achieving a high level of inter-rater reliability is difficult with implicit review. By focusing on specific areas of potentially deficient care, structured review instruments can improve clinical quality improvement efforts.
AB - The Rand Structured Implicit Review Instrument is a 27-item instrument that rates process quality of care for patients with five common illnesses. This study reports on the use of this instrument for hospitalized patients with long lengths of stay. A total of 120 medical records were reviewed by multiple physician reviewers for patients discharged with congestive heart failure, acute myocardial infarction, and pneumonia. Mean inter-rater reliability was assessed for a subsample of six records by kappa score. A multiple regression analysis was used to estimate the relationship between process ratings for the quality of documentation, assessment, monitoring, and therapy and overall quality of care scores, controlled for physician judgments about patients' prognosis and selected patient characteristics. Each reviewer also evaluated the instrument. Mean kappa for trichotomized ratings of quality of care was 0.50. The majority of all quality of care ratings were in the good or very good range (77.5%). The full regression model, including process subscale quality ratings, prognostic items, and patient characteristics, accounted for 38% of the total variance in the quality of care ratings. Items measuring the quality of assessment (p < 0.0001), therapy (p < 0.02) and monitoring (p < 0.01) were significant. Physicians accepted the use of such a form moderately well. The Rand quality of care form shows consistency in rating overall quality of care and individual dimensions of quality. Achieving a high level of inter-rater reliability is difficult with implicit review. By focusing on specific areas of potentially deficient care, structured review instruments can improve clinical quality improvement efforts.
KW - Implicit review
KW - Inter-rater reliability
KW - Outcome bias
KW - Peer review
KW - Quality of care
UR - http://www.scopus.com/inward/record.url?scp=0027243460&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0027243460&partnerID=8YFLogxK
U2 - 10.1097/00000441-199304000-00005
DO - 10.1097/00000441-199304000-00005
M3 - Article
C2 - 8475947
AN - SCOPUS:0027243460
VL - 305
SP - 222
EP - 228
JO - American Journal of the Medical Sciences
JF - American Journal of the Medical Sciences
SN - 0002-9629
IS - 4
ER -