TY - JOUR
T1 - The Revised METRIQ Score
T2 - A Quality Evaluation Tool for Online Educational Resources
AU - Colmers-Gray, Isabelle N.
AU - Krishnan, Keeth
AU - Chan, Teresa M.
AU - Trueger, N. Seth
AU - Paddock, Michael
AU - Grock, Andrew
AU - Zaver, Fareen
AU - Thoma, Brent
N1 - Publisher Copyright:
© 2019 by the Society for Academic Emergency Medicine
PY - 2019/10/1
Y1 - 2019/10/1
N2 - Background: With the rapid proliferation of online medical education resources, quality evaluation is increasingly critical. The Medical Education Translational Resources: Impact and Quality (METRIQ) study evaluated the METRIQ-8 quality assessment instrument for blogs and collected feedback to improve it. Methods: As part of the larger METRIQ study, participants rated the quality of five blog posts on clinical emergency medicine topics using the eight-item METRIQ-8 score. Next, participants used a 7-point Likert scale and free-text comments to evaluate the METRIQ-8 score on ease of use, clarity of items, and likelihood of recommending it to others. Descriptive statistics were calculated and comments were thematically analyzed to guide the development of a revised METRIQ (rMETRIQ) score. Results: A total of 309 emergency medicine attendings, residents, and medical students completed the survey. The majority of participants felt the METRIQ-8 score was easy to use (mean ± SD = 2.7 ± 1.1 out of 7, with 1 indicating strong agreement) and would recommend it to others (2.7 ± 1.3 out of 7, with 1 indicating strong agreement). The thematic analysis suggested clarifying ambiguous questions, shortening the 7-point scale, specifying scoring anchors for the questions, eliminating the “unsure” option, and grouping related questions. This analysis guided changes that resulted in the rMETRIQ score. Conclusion: Feedback on the METRIQ-8 score contributed to the development of the rMETRIQ score, which has improved clarity and usability. Further validity evidence on the rMETRIQ score is required.
AB - Background: With the rapid proliferation of online medical education resources, quality evaluation is increasingly critical. The Medical Education Translational Resources: Impact and Quality (METRIQ) study evaluated the METRIQ-8 quality assessment instrument for blogs and collected feedback to improve it. Methods: As part of the larger METRIQ study, participants rated the quality of five blog posts on clinical emergency medicine topics using the eight-item METRIQ-8 score. Next, participants used a 7-point Likert scale and free-text comments to evaluate the METRIQ-8 score on ease of use, clarity of items, and likelihood of recommending it to others. Descriptive statistics were calculated and comments were thematically analyzed to guide the development of a revised METRIQ (rMETRIQ) score. Results: A total of 309 emergency medicine attendings, residents, and medical students completed the survey. The majority of participants felt the METRIQ-8 score was easy to use (mean ± SD = 2.7 ± 1.1 out of 7, with 1 indicating strong agreement) and would recommend it to others (2.7 ± 1.3 out of 7, with 1 indicating strong agreement). The thematic analysis suggested clarifying ambiguous questions, shortening the 7-point scale, specifying scoring anchors for the questions, eliminating the “unsure” option, and grouping related questions. This analysis guided changes that resulted in the rMETRIQ score. Conclusion: Feedback on the METRIQ-8 score contributed to the development of the rMETRIQ score, which has improved clarity and usability. Further validity evidence on the rMETRIQ score is required.
UR - http://www.scopus.com/inward/record.url?scp=85085981642&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85085981642&partnerID=8YFLogxK
U2 - 10.1002/aet2.10376
DO - 10.1002/aet2.10376
M3 - Article
C2 - 31637356
AN - SCOPUS:85085981642
SN - 2472-5390
VL - 3
SP - 387
EP - 392
JO - AEM Education and Training
JF - AEM Education and Training
IS - 4
ER -