TY - JOUR
T1 - Quantitative assessment of students' revision processes
AU - Volpatti, Lisa R.
AU - Hanson, Alex Jordan
AU - Schall, Jennifer M.
AU - Dunietz, Jesse N.
AU - Chen, Amanda X.
AU - Chitnis, Rohan
AU - Alm, Eric J.
AU - Takemura, Alison F.
AU - Chien, Diana M.
N1 - Publisher Copyright:
© American Society for Engineering Education 2020.
PY - 2020/6/22
Y1 - 2020/6/22
AB - Communication is a crucial skill set for engineers, yet graduates [1]-[3] and their employers [4]-[8] continue to report that graduates complete their undergraduate or graduate programs underprepared for effective communication. Thus, technical communication training merits deeper investigation and creative solutions. At the 2017 ASEE Meeting, we introduced the MIT School of Engineering Communication Lab, a discipline-specific technical communication service that is akin to a writing center but embedded within engineering departments [9]. By drawing on the expertise of graduate student and postdoctoral peer coaches within a given discipline, the Communication Lab provides a scalable, content-aware solution with the benefits of just-in-time, one-on-one [10], and peer [11] training. When we first introduced this model, we offered easy-to-record metrics of the Communication Lab's effectiveness (such as usage statistics and student and faculty opinion surveys), of the kind commonly used to assess writing centers [12], [13]. Here we present a formal quantitative study of the effectiveness of Communication Lab coaching. We designed a pre-/post-test study of two related tasks: personal statements for applications to graduate school and to graduate fellowships. We developed an analytic rubric with seven categories (strategic alignment, audience awareness, context, evidence, organization/flow, language mechanics, and visual impact) and tested it for inter-rater reliability. Over one semester, we collected and anonymized 119 personal statement drafts from 47 unique Communication Lab clients across four engineering departments. Peer coaches scored the drafts against the rubric, and we developed a maximum-likelihood statistical model to identify significant score changes in individual rubric categories across trajectories of sequential drafts. In addition, post-session surveys of clients and their peer coaches provided insight into clients' qualitative experiences during coaching sessions. Taken together, our quantitative and qualitative findings suggest that our peer coaches are most effective in supporting organization/flow, strategic alignment, and the use of appropriate evidence; this aligns with our program's emphasis on high-level communication skills. Our results also suggest that a major factor in coaching efficacy is coach-client discussion of a session's major takeaways: rubric category scores were more likely to improve across a drafting trajectory when a category had been identified as a takeaway. Hence, we offer quantitative evidence that, through collaborative conversations, technical peer coaches can guide clients to identify and effectively revise key areas for improvement. Finally, by gathering a sizable dataset and developing analytical tools, we have laid the groundwork for future quantitative writing assessments by both our program and others. We argue that although inter-rater variability poses a challenge, statistical methods and skill-based assessments of authentic communication tasks can provide both insight into students' writing and revision abilities and direction for improving communication resources.
UR - http://www.scopus.com/inward/record.url?scp=85095767702&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85095767702&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85095767702
SN - 2153-5965
VL - 2020-June
JO - ASEE Annual Conference and Exposition, Conference Proceedings
JF - ASEE Annual Conference and Exposition, Conference Proceedings
M1 - 1161
T2 - 2020 ASEE Virtual Annual Conference, ASEE 2020
Y2 - 22 June 2020 through 26 June 2020
ER -