Quantitative assessment of students' revision processes

Lisa R. Volpatti, Alex Jordan Hanson, Jennifer M. Schall, Jesse N. Dunietz, Amanda X. Chen, Rohan Chitnis, Eric J. Alm, Alison F. Takemura, Diana M. Chien

Research output: Contribution to journal › Conference article › peer-review


Abstract

Communication is a crucial skill set for engineers, yet graduates [1]-[3] and their employers [4]-[8] continue to report that graduates complete their undergraduate or graduate programs underprepared for effective communication. Thus, technical communication training merits deeper investigation and creative solutions. At the 2017 ASEE Meeting, we introduced the MIT School of Engineering Communication Lab, a discipline-specific technical communication service that is akin to a writing center but embedded within engineering departments [9]. By drawing on the expertise of graduate student and postdoctoral peer coaches within a given discipline, the Communication Lab provides a scalable, content-aware solution with the benefits of just-in-time, one-on-one [10], and peer [11] training. When we first introduced this model, we offered easy-to-record metrics for the Communication Lab's effectiveness (such as usage statistics and student and faculty opinion surveys), of the kind commonly used to assess writing centers [12], [13]. Here we present a formal quantitative study of the effectiveness of Communication Lab coaching. We designed a pre-post study for two related tasks: personal statements for applications to graduate school and to graduate fellowships. We designed an analytic rubric with seven categories (strategic alignment, audience awareness, context, evidence, organization/flow, language mechanics, and visual impact) and tested it to ensure inter-rater reliability. Over one semester, we collected and anonymized 119 personal statement drafts from 47 unique Communication Lab clients across four engineering departments. Peer coaches scored the drafts with the rubric, and we developed a statistical model based on maximum likelihood to identify significant score changes in individual rubric categories across trajectories of sequential drafts. In addition, post-session surveys of clients and their peer coaches provided insight into clients' qualitative experiences during coaching sessions. Taken together, our quantitative and qualitative findings suggest that our peer coaches are most effective in supporting organization/flow, strategic alignment, and the use of appropriate evidence; this aligns with our program's emphasis on high-level communication skills. Our results also suggest that a major factor in coaching efficacy is coach-client discussion of major takeaways from a session: rubric category scores were more likely to improve across a drafting trajectory when a category had been identified as a takeaway. Hence, we show quantitative evidence that through collaborative conversations, technical peer coaches can guide clients to identify and effectively revise key areas for improvement. Finally, by gathering a sizable dataset and developing analytical tools, we have laid the groundwork for future quantitative writing assessments by both our program and others. We argue that although inter-rater variability poses a challenge, statistical methods and skill-based assessments of authentic communication tasks can provide both insight into students' writing and revision abilities and direction for improving communication resources.
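
The abstract does not state which statistic was used to verify the rubric's inter-rater reliability. Below is a minimal sketch of one common choice for ordinal rubric scores, quadratic-weighted Cohen's kappa; the two raters, the 1-5 scale, and all scores shown are hypothetical illustrations, not data from the study.

```python
# Minimal sketch of an inter-rater reliability check for rubric scores.
# Assumptions (not specified in the abstract): two raters, a 1-5 integer
# scale per rubric category, and quadratic-weighted Cohen's kappa as the
# agreement statistic.
from sklearn.metrics import cohen_kappa_score

# Hypothetical scores from two peer coaches for one rubric category
rater_a = [3, 4, 2, 5, 3, 4, 1, 3]
rater_b = [3, 3, 2, 5, 4, 4, 2, 3]

# Quadratic weights penalize large disagreements more than near-misses,
# which suits ordinal rubric scales.
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"Weighted kappa: {kappa:.2f}")  # values above ~0.6 are often read
                                       # as substantial agreement
```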
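
The abstract likewise leaves the maximum-likelihood model unspecified. As an illustration only, the sketch below assumes that each draft-to-draft transition in a rubric category is a Bernoulli "improved / not improved" outcome and applies a likelihood-ratio test of the estimated improvement rate against a null rate; the improvement_lrt function, the null rate p0, and the example data are all hypothetical stand-ins for the paper's actual model.

```python
# Minimal sketch of a maximum-likelihood test for score improvement across
# sequential drafts, under the Bernoulli assumption described above.
import math
from scipy.stats import chi2

def improvement_lrt(transitions, p0=0.5):
    """transitions: list of 0/1 flags, one per draft-to-draft step."""
    n, k = len(transitions), sum(transitions)
    p_hat = k / n  # maximum-likelihood estimate of the improvement rate

    def loglik(p):
        # Clamp p away from 0 and 1 to guard against log(0)
        eps = 1e-12
        p = min(max(p, eps), 1 - eps)
        return k * math.log(p) + (n - k) * math.log(1 - p)

    # The likelihood-ratio statistic is asymptotically chi-squared, 1 dof
    lr = 2 * (loglik(p_hat) - loglik(p0))
    return p_hat, chi2.sf(lr, df=1)

# Hypothetical data: ten transitions for one rubric category
p_hat, p_value = improvement_lrt([1, 1, 0, 1, 1, 1, 0, 1, 1, 1])
print(f"MLE improvement rate: {p_hat:.2f}, LRT p-value: {p_value:.3f}")
```

A hierarchical or regression-based model would let the improvement rate vary by client, coach, or category; the single-rate version above is only the simplest likelihood-based test consistent with the abstract's description.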

Original language: English (US)
Article number: 1161
Journal: ASEE Annual Conference and Exposition, Conference Proceedings
Volume: 2020-June
State: Published - Jun 22, 2020
Event: 2020 ASEE Virtual Annual Conference, ASEE 2020 - Virtual, Online
Duration: Jun 22, 2020 - Jun 26, 2020

ASJC Scopus subject areas

  • General Engineering
