Using Anonymity and Communal Efforts to Improve Quality of Crowdsourced Feedback

Julie S. Hui, Amos Glenn, Rachel Jue, Elizabeth M. Gerber, Steven P. Dow

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Student entrepreneurs struggle to collect feedback on their product pitches in a classroom setting due to a lack of time, money, and access to motivated feedback providers. Online social networks present a unique opportunity for entrepreneurial students to quickly reach feedback providers by leveraging their online social capital. To better understand how to improve crowdsourced online pitch feedback, we performed an experiment testing the effect of online anonymity on pitch feedback quality and quantity. We also tested a communal feedback method (evenly distributing feedback providers from the class's collective online social networks across teams) so that all teams would benefit from a useful amount of feedback, rather than some teams receiving much more feedback than others. We found that feedback providers in the anonymous condition gave significantly more specific criticism and specific praise, which students rated as more useful. Furthermore, the communal feedback method helped all teams receive sufficient feedback to edit their pitches. This research contributes to the crowdsourcing community an empirical investigation of how crowds reached through online social networks can help student entrepreneurs obtain authentic feedback to improve their work.
Original language: English (US)
Title of host publication: Proceedings of the Third AAAI Conference on Human Computation and Crowdsourcing
Editors: Elizabeth Gerber, Panos Ipeirotis
Publisher: AAAI Press
Pages: 72-82
Number of pages: 11
ISBN (Print): 978-1577357414
State: Published - 2015
