Identifying New Product Ideas: Waiting for the Wisdom of the Crowd or Screening Ideas in Real Time

Steven Hoornaert, Michel Ballings, Edward C. Malthouse, Dirk Van den Poel

Research output: Research - peer-review › Article

Abstract

Crowdsourcing ideas from consumers can enrich idea input in new product development. After a decade of initiatives (e.g., Starbucks' MyStarbucksIdea, Dell's IdeaStorm), the implications of crowdsourcing for idea generation are well understood, but challenges remain in dealing with the large volume of rapidly generated ideas produced in crowdsourcing communities. This study proposes a model that can assist managers in efficiently processing crowdsourced ideas by identifying the aspects of ideas that are most predictive of future implementation and identifies three sources of information available for an idea: its content, the contributor proposing it, and the crowd's feedback on the idea (the "3Cs"). These information sources differ in their time of availability (content/contributor information is available immediately; crowd feedback accumulates over time) and in the extent to which they comprise structured or unstructured data. This study draws from prior research to operationalize variables corresponding to the 3Cs and develops a new measure to quantify an idea's distinctiveness. Applying automated information retrieval methods (latent semantic indexing) and testing several linear methods (linear discriminant analysis, regularized logistic regression) and nonlinear machine-learning algorithms (stochastic adaptive boosting, random forests), this article identifies the variables that are most useful for predicting idea implementation in a crowdsourcing community for an IT product (Mendeley). Our results indicate that consideration of content and contributor information improves ranking performance by 22.6% to 26.0% over random idea selection, and that adding crowd-related information further improves performance by up to 48.1%. Crowd feedback is the best predictor of idea implementation, followed by idea content and distinctiveness, and the contributor's past idea-generation experience. Firms are advised to implement two idea selection support systems: one to rank new ideas in real time based on content and contributor experience, and another that integrates the crowd's idea evaluation after it has had sufficient time to provide feedback.
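The modeling approach described in the abstract (text features obtained via latent semantic indexing, combined with structured contributor and crowd variables, fed to classifiers such as random forests and used to rank ideas by predicted implementation) can be illustrated with a short, hypothetical sketch. The code below is not the authors' implementation: the file name ideas.csv, the column names (text, prior_ideas, votes, comments, implemented), and all parameter settings are assumptions made for illustration, and the top-decile lift at the end is only a toy analogue of the ranking-improvement figures reported above.

import numpy as np
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD  # latent semantic indexing (LSI)
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Hypothetical idea-level dataset: 'text' (idea content), 'prior_ideas'
# (contributor history), 'votes' and 'comments' (crowd feedback),
# 'implemented' (binary outcome).
ideas = pd.read_csv("ideas.csv")

train, test = train_test_split(ideas, test_size=0.3,
                               stratify=ideas["implemented"], random_state=1)

# Content: project idea text onto a low-dimensional LSI space (TF-IDF + SVD).
lsi = make_pipeline(TfidfVectorizer(max_features=5000, stop_words="english"),
                    TruncatedSVD(n_components=100, random_state=1))
structured_cols = ["prior_ideas", "votes", "comments"]

X_train = np.hstack([lsi.fit_transform(train["text"]),
                     train[structured_cols].to_numpy()])
X_test = np.hstack([lsi.transform(test["text"]),
                    test[structured_cols].to_numpy()])
y_train, y_test = train["implemented"].to_numpy(), test["implemented"].to_numpy()

# One of the nonlinear learners named in the abstract: a random forest.
model = RandomForestClassifier(n_estimators=500, random_state=1)
model.fit(X_train, y_train)

# Rank held-out ideas by predicted probability of implementation.
scores = model.predict_proba(X_test)[:, 1]
top_decile = np.argsort(-scores)[: max(1, len(scores) // 10)]

# Toy lift metric: implementation rate in the top decile vs. the base rate.
lift = y_test[top_decile].mean() / y_test.mean()
print(f"Top-decile implementation rate is {lift:.2f}x the base rate")

In the same spirit, a real-time variant of this sketch would simply drop the crowd-feedback columns (votes, comments) so that ideas can be scored from content and contributor information alone, mirroring the two-system recommendation in the abstract.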

Language: English
Journal: Journal of Product Innovation Management
DOI: 10.1111/jpim.12396
State: Accepted/In press - 2017

Keywords

  • Crowdsourcing
  • Idea selection
  • Idea selection support system
  • Innovation
  • Machine learning
  • Real-time analysis

ASJC Scopus subject areas

  • Engineering (all)
  • Strategy and Management
  • Management of Technology and Innovation

Cite this

Identifying New Product Ideas: Waiting for the Wisdom of the Crowd or Screening Ideas in Real Time. / Hoornaert, Steven; Ballings, Michel; Malthouse, Edward C.; Van den Poel, Dirk.

In: Journal of Product Innovation Management, 2017.

Research output: Research - peer-review › Article

@article{7adc8792685943aab0ff45f0f3a49542,
title = "Identifying New Product Ideas: Waiting for the Wisdom of the Crowd or Screening Ideas in Real Time",
abstract = "Crowdsourcing ideas from consumers can enrich idea input in new product development. After a decade of initiatives (e.g., Starbucks' MyStarbucksIdea, Dell's IdeaStorm), the implications of crowdsourcing for idea generation are well understood, but challenges remain in dealing with the large volume of rapidly generated ideas produced in crowdsourcing communities. This study proposes a model that can assist managers in efficiently processing crowdsourced ideas by identifying the aspects of ideas that are most predictive of future implementation and identifies three sources of information available for an idea: its content, the contributor proposing it, and the crowd's feedback on the idea (the {"}3Cs{"}). These information sources differ in their time of availability (content/contributor information is available immediately; crowd feedback accumulates over time) and in the extent to which they comprise structured or unstructured data. This study draws from prior research to operationalize variables corresponding to the 3Cs and develops a new measure to quantify an idea's distinctiveness. Applying automated information retrieval methods (latent semantic indexing) and testing several linear methods (linear discriminant analysis, regularized logistic regression) and nonlinear machine-learning algorithms (stochastic adaptive boosting, random forests), this article identifies the variables that are most useful towards predicting idea implementation in a crowdsourcing community for an IT product (Mendeley). Our results indicate that consideration of content and contributor information improves ranking performance between 22.6 and 26.0% over random idea selection, and that adding crowd-related information further improves performance by up to 48.1%. Crowd feedback is the best predictor of idea implementation, followed by idea content and distinctiveness, and the contributor's past idea-generation experience. Firms are advised to implement two idea selection support systems: one to rank new ideas in real time based on content and contributor experience, and another that integrates the crowd's idea evaluation after it has had sufficient time to provide feedback.",
keywords = "Crowdsourcing, Idea selection, Idea selection support system, Innovation, Machine learning, Real-time analysis",
author = "Steven Hoornaert and Michel Ballings and Malthouse, {Edward C.} and {Van den Poel}, Dirk",
year = "2017",
doi = "10.1111/jpim.12396",
journal = "Journal of Product Innovation Management",
issn = "0737-6782",
publisher = "Wiley-Blackwell",

}

TY - JOUR

T1 - Identifying New Product Ideas: Waiting for the Wisdom of the Crowd or Screening Ideas in Real Time

T2 - Journal of Product Innovation Management

AU - Hoornaert, Steven

AU - Ballings, Michel

AU - Malthouse, Edward C.

AU - Van den Poel, Dirk

PY - 2017

Y1 - 2017

N2 - Crowdsourcing ideas from consumers can enrich idea input in new product development. After a decade of initiatives (e.g., Starbucks' MyStarbucksIdea, Dell's IdeaStorm), the implications of crowdsourcing for idea generation are well understood, but challenges remain in dealing with the large volume of rapidly generated ideas produced in crowdsourcing communities. This study proposes a model that can assist managers in efficiently processing crowdsourced ideas by identifying the aspects of ideas that are most predictive of future implementation and identifies three sources of information available for an idea: its content, the contributor proposing it, and the crowd's feedback on the idea (the "3Cs"). These information sources differ in their time of availability (content/contributor information is available immediately; crowd feedback accumulates over time) and in the extent to which they comprise structured or unstructured data. This study draws from prior research to operationalize variables corresponding to the 3Cs and develops a new measure to quantify an idea's distinctiveness. Applying automated information retrieval methods (latent semantic indexing) and testing several linear methods (linear discriminant analysis, regularized logistic regression) and nonlinear machine-learning algorithms (stochastic adaptive boosting, random forests), this article identifies the variables that are most useful towards predicting idea implementation in a crowdsourcing community for an IT product (Mendeley). Our results indicate that consideration of content and contributor information improves ranking performance between 22.6 and 26.0% over random idea selection, and that adding crowd-related information further improves performance by up to 48.1%. Crowd feedback is the best predictor of idea implementation, followed by idea content and distinctiveness, and the contributor's past idea-generation experience. Firms are advised to implement two idea selection support systems: one to rank new ideas in real time based on content and contributor experience, and another that integrates the crowd's idea evaluation after it has had sufficient time to provide feedback.

AB - Crowdsourcing ideas from consumers can enrich idea input in new product development. After a decade of initiatives (e.g., Starbucks' MyStarbucksIdea, Dell's IdeaStorm), the implications of crowdsourcing for idea generation are well understood, but challenges remain in dealing with the large volume of rapidly generated ideas produced in crowdsourcing communities. This study proposes a model that can assist managers in efficiently processing crowdsourced ideas by identifying the aspects of ideas that are most predictive of future implementation and identifies three sources of information available for an idea: its content, the contributor proposing it, and the crowd's feedback on the idea (the "3Cs"). These information sources differ in their time of availability (content/contributor information is available immediately; crowd feedback accumulates over time) and in the extent to which they comprise structured or unstructured data. This study draws from prior research to operationalize variables corresponding to the 3Cs and develops a new measure to quantify an idea's distinctiveness. Applying automated information retrieval methods (latent semantic indexing) and testing several linear methods (linear discriminant analysis, regularized logistic regression) and nonlinear machine-learning algorithms (stochastic adaptive boosting, random forests), this article identifies the variables that are most useful towards predicting idea implementation in a crowdsourcing community for an IT product (Mendeley). Our results indicate that consideration of content and contributor information improves ranking performance between 22.6 and 26.0% over random idea selection, and that adding crowd-related information further improves performance by up to 48.1%. Crowd feedback is the best predictor of idea implementation, followed by idea content and distinctiveness, and the contributor's past idea-generation experience. Firms are advised to implement two idea selection support systems: one to rank new ideas in real time based on content and contributor experience, and another that integrates the crowd's idea evaluation after it has had sufficient time to provide feedback.

KW - Crowdsourcing

KW - Idea selection

KW - Idea selection support system

KW - Innovation

KW - Machine learning

KW - Real-time analysis

UR - http://www.scopus.com/inward/record.url?scp=85021229346&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85021229346&partnerID=8YFLogxK

U2 - 10.1111/jpim.12396

DO - 10.1111/jpim.12396

M3 - Article

JO - Journal of Product Innovation Management

JF - Journal of Product Innovation Management

SN - 0737-6782

ER -