Metaknowledge Network - Measuring Scientific Impact

Project: Research project

Project Details

Description

The explosive increase in the number of scientific journals and publications has outgrown researchers’ ability to evaluate them. Choosing what to browse, read or cite from a huge and growing collection of scientific literature is a challenging task for researchers in nearly all areas of Science and Technology (S&T). In order to search for worthwhile publications, researchers are thus relying more and more on heuristic proxies, such as the Journal Impact Factor or author reputation, that are taken to signal publication quality.
Indeed, since the introduction of the Science Citation Index (SCI) in 1961 and the establishment of bibliographic databases, the scientific community has relied increasingly on bibliometric measures for research impact evaluation. Various bibliometric indicators have been proposed as general measures of the research impact of individual researchers, journals and institutions, such as the total number of citations, the Journal Impact Factor and the h-index. However, extensive research on these measures has revealed that they can be inconsistent, biased and, worst of all, susceptible to manipulation. For example, the limitations of the popular h-index include its dependence on discipline and on career length. Another limitation of citation-based measures is that the process of citation accumulation can take a long time to converge, which confounds the interpretation of many measurements.
Our research will depart from previous efforts by attempting to develop a principled approach to the quantification of scientific research impact instead of proposing ad-hoc heuristic measures.
Aim 1. We will investigate the distribution of the ultimate number of accumulated citations to publications by a researcher and test whether the data is consistent with a discrete lognormal model.
To this end, we will use citation data acquired from Thomson Reuters’ Web of Science (WoS) pertaining to researchers working at top U.S. universities across seven disciplines: chemical engineering, chemistry, ecology, molecular biology, industrial engineering, materials science and psychology.
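The consistency test in Aim 1 can be sketched as a maximum-likelihood fit. The snippet below is a minimal illustration, not the project's actual pipeline: it assumes the discrete lognormal is defined by discretizing a continuous lognormal onto integer citation counts (P(N = n) = F(n + 1) − F(n), where F is the lognormal CDF), fits (μ, σ) by minimizing the negative log-likelihood, and checks the fit on synthetic data in place of the WoS records.

```python
import numpy as np
from scipy import stats, optimize

def discrete_lognormal_pmf(n, mu, sigma):
    """P(N = n) for N = floor(X), with X ~ lognormal(mu, sigma).

    Uses the discretization P(N = n) = F(n + 1) - F(n), where F is the
    continuous lognormal CDF (one common way to define a discrete lognormal).
    """
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    return dist.cdf(n + 1) - dist.cdf(n)

def fit_discrete_lognormal(counts):
    """Maximum-likelihood estimate of (mu, sigma) from non-negative counts."""
    counts = np.asarray(counts)

    def nll(params):
        mu, sigma = params
        if sigma <= 0:
            return np.inf  # keep the optimizer inside the valid region
        p = discrete_lognormal_pmf(counts, mu, sigma)
        return -np.sum(np.log(np.maximum(p, 1e-300)))  # guard log(0)

    res = optimize.minimize(nll, x0=[1.0, 1.0], method="Nelder-Mead")
    return res.x  # (mu_hat, sigma_hat)

# Synthetic stand-in for per-author citation counts: floor of lognormal draws,
# so the fitted parameters should recover the true (mu, sigma) = (2.0, 1.0).
rng = np.random.default_rng(0)
counts = np.floor(rng.lognormal(mean=2.0, sigma=1.0, size=5000)).astype(int)
mu_hat, sigma_hat = fit_discrete_lognormal(counts)
```

A goodness-of-fit test (e.g. comparing observed and fitted count frequencies) would then decide whether the real citation data are consistent with the model.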
Aim 2. We will investigate the distribution of the ultimate number of accumulated citations to publications by all researchers affiliated with a given department and test whether the data is consistent with a discrete lognormal model.
We will again use citation data acquired from Thomson Reuters’ Web of Science (WoS), but will broaden both the set of disciplines and the locations of the universities in which the departments reside.
Aim 3. We will develop a measure of scientific impact with desirable properties using the parameters characterizing the distribution of ultimate number of citations.
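The project does not specify the measure here, but one way to collapse the two fitted parameters into a single scalar, shown purely as a hypothetical illustration, is the mean of the fitted lognormal, exp(μ + σ²/2):

```python
import numpy as np

def impact_score(mu, sigma):
    """Hypothetical single-number summary of a fitted citation distribution:
    the mean of a lognormal(mu, sigma), exp(mu + sigma**2 / 2).
    This is an illustrative choice, not the project's actual measure."""
    return float(np.exp(mu + sigma ** 2 / 2.0))

score = impact_score(2.0, 1.0)  # exp(2.5)
```

Any such parameter-based measure inherits the model's properties, which is what makes a principled (rather than ad-hoc) construction possible.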
Status: Finished
Effective start/end date: 5/1/15 → 3/31/16

Funding

  • University of Chicago (Agreement 5/26/2015)
  • John Templeton Foundation (Agreement 5/26/2015)
