Evaluating Scientists: Citations, Impact Factor, h-Index
Identifying the key performance parameters for active scientists has always been a problematic issue. Evaluating and comparing researchers working in a given area has become a necessity, since these competing scientists vie for the same limited resources, promotions, awards or fellowships of scientific academies. Whatever method we choose for evaluating the worth of a scientist's individual research contribution, it should be simple, fair and transparent. A tall order indeed!
One common approach, used for a long time, is to count the citations to each of a scientist's publications and to look at the impact factor of the journals in which these publications have appeared. This approach, though widely used as a decision-making tool, does have its limitations.
Citation Count
The number of citations for each publication of a scientist is readily available from several sources, e.g., Web of Science, Google Scholar and Scopus. It is generally believed that a researcher's work has a significant impact on a given field if his or her papers are frequently cited by other researchers. Self-citations are usually excluded from such citation counts. However, using the citation count alone to judge the quality of research contributions can be unfair to some researchers. A researcher may well have poor citation metrics (i) if he or she works in a very narrow area (which therefore attracts fewer citations) or (ii) if he or she publishes mostly in a language other than English, or mainly in books or book chapters (since most citation tools do not capture such citations).
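To make the counting concrete, here is a minimal Python sketch of a citation count that excludes self-citations; the function name, the author-list representation and the sample data are invented for illustration and are not drawn from any of the databases above.

    def citation_count_excluding_self(paper_authors, citing_author_lists):
        # A citation counts only if its author list shares nobody with
        # the cited paper's authors, i.e., it is not a self-citation.
        own = set(paper_authors)
        return sum(1 for authors in citing_author_lists
                   if own.isdisjoint(authors))

    # Hypothetical example: three citing papers, one co-authored by
    # the cited author herself, so only two citations are counted.
    print(citation_count_excluding_self(
        ["A. Rao"],
        [["B. Lee"], ["A. Rao", "C. Kim"], ["D. Shah"]]))  # prints 2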
Impact Factor
Publishing in a journal with a high impact factor, such as Nature or Science, is considered very prestigious. In our profession, which deals with electronics and communications, it is a dream for many to publish in IEEE journals, because some of the IEEE journals do have a high impact factor and their reviewing procedure is very rigorous. The impact factor is a measure of how frequently the papers published in a journal are cited in the scientific literature. Impact factors are released each year in the Journal Citation Reports by the Institute for Scientific Information (ISI). Since their first publication in 1972, impact factors have acquired wide acceptability in the absence of any other metric for evaluating the worth of a journal.
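Concretely, the figure usually quoted is the two-year impact factor: the citations a journal receives in a given year to the items it published in the previous two years, divided by the number of citable items it published in those two years. A minimal Python sketch, with invented counts for a hypothetical journal:

    def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
        # Impact factor for year Y: citations received in Y to items
        # published in Y-1 and Y-2, divided by the number of citable
        # items published in Y-1 and Y-2.
        return citations_to_prev_two_years / citable_items_prev_two_years

    # Hypothetical journal: 600 citations in 2009 to its 2007-2008
    # papers, of which there were 240 citable items.
    print(impact_factor(600, 240))  # prints 2.5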
However, there are limitations in using the impact factor as a measure of the quality of a journal, and hence of the research of a scientist who publishes in a high-impact-factor journal. For example, many people may read and use the research findings of a given paper but never cite it, simply because they do not publish themselves. In other words, the impact factor measures the usefulness of a journal only to those who read a paper and cite it in their own publications, leaving out the large number of practitioners of the profession who do not publish but nevertheless benefit from the research findings published in that journal.
There are more than 100,000 journals published around the world, but the ISI database covers only a small percentage of them. Therefore, if you publish in a journal that is not part of the ISI database, or if your papers are cited in journals not listed in it, those citations will not count towards any impact factor calculation. Impact factors can also be manipulated: in some journals, for example, authors are subtly pressured to cite other papers published in the same journal. Blind use of citation and impact factor indicators may therefore not result in a correct evaluation of the scientific merit of a researcher.
The h-index
To overcome the problems associated with the citation count and the impact factor, Jorge Hirsch of the University of California at San Diego suggested, in 2005, a simple method to quantify the impact of a scientist's research output in a given area. The measure he suggested is called the h-index, and in the last few years it has quickly become a widely used measure of a researcher's scientific output. Without getting into mathematical rigor, the h-index can be explained as follows. Suppose a researcher has 15 publications. If 10 of these are cited at least 10 times each by other researchers, the h-index of the scientist is 10, the remaining 5 publications having fewer than 10 citations. If one of those 10 publications receives, say, 100 citations, the h-index still remains 10. If each of the 15 papers receives exactly 10 citations, the h-index is again only 10. The h-index reaches 15 only if every one of the 15 papers receives at least 15 citations.

Therefore, to calculate the h-index of a scientist, find the citation count of each publication, rank the publications in decreasing order of citations, and identify the largest rank 'h' at which the first 'h' publications each have at least 'h' citations, as in the sketch below. A few publications with hundreds of citations are not sufficient for a reasonably good h-index; the index is designed to identify researchers who produce a sustained body of papers with real impact over a period of time.
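A minimal Python sketch of this ranking procedure; the function name and the citation lists are illustrative only, chosen to reproduce the 15-paper example above.

    def h_index(citations):
        # Rank the papers from most to least cited.
        ranked = sorted(citations, reverse=True)
        h = 0
        # h is the largest 1-based rank at which the ranked paper
        # still has at least that many citations.
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Ten papers with 10 citations and five with 3 give h = 10 ...
    print(h_index([10] * 10 + [3] * 5))          # prints 10
    # ... boosting one paper to 100 citations leaves h unchanged ...
    print(h_index([100] + [10] * 9 + [3] * 5))   # prints 10
    # ... and only when all 15 papers reach 15 citations does h hit 15.
    print(h_index([15] * 15))                    # prints 15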
Conclusion
Excessive dependence on single numbers to quantify scientists' contributions and to make administrative decisions can affect their career progression, and may push people towards somehow enhancing their h-index instead of focusing on their more legitimate activity, i.e., doing good science. Considering the complex issues associated with the calculation of scientific performance metrics, it is clear that a comprehensive approach should be used to evaluate the research worth of a scientist, and that we should not rely excessively on any single metric. Since the h-index is becoming more popular and is simple to calculate, we should use it judiciously, in combination with the other metrics discussed here.
Reference:
M. J. Kumar, "Evaluating Scientists: Citations, Impact Factor, h-Index, Online Page Hits and What Else?", IETE Technical Review, Vol. 26, No. 3, pp. 165-168, 2009. Available: http://tr.ietejournals.org/text.asp?2009/26/3/165/50699