Arnold & Fowler’s “Nefarious Numbers” on impact-factor manipulation

“Goodhart’s law warns us that “when a measure becomes a target, it ceases to be a good measure.” The impact factor has moved in recent years from an obscure bibliometric indicator to become the chief quantitative measure of the quality of a journal, its research papers, the researchers who wrote those papers, and even the institution they work in. The impact factor for a journal in a given year is calculated by ISI (Thomson Reuters) as the average number of citations in that year to the articles the journal published in the preceding two years. It is widely used by researchers deciding where to publish and what to read, and by tenure and promotion committees laboring under the assumption that publication in a higher-impact-factor journal represents better work. However, it has been widely criticized on a variety of grounds (it does not measure an individual paper’s quality, it is a crude and flawed statistic, etc.). Impact-factor manipulation can take numerous forms. Let us follow Douglas N. Arnold and Kristine K. Fowler, “Nefarious Numbers,” Notices of the AMS 58: 434-437, March 2011 [ArXiv, 1 Oct 2010].
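The definition above is a simple ratio, which a minimal sketch makes concrete (the journal name and counts below are hypothetical, invented purely for illustration):

```python
def impact_factor(cites_to_prev_two_years: int, items_in_prev_two_years: int) -> float:
    """JCR-style impact factor for year Y: citations received in Y to
    articles the journal published in Y-1 and Y-2, divided by the
    number of citable items it published in those two years."""
    return cites_to_prev_two_years / items_in_prev_two_years

# Hypothetical journal: 400 citations in 2008 to its 2006-2007 articles,
# of which it published 100 in total over those two years.
print(impact_factor(400, 100))  # 4.0
```

Note that the numerator counts citations from anywhere, including the journal itself, which is precisely what makes the measure gameable.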

Editors can manipulate the impact factor by means of the following practices: (1) “canny editors cultivate a cadre of regulars who can be relied upon to boost the measured quality of the journal by citing themselves and each other shamelessly;” (2) “authors of manuscripts under review often were asked or required by editors to cite other papers from the journal; this practice borders on extortion, even when posed as a suggestion;” and (3) “editors raise their journals’ impact factors by publishing review items with large numbers of citations to the journal.” The distortions these unscientific practices wreak upon the scientific literature have raised occasional alarms. A case study should confirm the need for alarm.

“The International Journal of Nonlinear Sciences and Numerical Simulation (IJNSNS) has dominated the impact-factor charts in the “Mathematics, Applied” category. It took first place in each of the years 2006, 2007, 2008, and 2009, generally by a wide margin, and came in second in 2005. However, IJNSNS is nowhere near the top of its field by reputation. Thus we set out to understand the origin of its large impact factor. In 2008 IJNSNS had an impact factor of 8.91 in ISI’s Journal Citation Reports (JCR). The journals with the second and third highest impact factors, Communications on Pure and Applied Mathematics (CPAM) and SIAM Review (SIREV), had 2008 impact factors of only 3.69 and 2.80, respectively. Both journals have a reputation for excellence.”

“Evaluation based on expert judgment is the best alternative to citation-based measures for journals. The Australian Research Council recently released such an evaluation, listing quality ratings for over 20,000 peer-reviewed journals across disciplines. The list was developed through an extensive review process involving learned academies (such as the Australian Academy of Science), disciplinary bodies (such as the Australian Mathematical Society), and many researchers and expert reviewers. The assigned quality rating, which is intended to represent “the overall quality of the journal,” is one of four values: A* for one of the best in its field or subfield, A for very high quality, B for a solid, though not outstanding, reputation, and C for those that do not meet the criteria of the higher tiers.

“The ERA list included all but five of the 175 journals assigned a 2008 impact factor by JCR in the category “Mathematics, Applied.” The figure shows the impact factors for journals in each of the four rating tiers. There are many examples of journals with a higher impact factor than other journals that are one, two, and even three rating tiers higher. The red line is drawn so that 20% of the A* journals are below it; it is notable that 51% of the A journals have an impact factor above that level, as do 23% of the B journals and even 17% of those in the C category. The most extreme outlier is IJNSNS, which, despite its astronomical impact factor, is not in the first or second but, rather, the third tier. The ERA rating assigned its highest score, A*, to 25 journals. Most of the journals with the highest impact factors are here, including CPAM and SIREV, but, of the top 10 journals by impact factor, two were assigned an A, and only IJNSNS was assigned a B. There were 53 A-rated journals and 69 B-rated journals altogether. In short, the ERA ratings suggest that IJNSNS is not only not the top applied math journal but also that its rank should be somewhere in the range 75–150. This remarkable mismatch between reputation and impact factor needs an explanation.

The making of the high impact factor of IJNSNS. “The top-citing author to IJNSNS in 2008 was the journal’s editor-in-chief, Ji-Huan He, who cited the journal (within the two-year window) 243 times. The second top citer, D. D. Ganji, with 114 cites, is also a member of the editorial board, as is the third, regional editor Mohamed El Naschie, with 58 cites. Together these three account for 29% of the citations counted toward the impact factor. For comparison, the top three citers to SIREV contributed only 7, 4, and 4 citations, respectively, accounting for less than 12% of the counted citations, and none of these authors is involved in editing the journal. For CPAM the top three citers (9, 8, and 8) contributed about 7% of the citations and, again, were not on the editorial board.”

“Citations to IJNSNS are concentrated within the two-year window used in the impact-factor calculation. A breakdown of 2008 citations to articles published since 2000 shows that only 16% of the citations to CPAM and 8% of those to SIREV fell within that two-year window; in contrast, 71.5% of the 2008 citations to IJNSNS did. A single issue of the Journal of Physics: Conference Series provided the greatest number of IJNSNS citations: 294 citations, accounting for more than 20% of its impact factor. This issue was the proceedings of a conference organized by IJNSNS editor-in-chief Ji-Huan He at his home university, and He was responsible for the peer review of the issue. The second top-citing journal for IJNSNS was Topological Methods in Nonlinear Analysis, which contributed 206 citations (14%), again all from a single issue: a special issue with Ji-Huan He as guest editor; his co-editor, Lan Xu, is also on the IJNSNS editorial board. Third place is held by the journal Chaos, Solitons and Fractals (CS&F), with 154 citations spread over numerous issues. In 2008 Ji-Huan He served on the editorial board of CS&F, and its editor-in-chief was Mohamed El Naschie, who was also a coeditor of IJNSNS. In a highly publicized case, the entire editorial board of CS&F was recently replaced, but El Naschie remained coeditor of IJNSNS. Continuing down the list of high-citing journals, a similar circumstance comes to light: 50 citations from a single issue of the Journal of Polymer Engineering (which, like IJNSNS, is published by Freund), guest edited by the same pair, Ji-Huan He and Lan Xu.

Bibliometrics are also used to evaluate individuals, articles, institutions, and even nations. Essential Science Indicators (ESI), produced by Thomson Reuters, is promoted as a tool for ranking “top countries, journals, scientists, papers, and institutions by field of research”. The special issue of the Journal of Physics: Conference Series that He edited, and that garnered 294 citations for his journal, also garnered 353 citations to He himself. He claims a total citation count of over 6,800. ScienceWatch.com notes that “according to a recent analysis of Essential Science Indicators from Thomson Scientific, Professor Ji-Huan He has been named a Rising Star in the field of Computer Science… His citation record in the Web of Science includes 137 papers cited a total of 3,193 times to date.” He was cited by ESI for the “Hottest Research of 2007–8” and again for the “Hottest Research of 2009”. The h-index is another popular citation-based metric for researchers, intended to measure productivity as well as impact. An individual’s h-index is the largest number h such that h of his or her papers have each been cited at least h times. It too is not immune from Goodhart’s law. J.-H. He claims an h-index of 39, while Hirsch estimated the median h-index for Nobel prize winners in physics to be 35.
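The h-index definition above translates into a few lines of code (the citation counts in the example are hypothetical, chosen only to illustrate the computation):

```python
def h_index(citation_counts: list[int]) -> int:
    """Largest h such that at least h papers have >= h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the bar
        else:
            break     # all later papers have even fewer citations
    return h

# Hypothetical researcher with five papers:
print(h_index([10, 8, 5, 4, 3]))  # 4 (four papers have at least 4 citations)
```

Like the impact factor, the h-index is sensitive to self-citation and coordinated citing, which is why Goodhart’s law applies to it as well.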

The cumulative result of the design flaws and the manipulation is that the impact factor gives a very inaccurate view of journal quality. Scientists who give in to the temptation to suppress data or fiddle with statistics to draw a clearer point are censured. We must bring a similar level of integrity to the evaluation of research products. Administrators, funding agencies, librarians, and others needing such evaluations should just say no to simplistic solutions and approach important decisions with thoughtfulness, wisdom, and expertise.”
