O'Really?

July 24, 2009

Escape from the impact factor: The Great Escape?

The Great Escape with Steve McQueen

Quite by chance, I stumbled on this interesting paper [1] yesterday by Philip Campbell, who is the Editor-in-Chief of the scientific über-journal Nature [2]. Here is the abstract:

As Editor-in-Chief of the journal Nature, I am concerned by the tendency within academic administrations to focus on a journal’s impact factor when judging the worth of scientific contributions by researchers, affecting promotions, recruitment and, in some countries, financial bonuses for each paper. Our own internal research demonstrates how a high journal impact factor can be the skewed result of many citations of a few papers rather than the average level of the majority, reducing its value as an objective measure of an individual paper. Proposed alternative indices have their own drawbacks. Many researchers say that their important work has been published in low-impact journals. Focusing on the citations of individual papers is a more reliable indicator of an individual’s impact. A positive development is the increasing ability to track the contributions of individuals by means of author-contribution statements and perhaps, in the future, citability of components of papers rather than the whole. There are attempts to escape the hierarchy of high-impact-factor journals by means of undifferentiated databases of peer-reviewed papers such as PLoS One. It remains to be seen whether that model will help outstanding work to rise to due recognition regardless of editorial selectivity. Although the current system may be effective at measuring merit on national and institutional scales, the most effective and fair analysis of a person’s contribution derives from a direct assessment of individual papers, regardless of where they were published.

It’s well worth reading the views of the editor of an important closed-access journal like Nature, a world champion heavyweight of Impact Factor Boxing; his take on article-level bibliometrics and on novel models of scientific publishing on the Web, such as PLoS ONE, is enlightening. There are some interesting papers in the same issue, which has a special theme on the use and misuse of bibliometric indices in evaluating scholarly performance. Oh, and the article is published in an open-access journal too. Is it just me, or is there a strong smell of irony in here?
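
Campbell’s point about skewed citation distributions is easy to see with a toy calculation: a couple of blockbuster papers can drag a journal’s mean citation rate (which is what the impact factor reflects) well above what a typical paper in that journal actually achieves. Below is a minimal sketch using hypothetical citation counts, not real Nature data.

```python
# A toy illustration of citation skew: a few heavily cited papers inflate
# the mean (impact-factor-style average) far above the median.
# The citation counts below are hypothetical, not real journal data.
from statistics import mean, median

citations = [2, 3, 3, 4, 5, 5, 6, 8, 120, 250]  # citations per paper

print(f"Mean citations:   {mean(citations):.1f}")    # 40.6 - skewed by two outliers
print(f"Median citations: {median(citations):.1f}")  # 5.0  - the 'typical' paper
```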

References

  1. Philip Campbell (2008). Escape from the impact factor. Ethics in Science and Environmental Politics, 8, 5–7. DOI: 10.3354/esep00078
  2. Philip Campbell (1995). Postscript from a new hand. Nature, 378 (6558), 649. DOI: 10.1038/378649b0
  3. John Sturges (1963). The Great Escape [film].

June 23, 2009

Impact Factor Boxing 2009

Fight Night Punch Test by djclear904

[This post is part of an ongoing series about impact factors]

The latest results from the annual impact factor boxing world championship contest are out. This is a combat sport where scientific journals are scored according to their supposed influence and impact in Science. This year’s competition rankings include the first-ever update to the newly introduced Five-Year Impact Factor and Eigenfactor™ Metrics [1,2] in Journal Citation Reports (JCR) on the Web (see www.isiknowledge.com/JCR; warning: clunky website, subscription required*), presumably in response to widespread criticism of impact factors. The Eigenfactor™ seems to correlate quite closely with the impact factor scores; both work at the level of the journal, although they use different methods for measuring a given journal’s impact. However, what many authors are often more interested in is the impact of an individual article, not the journal where it was published, so it would be interesting to see how the figures below tally with Google Scholar (see also comments by Abhishek Tiwari).

I’ve included a table below of bioinformatics impact factors, updated for June 2009. Of course, when I say 2009 (today), I mean 2008 (these are the latest figures available, based on data from 2007) – so this shiny new information published this week is already out of date [3] and flawed [4,5], but here is a selection of the data anyway: [update: see figures published in June 2010.]

2008 data from isiknowledge.com/JCR; the last two columns are Eigenfactor™ Metrics.

Journal Title | Total Cites | Impact Factor | 5-Year Impact Factor | Immediacy Index | Articles | Cited Half-life | Eigenfactor™ Score | Article Influence™ Score
BMC Bioinformatics | 8141 | 3.781 | 4.246 | 0.664 | 607 | 2.8 | 0.06649 | 1.730
OUP Bioinformatics | 30344 | 4.328 | 6.481 | 0.566 | 643 | 4.8 | 0.18204 | 2.593
Briefings in Bioinformatics | 2908 | 4.627 | – | 1.273 | 44 | 4.5 | 0.02188 | –
PLoS Computational Biology | 2730 | 5.895 | 6.144 | 0.826 | 253 | 2.1 | 0.03063 | 3.370
Genome Biology | 9875 | 6.153 | 7.812 | 0.961 | 229 | 4.4 | 0.07930 | 3.858
Nucleic Acids Research | 86787 | 6.878 | 6.968 | 1.635 | 1070 | 6.5 | 0.37108 | 2.963
PNAS | 416018 | 9.380 | 10.228 | 1.635 | 3508 | 7.4 | 1.69893 | 4.847
Science | 409290 | 28.103 | 30.268 | 6.261 | 862 | 8.4 | 1.58344 | 16.283
Nature | 443967 | 31.434 | 31.210 | 8.194 | 899 | 8.5 | 1.76407 | 17.278

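For anyone unfamiliar with where the headline Impact Factor number comes from, here is a minimal sketch of the standard two-year calculation (the new five-year variant simply widens the publication window to five years). The figures in the example are hypothetical and are not taken from the table above.

```python
# A minimal sketch of the classic two-year impact factor calculation.
# The numbers below are hypothetical, not taken from the table above.

def impact_factor(citations_in_year: int, citable_items_prev_two_years: int) -> float:
    """Citations received in a given year to items published in the previous
    two years, divided by the number of citable items published in those
    two years."""
    return citations_in_year / citable_items_prev_two_years

# e.g. a journal whose 2006-2007 papers picked up 5000 citations during 2008,
# from 1200 citable items, gets a 2008 impact factor of roughly 4.2
print(round(impact_factor(5000, 1200), 3))  # 4.167
```
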
The internet is radically changing the way we communicate, and this includes scientific publishing; as media mogul Rupert Murdoch once pointed out, “big will not beat small any more – it will be the fast beating the slow”. An interesting question for publishers and scientists is: how can the Web help the faster flyweight and featherweight boxers (smaller journals) compete and punch above their weight against the reigning world champion heavyweights (Nature, Science and PNAS)? Will the heavyweight publishers always have the killer knockout punches? If you’ve got access to the internet, then you already have a ringside seat from which to watch all the action. This fight should be entertaining viewing, and there is an awful lot of money riding on the outcome [6-11].

Seconds away, round two…

References

  1. Fersht, A. (2009). The most influential journals: Impact Factor and Eigenfactor. Proceedings of the National Academy of Sciences, 106 (17), 6883–6884. DOI: 10.1073/pnas.0903307106
  2. Bergstrom, C., & West, J. (2008). Assessing citations with the Eigenfactor Metrics. Neurology, 71 (23), 1850–1851. DOI: 10.1212/01.wnl.0000338904.37585.66
  3. Cockerill, M. (2004). Delayed impact: ISI’s citation tracking choices are keeping scientists in the dark. BMC Bioinformatics, 5 (1). DOI: 10.1186/1471-2105-5-93
  4. Allen, L., Jones, C., Dolby, K., Lynn, D., & Walport, M. (2009). Looking for Landmarks: The Role of Expert Review and Bibliometric Analysis in Evaluating Scientific Publication Outputs. PLoS ONE, 4 (6). DOI: 10.1371/journal.pone.0005910
  5. Grant, R.P. (2009). On article-level metrics and other animals. Nature Network.
  6. Corbyn, Z. (2009). Do academic journals pose a threat to the advancement of Science? Times Higher Education.
  7. Fenner, M. (2009). PLoS ONE: Interview with Peter Binfield. Gobbledygook blog at Nature Network.
  8. Hoyt, J. (2009). Who is killing science on the Web? Publishers or Scientists? Mendeley Blog.
  9. Hull, D. (2009). Escape from the Impact Factor: The Great Escape? O’Really? blog.
  10. Murray-Rust, P. (2009). THE article: Do academic journals pose a threat to the advancement of science? Peter Murray-Rust’s blog: A Scientist and the Web.
  11. Wu, S. (2009). The evolution of Scientific Impact. shirleywho.wordpress.com

* This important data should be freely available (i.e. without a subscription), since crucial decisions about the allocation of public money depend on it, but that’s another story.

[More commentary on this post over at friendfeed. CC-licensed Fight Night Punch Test by djclear904]