Rocky Balboa, Philadelphia, PA. Creative Commons licensed picture by seng1011 (steve eng) on Flickr.
[This post is part of an ongoing series about impact factors]
In the world of abused performance metrics, the impact factor is the undisputed heavyweight champion of the (publishing) world.
It has been an eventful year in the boxing ring of scientific publishing since the last set of figures was published by Thomson Reuters. A brand new journal called PeerJ launched with a radical publish ’til you perish business model. There’s another new journal on the way too, in the shape of eLife, with its own significant differences from current publishing models. Then there was the Finch report on Open Access. If that wasn’t enough fun, the alternative metrics (“altmetrics”) movement has been gathering pace, alongside suggestions that the impact factor may be losing its grip on the supposed “title”.
The impact factors below are the most recent, published June 28th 2012, covering data from 2011. Love them or loathe them, use them or abuse them, game them or shame them … here is a tiny selection of impact factors for the 10,675 journals that are tracked in Journal Citation Reports (JCR) ordered by increasing punch power.
WARNING: Abusing these figures can seriously damage your Science – you have been warned! Normal caveats apply, see nature.com/metrics.
* The Russian Journal of Cardiology is included here for reference as it has the lowest non-zero impact factor of any science journal. A rather dubious honour…
** The Cancer Journal for Clinicians is the highest ranked journal in science, it is included here for reference. Could it be the first journal to have an impact factor of more than 100?
- Van Noorden, R. (2012). Journal offers flat fee for ‘all you can publish’ Nature, 486 (7402), 166 DOI: 10.1038/486166a
- Priem, J., Piwowar, H., & Hemminger, B. (2012). Altmetrics in the wild: Using social media to explore scholarly impact arxiv.org/abs/1203.4745
- Lozano, G., Larivière, V., & Gingras, Y. (2012). The weakening relationship between the Impact Factor and papers’ citations in the digital age arxiv.org/abs/1205.4328
[This post is part of an ongoing series about impact factors. See Impact Factor Boxing 2012 for the latest figures.]
Well it’s that time again. The annual sweaty fist-fight for supremacy between the scientific journals, as measured by impact factors, is upon us. Much ink (virtual and actual) has been spilt on the subject of impact factors, which we won’t add to here, other than to say:
Hey look, the “European” journals might be catching up with the “American” ones. 
So, love them, loathe them, use them, abuse them, ignore them or obsess over them… here’s a tiny selection of the 10,196 journals that are tracked in Journal Citation Reports (JCR) ordered by increasing impact.
WARNING: Abusing these figures can seriously damage your Science – you have been warned! (normal caveats apply)
* The Naval Architect is included here for reference as it has the lowest non-zero impact factor of any science journal. A rather dubious honour…
** The Cancer Journal for Clinicians, the highest ranked journal in science, is included here for reference.
[Creative Commons licensed picture of Khmer boxing by lecercle]
- Karageorgopoulos, D., Lamnatou, V., Sardi, T., Gkegkes, I., & Falagas, M. (2011). Temporal Trends in the Impact Factor of European versus USA Biomedical Journals PLoS ONE, 6 (2) DOI: 10.1371/journal.pone.0016300
[This post is part of an ongoing series about impact factors. See this post for the latest impact factors published in 2012.]
Roll up, roll up, ladies and gentlemen, Impact Factor Boxing is here again. As with last year (2009), the metrics used in this combat sport are already a year out of date. But this doesn’t stop many people from writing about impact factors, and it’s been an interesting year for the metrics used by many to judge the relative value of scientific work. The Public Library of Science (PLoS) launched their article-level metrics within the last year, following the example of BioMed Central’s “most viewed” articles feature. Next to these new-style metrics, the traditional impact factors live on, despite their limitations. Critics like Harold Varmus have recently pointed out that:
“The impact factor is a completely flawed metric and it’s a source of a lot of unhappiness in the scientific community. Evaluating someone’s scientific productivity by looking at the number of papers they published in journals with impact factors over a certain level is poisonous to the system. A couple of folks are acting as gatekeepers to the distribution of information, and this is a very bad system. It really slows progress by keeping ideas and experiments out of the public domain until reviewers have been satisfied and authors are allowed to get their paper into the journal that they feel will advance their career.”
To be fair though, it’s not the metric that is flawed so much as the way it is used (and abused) – a subject covered in much detail in a special issue of Nature at http://nature.com/metrics [2,3,4,5]. It’s much harder than it should be to get hold of these metrics, so I’ve reproduced some data below (fair use? I don’t know, I am not a lawyer…) to minimise the considerable frustrations of using Journal Citation Reports (JCR).
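For readers unfamiliar with how the headline number is actually computed: a journal’s two-year impact factor for year Y is the number of citations received in Y to items the journal published in Y−1 and Y−2, divided by the number of “citable items” it published in those two years. A minimal sketch in Python, with made-up numbers for a hypothetical journal:

```python
def impact_factor(citations, citable_items):
    """Two-year impact factor: citations received this year to items
    published in the previous two years, divided by the number of
    citable items published in those same two years."""
    return citations / citable_items

# Hypothetical journal: 2,000 citations in 2009 to its 2007-2008 output,
# which comprised 400 citable items
print(impact_factor(2000, 400))  # 5.0
```

One well-known quirk behind the criticism: citations to “non-citable” front matter (editorials, news, letters) count in the numerator, but those items are excluded from the denominator, which can inflate a journal’s score.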
Love them, loathe them, use them, abuse them, ignore them or obsess over them … here’s a small selection of the 7347 journals that are tracked in JCR ordered by increasing impact.
Maybe next year Thomson Reuters, who publish these figures, could start attaching large government health warnings (like those on cigarette packets) and long disclaimers to this data? WARNING: Abusing these figures can seriously damage your Science – you have been warned!
- Rizkallah, J., & Sin, D. (2010). Integrative Approach to Quality Assessment of Medical Journals Using Impact Factor, Eigenfactor, and Article Influence Scores PLoS ONE, 5 (4) DOI: 10.1371/journal.pone.0010204
- Abbott, A., Cyranoski, D., Jones, N., Maher, B., Schiermeier, Q., & Van Noorden, R. (2010). Metrics: Do metrics matter? Nature, 465 (7300), 860-862 DOI: 10.1038/465860a
- Van Noorden, R. (2010). Metrics: A profusion of measures Nature, 465 (7300), 864-866 DOI: 10.1038/465864a
- Braun, T., Osterloh, M., West, J., Rohn, J., Pendlebury, D., Bergstrom, C., & Frey, B. (2010). How to improve the use of metrics Nature, 465 (7300), 870-872 DOI: 10.1038/465870a
- Lane, J. (2010). Let’s make science metrics more scientific Nature, 464 (7288), 488-489 DOI: 10.1038/464488a
[Creative Commons licensed picture of Golden Gloves Prelim Bouts by Kate Gardiner ]
[This post is part of an ongoing series about impact factors]
The latest results from the annual impact factor boxing world championship are out. This is a combat sport where scientific journals are scored according to their supposed influence and impact in Science. This year’s rankings include the first-ever update to the newly introduced Five Year Impact Factor and Eigenfactor™ Metrics [1,2] in Journal Citation Reports (JCR) on the Web (see www.isiknowledge.com/JCR – warning: clunky website, requires subscription*), presumably in response to widespread criticism of impact factors. The Eigenfactor™ seems to correlate quite closely with the impact factor scores; both work at the level of the journal, although they use different methods for measuring a given journal’s impact. However, what many authors are often more interested in is the impact of an individual article, not the journal where it was published. So it would be interesting to see how the figures below tally with Google Scholar; see also comments by Abhishek Tiwari. I’ve included a table below of bioinformatics impact factors, updated for June 2009. Of course, when I say 2009 (today), I mean 2008 (these are the latest figures available, based on data from 2007) – so this shiny new information published this week is already out of date and flawed [4,5], but here is a selection of the data anyway: [update: see figures published in June 2010.]
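Where the two metrics differ is that the Eigenfactor weights citations by the influence of the citing journal, computed PageRank-style from the dominant eigenvector of the journal-to-journal citation network [1,2]. Here is a toy sketch of that eigenvector-centrality idea, with an invented three-journal citation matrix; the real algorithm also discards journal self-citations and adjusts for article counts, which this sketch omits:

```python
import numpy as np

# Toy citation matrix: C[i, j] = citations from journal j to journal i.
# These three journals and their counts are invented for illustration.
C = np.array([[0, 2, 1],
              [3, 0, 1],
              [1, 1, 0]], dtype=float)

# Column-normalise so each citing journal distributes one unit of "vote"
M = C / C.sum(axis=0)

# Power iteration: repeatedly let influence flow along citations until it
# settles on the dominant eigenvector (each journal's share of influence)
v = np.ones(3) / 3
for _ in range(100):
    v = M @ v

print(v.round(3))  # influence scores summing to 1
```

Under this weighting a citation from a highly cited journal counts for more than one from an obscure journal, which is the intuition the plain impact factor lacks.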
The internet is radically changing the way we communicate, and this includes scientific publishing; as media mogul Rupert Murdoch once pointed out, big will not beat small any more – it will be the fast beating the slow. An interesting question for publishers and scientists is: how can the Web help the faster flyweight and featherweight boxers (smaller journals) compete and punch above their weight with the reigning world champion heavyweights (Nature, Science and PNAS)? Will the heavyweight publishers always have the killer knockout punches? If you’ve got access to the internet, then you already have a ringside seat from which to watch all the action. This fight should be entertaining viewing and there is an awful lot of money riding on the outcome [6-11].
Seconds away, round two…
- Fersht, A. (2009). The most influential journals: Impact Factor and Eigenfactor Proceedings of the National Academy of Sciences, 106 (17), 6883-6884 DOI: 10.1073/pnas.0903307106
- Bergstrom, C., & West, J. (2008). Assessing citations with the Eigenfactor Metrics Neurology, 71 (23), 1850-1851 DOI: 10.1212/01.wnl.0000338904.37585.66
- Cockerill, M. (2004). Delayed impact: ISI’s citation tracking choices are keeping scientists in the dark. BMC Bioinformatics, 5 (1) DOI: 10.1186/1471-2105-5-93
- Allen, L., Jones, C., Dolby, K., Lynn, D., & Walport, M. (2009). Looking for Landmarks: The Role of Expert Review and Bibliometric Analysis in Evaluating Scientific Publication Outputs PLoS ONE, 4 (6) DOI: 10.1371/journal.pone.0005910
- Grant, R.P. (2009) On article-level metrics and other animals Nature Network
- Corbyn, Z. (2009) Do academic journals pose a threat to the advancement of Science? Times Higher Education
- Fenner, M. (2009) PLoS ONE: Interview with Peter Binfield Gobbledygook blog at Nature Network
- Hoyt, J. (2009) Who is killing science on the Web? Publishers or Scientists? Mendeley Blog
- Hull, D. (2009) Escape from the Impact Factor: The Great Escape? O’Really? blog
- Murray-Rust, P. (2009) THE article: Do academic journals pose a threat to the advancement of science? Peter Murray-Rust’s blog: A Scientist and the Web
- Wu, S. (2009) The evolution of Scientific Impact shirleywho.wordpress.com
* This important data should be freely available (i.e. without a subscription), since crucial decisions about the allocation of public money depend on it, but that’s another story.
[More commentary on this post over at friendfeed. CC-licensed Fight Night Punch Test by djclear904]