O'Really?

May 11, 2012

Journal Fire: Bonfire of the Vanity Journals?


Fire by John Curley, available via Creative Commons license.

When I first heard about Journal Fire, I thought: great, someone is going to take all the closed-access scientific journals and make a big bonfire of them! At the top of this bonfire would be the burning effigy of a wicker man, representing the very worst of the vanity journals [1,2].

Unfortunately, Journal Fire aren’t burning anything just yet, but what they are doing is something just as interesting. Their web-based application allows you to manage and share your journal club online. I thought I’d give it a whirl because a friend of mine asked me what I thought about a paper on ontologies in biodiversity [3]. Rather than post a brief review here, I’ve posted it over at Journal Fire. Here are some initial thoughts from a quick test drive of their application:

Pros

On the up side Journal Fire:

  • Is a neutral-ish, third-party space where anyone can discuss scientific papers.
  • Understands common identifiers (DOI and PMID) to tackle the identity crisis (see the sketch after this list).
  • Allows you to post simple anchor links in reviews, but not much else (see below).
  • Does not require the cumbersome syntax used by ResearchBlogging [4], ScienceSeeker and others.
  • Is integrated with CiteULike, for those that use it.
  • Can potentially gather many different reviews of a given paper in one place.
  • Is web-based, so you don’t have to download and install any software, unlike the desktop alternatives Mendeley and Utopia Documents.
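
The identifier support is worth dwelling on: the same paper can live at a publisher site, in PubMed and in CiteULike under different URLs, so resolving a DOI or PMID to one canonical record is what keeps the discussion in one place. Below is a minimal, hypothetical sketch of that idea in Python, using two public metadata services (CrossRef’s REST API and NCBI’s E-utilities); Journal Fire’s actual implementation is not public, so treat this purely as an illustration.

```python
# Hypothetical sketch only: resolve a DOI or a PubMed ID (PMID) to one
# canonical metadata record using public APIs (CrossRef and NCBI E-utilities).
# This is not Journal Fire's implementation; it just illustrates how
# identifier support avoids the "same paper, many URLs" problem.
import json
import urllib.request

def fetch_json(url):
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))

def lookup_doi(doi):
    """Look up title and journal for a DOI via the CrossRef REST API."""
    record = fetch_json("https://api.crossref.org/works/" + doi)["message"]
    return {"id": "doi:" + doi,
            "title": record["title"][0],
            "journal": (record.get("container-title") or [""])[0]}

def lookup_pmid(pmid):
    """Look up title and journal for a PMID via NCBI E-utilities (esummary)."""
    url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"
           "?db=pubmed&retmode=json&id=" + pmid)
    record = fetch_json(url)["result"][pmid]
    return {"id": "pmid:" + pmid,
            "title": record["title"],
            "journal": record.get("fulljournalname", "")}

if __name__ == "__main__":
    # The biodiversity paper reviewed in this post, via its DOI (reference [3]):
    print(lookup_doi("10.1016/j.tree.2011.11.007"))
```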

Cons

On the down side Journal Fire:

  • Is yet another piece of social software for scientists. Do we really need more, when we’ve had far too many already?
  • Requires you to sign up for a new account rather than re-using your existing digital identity with Google, Facebook, Twitter etc.
  • Does not seem to have many people on it (yet), despite having been going since at least 2007.
  • Looks a bit stale: the last blog post was published in 2010. Although the software still works fine, it is not clear whether it is being actively maintained and developed.
  • Does not allow much formatting in reviews besides simple links; something like Markdown would be good.
  • Does not understand or import arXiv identifiers at the moment.
  • As far as I can see, Journal Fire is a small startup based in Pasadena, California. Like all startups, they might go bust. If that happens, they’ll take your journal club, and all its reviews, down with them.

I think the pros mostly outweigh the cons, so if you like the idea of a third party hosting your journal club, Journal Fire is worth a trial run.

References

  1. Juan Carlos Lopez (2009). We want your paper! The similarity between high-end restaurants and scientific journals. Spoonful of Medicine, a blog from Nature Medicine.
  2. NOTE: Vanity journals should not be confused with the vanity press.
  3. Andrew R. Deans, Matthew J. Yoder & James P. Balhoff (2012). Time to change how we describe biodiversity, Trends in Ecology & Evolution, 27 (2) 84. DOI: 10.1016/j.tree.2011.11.007
  4. Shema, H., Bar-Ilan, J., & Thelwall, M. (2012). Research Blogs and the Discussion of Scholarly Information PLoS ONE, 7 (5) DOI: 10.1371/journal.pone.0035869

February 15, 2012

The Open Access Irony Awards: Naming and shaming them

Open Access (OA) publishing aims to make the results of scientific research available to the widest possible audience. Scientific papers published in Open Access journals are freely available for crucial data mining and for anyone, or anything, to read, wherever they may be.

In the last ten years, the Open Access movement has made huge progress in allowing:

“any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers.”

But there is still a long way to go: much of the world’s scientific knowledge remains locked up behind publishers’ paywalls, unavailable for re-use by text-mining software and inaccessible to the public, who often funded the research through taxation.

Openly ironic?

Ironically, some of the papers that are inaccessible discuss or even champion the very Open Access movement itself. Sometimes the lack of access is deliberate, other times accidental, but the consequences are serious either way: restricted access to public scientific knowledge is slowing scientific progress [1]. Sometimes the best way to make a serious point is to have a laugh and a joke about it. That is what the Open Access Irony Awards do: by gathering all the offenders in one place, we can laugh and make a serious point at the same time, naming and shaming the papers in question.

To get the ball rolling, here are some examples:

  • The Lancet, owned by Evilsevier… sorry, I mean Elsevier, recently published a paper on “the case for open data” [2] (please login to access article). Login?! Not very open…
  • Serial offender and über-journal Science has an article by Elias Zerhouni on the NIH public access policy [3] (Subscribe/Join AAAS to View Full Text), another on “making data maximally available” [4] (Subscribe/Join AAAS to View Full Text) and another on a high-profile advocate of open science [5] (Buy Access to This Article to View Full Text). Irony of ironies.
  • From Nature Publishing Group comes a fascinating paper about harnessing the wisdom of the crowds to predict protein structures [6]. Not only have members of the tax-paying public funded this work, they actually did some of the work too! But unfortunately they have to pay to see the paper describing their results. Ironic? Another paper, published in Nature Medicine, proclaims that the “delay in sharing research data is costing lives” [1] (instant access only $32!).
  • From the British Medical Journal (BMJ) comes the worrying news of dodgy American laws that will lock up valuable scientific data behind paywalls [7] (please subscribe or pay below). Ironic? *
  • The “green” road to Open Access publishing involves authors self-archiving their manuscripts in some kind of public repository. But there are many social, political and technical barriers to this, and they have been well documented [8]. You could find out about them in that paper [8], but it appears that the author has neither self-archived the paper nor taken the “gold” road and published in an Open Access journal. Ironic?
  • Last, but not least, it would be interesting to know what commercial publishers make of all this text-mining magic in Science [9], but we would have to pay $24 to find out. Ironic?

These are just a small selection from amongst many. If you would like to nominate a paper for an Open Access Irony Award, simply post it to the group on CiteULike or the group on Mendeley. Please feel free to start your own group elsewhere if you’re not on CiteULike or Mendeley. The name of this award probably originated from an idea by Jonathan Eisen, picked up by Joe Dunckley and Matthew Cockerill at BioMed Central (see tweet below). So thanks to them for the inspiration.

For added ironic amusement, take a screenshot of the offending article and post it to the Flickr group. Sometimes the shame is too much, and articles are retrospectively made open access so a screenshot will preserve the irony.

Join us in poking fun at the crazy business of academic publishing, while making a serious point about the lack of Open Access to scientific data.

References

  1. Sommer, Josh (2010). The delay in sharing research data is costing lives Nature Medicine, 16 (7), 744-744 DOI: 10.1038/nm0710-744
  2. Boulton, G., Rawlins, M., Vallance, P., & Walport, M. (2011). Science as a public enterprise: the case for open data The Lancet, 377 (9778), 1633-1635 DOI: 10.1016/S0140-6736(11)60647-8
  3. Zerhouni, Elias (2004). Information Access: NIH Public Access Policy Science, 306 (5703), 1895-1895 DOI: 10.1126/science.1106929
  4. Hanson, B., Sugden, A., & Alberts, B. (2011). Making Data Maximally Available Science, 331 (6018), 649-649 DOI: 10.1126/science.1203354
  5. Kaiser, Jocelyn (2012). Profile of Stephen Friend at Sage Bionetworks: The Visionary Science, 335 (6069), 651-653 DOI: 10.1126/science.335.6069.651
  6. Cooper, S., Khatib, F., Treuille, A., Barbero, J., Lee, J., Beenen, M., Leaver-Fay, A., Baker, D., Popović, Z., & players, F. (2010). Predicting protein structures with a multiplayer online game Nature, 466 (7307), 756-760 DOI: 10.1038/nature09304
  7. Epstein, Keith (2012). Scientists are urged to oppose new US legislation that will put studies behind a pay wall BMJ, 344 (jan17 3) DOI: 10.1136/bmj.e452
  8. Kim, Jihyun (2010). Faculty self-archiving: Motivations and barriers Journal of the American Society for Information Science and Technology DOI: 10.1002/asi.21336
  9. Smit, Eefke, & Van Der Graaf, M. (2012). Journal article mining: the scholarly publishers’ perspective Learned Publishing, 25 (1), 35-46 DOI: 10.1087/20120106

[CC licensed picture “ask me about open access” by mollyali.]

* Please note, some research articles in BMJ are available by Open Access, but news articles like [7] are not. Thanks to Trish Groves at BMJ for bringing this to my attention after this blog post was published. Also, some “articles” here are in a grey area for open access, particularly “journalistic” stuff like news, editorials and correspondence, as pointed out by Becky Furlong. See tweets below…

June 28, 2011

Impact Factor Boxing 2011

[This post is part of an ongoing series about impact factors. See Impact Factor Boxing 2012 for the latest figures.]

Well it’s that time again. The annual sweaty fist-fight for supremacy between the scientific journals, as measured by impact factors, is upon us. Much ink (virtual and actual) has been spilt on the subject of impact factors, which we won’t add to here, other than to say:

Hey look, the “European” journals might be catching up with the “American” ones. [1]

So, love them, loathe them, use them, abuse them, ignore them or obsess over them… here’s a tiny selection of the 10,196 journals tracked in Journal Citation Reports (JCR), ordered by increasing impact factor.

WARNING: Abusing these figures can seriously damage your Science – you have been warned! (normal caveats apply)
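
For anyone unfamiliar with where these numbers come from, a journal’s impact factor is a simple two-year average (this is the standard definition, not anything specific to the table below):

IF(2010) = (citations received during 2010 by items published in 2008 and 2009) / (number of citable items published in 2008 and 2009)

So, as a purely illustrative example, a journal whose 2008 and 2009 papers were cited about 6,000 times during 2010, from roughly 2,000 citable items, would get a 2010 impact factor of about 3.0.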

2010 data from isiknowledge.com/JCR; the two right-most columns are Eigenfactor™ Metrics.

| Journal | Total Cites | Impact Factor | 5-Year Impact Factor | Immediacy Index | Articles | Cited Half-life | Eigenfactor™ Score | Article Influence™ Score |
| The Naval Architect* | 16 | 0.005 | 0.004 | 0.005 | 189 | – | 0.00002 | 0.001 |
| BMC Bioinformatics | 12653 | 3.028 | 3.786 | 0.475 | 690 | 3.9 | 0.08086 | 1.495 |
| PLoS ONE | 42795 | 4.411 | 4.610 | 0.515 | 6714 | 2.1 | 0.32121 | 1.943 |
| OUP Bioinformatics | 40659 | 4.877 | 6.325 | 0.707 | 700 | 5.7 | 0.17973 | 2.649 |
| PLoS Computational Biology | 6849 | 5.515 | 6.251 | 0.727 | 406 | 2.8 | 0.06075 | 2.984 |
| Genome Biology | 14194 | 6.885 | 7.353 | 1.295 | 173 | 4.9 | 0.07688 | 3.585 |
| Nucleic Acids Research | 100444 | 7.836 | 7.314 | 1.755 | 1101 | 7.0 | 0.32867 | 3.016 |
| Briefings in Bioinformatics | 2886 | 9.283 | 7.395 | 1.204 | 49 | 5.8 | 0.01013 | 2.737 |
| PLoS Biology | 18453 | 12.469 | 14.375 | 2.706 | 214 | 4.1 | 0.16084 | 8.225 |
| Science | 469704 | 31.364 | 31.769 | 6.789 | 862 | 9.0 | 1.46485 | 16.859 |
| Nature | 511145 | 36.101 | 35.241 | 8.791 | 862 | 9.1 | 1.74466 | 19.334 |
| New England Journal of Medicine | 227674 | 53.484 | 52.362 | 10.675 | 345 | 7.5 | 0.69167 | 21.366 |
| CA – A Cancer Journal for Clinicians** | 9801 | 94.262 | 70.216 | 8.667 | 18 | 3.8 | 0.04923 | 24.782 |

* The Naval Architect is included here for reference as it has the lowest non-zero impact factor of any science journal. A rather dubious honour…

** The Cancer Journal for Clinicians, the highest-ranked journal in science, is included here for reference.

[Creative Commons licensed picture of Khmer boxing picture by lecercle]

References

  1. Karageorgopoulos, D., Lamnatou, V., Sardi, T., Gkegkes, I., & Falagas, M. (2011). Temporal Trends in the Impact Factor of European versus USA Biomedical Journals PLoS ONE, 6 (2) DOI: 10.1371/journal.pone.0016300

September 1, 2010

How many unique papers are there in Mendeley?

Lex Macho Inc. by Dan DeChiaro on Flickr. How many people in this picture?

Mendeley is a handy piece of desktop and web software for managing and sharing research papers [1]. This popular tool has been getting a lot of attention lately, and with some impressive statistics it’s not difficult to see why. At the time of writing, Mendeley claims to have over 36 million papers, added by just under half a million users working at more than 10,000 research institutions around the world. That’s impressive, considering the startup company behind it has only been going for a few years. The major established commercial players in the field of bibliographic databases (Web of Knowledge and Scopus) currently have around 40 million documents, so if Mendeley continues to grow at this rate, they’ll be more popular than Jesus (and Elsevier and Thomson) before you can say “bibliography”. But to get a real handle on how big Mendeley is, we need to know how many of those 36 million documents are unique, because lots of duplicated documents will inflate the overall head count. (more…)
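
To make that question concrete, here is a minimal sketch of the kind of de-duplication involved: collapse records that share a normalised DOI, or failing that a crude title-plus-year fingerprint, then compare the unique count with the raw count. This is not Mendeley’s actual algorithm, and the input file and its column names are hypothetical.

```python
# A minimal, hypothetical de-duplication sketch (not Mendeley's real method):
# count how many records in a crowd-sourced catalogue look unique.
import csv
import re

def fingerprint(record):
    """Prefer a normalised DOI; otherwise fall back to a rough title+year key."""
    doi = (record.get("doi") or "").strip().lower()
    if doi:
        return "doi:" + doi
    title = re.sub(r"[^a-z0-9]+", " ", (record.get("title") or "").lower()).strip()
    return "title:" + title + "|" + (record.get("year") or "")

def count_unique(path):
    seen = set()
    total = 0
    with open(path, newline="", encoding="utf-8") as f:
        for record in csv.DictReader(f):
            total += 1
            seen.add(fingerprint(record))
    return total, len(seen)

if __name__ == "__main__":
    total, unique = count_unique("mendeley_catalogue_sample.csv")  # hypothetical export
    print(total, "records, of which roughly", unique, "look unique")
```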

July 27, 2010

Twenty million papers in PubMed: a triumph or a tragedy?

A quick search on pubmed.gov today reveals that the freely available American database of biomedical literature has just passed the 20 million citations mark*. Should we celebrate or commiserate passing this landmark figure? Is it a triumph or a tragedy that PubMed® is the size it is? (more…)
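
If you want to reproduce that headline figure programmatically rather than eyeballing the website, NCBI’s E-utilities will return a record count for any search. The sketch below is one way to do it; the all-inclusive date-range query term is an assumption on my part, not the search used above.

```python
# A rough, programmatic version of "a quick search on pubmed.gov": ask NCBI's
# E-utilities (esearch) how many records match an all-inclusive date range.
import json
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    "db": "pubmed",
    "retmode": "json",
    "retmax": 0,
    "term": "1800:2100[dp]",  # publication dates 1800-2100, i.e. effectively everything
})
url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?" + params
with urllib.request.urlopen(url) as response:
    result = json.loads(response.read().decode("utf-8"))

print("PubMed currently holds", result["esearchresult"]["count"], "records")
```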

July 26, 2010

Please Sir, I want some more Science!

Science Online London (#solo10, September 3-4, 2010) is an annual gathering of people interested in the use of web technologies for scientific collaboration and communication. The organisers at Mendeley, Nature Network and The British Library continue to do a great job of hosting this important gathering, now in its third year.

I’ve been the last two years (2008 and 2009), and it has been worth attending because of the mix of speakers, delegates and topics covered.

See the impressive full programme here. Reading through the speaker list, I wondered: where are all the scientists at Science Online this year? At the time of writing, 12 of the 13 speakers are politicians, publishers or journalists, with scientist Peter Murray-Rust the odd man out. I’ve nothing against politicians, publishers or journalists, but it would be great to have a more balanced event this year. The UK is full of high-profile scientists with blogs who would probably jump at the opportunity to speak at this event.

As the skeptical Sid Rodrigues said, “this looks like fun, needs more nerds though”…

July 15, 2010

How many journal articles have been published (ever)?


According to some estimates, there are fifty million articles in existence as of 2010. Picture of a fifty million dollar note by ZeroOne on Flickr.

Earlier this year, the scientific journal PLoS ONE published its 10,000th article. Ten thousand articles is a lot of papers, especially when you consider that PLoS ONE only started publishing four short years ago, in 2006. But scientists have been publishing in journals for at least 350 years [1], so it might make you wonder: how many articles have been published in scientific and learned journals since time began?

If we look at PubMed Central, a full-text archive of journal articles freely available to all, it currently holds over 1.7 million articles. But these articles are only a tiny fraction of the total literature, since a lot of the rest is locked up behind publishers’ paywalls and is inaccessible to many people. (more…)

June 22, 2010

Impact Factor Boxing 2010

[This post is part of an ongoing series about impact factors. See this post for the latest impact factors, published in 2012.]

Roll up, roll up, ladies and gentlemen: Impact Factor Boxing is here again. As with last year (2009), the metrics used in this combat sport are already a year out of date. But this doesn’t stop many people from writing about impact factors, and it’s been an interesting year [1] for the metrics used by many to judge the relative value of scientific work. The Public Library of Science (PLoS) launched their article-level metrics within the last year, following the example of BioMed Central’s “most viewed” articles feature. Next to these new-style metrics, the traditional impact factors live on, despite their limitations. Critics like Harold Varmus have recently pointed out that (quote):

“The impact factor is a completely flawed metric and it’s a source of a lot of unhappiness in the scientific community. Evaluating someone’s scientific productivity by looking at the number of papers they published in journals with impact factors over a certain level is poisonous to the system. A couple of folks are acting as gatekeepers to the distribution of information, and this is a very bad system. It really slows progress by keeping ideas and experiments out of the public domain until reviewers have been satisfied and authors are allowed to get their paper into the journal that they feel will advance their career.”

To be fair though, it’s not the metric that is flawed, more the way it is used (and abused) – a subject covered in much detail in a special issue of Nature at http://nature.com/metrics [2,3,4,5]. It’s much harder than it should be to get hold of these metrics, so I’ve reproduced some data below (fair use? I don’t know, I am not a lawyer…) to minimise the considerable frustrations of using Journal Citation Reports (JCR).

Love them, loathe them, use them, abuse them, ignore them or obsess over them… here’s a small selection of the 7,347 journals tracked in JCR, ordered by increasing impact factor.

2009 data from isiknowledge.com/JCR; the two right-most columns are Eigenfactor™ Metrics.

| Journal Title | Total Cites | Impact Factor | 5-Year Impact Factor | Immediacy Index | Articles | Cited Half-life | Eigenfactor™ Score | Article Influence™ Score |
| RSC Integrative Biology | 34 | 0.596 | – | – | 57 | – | 0.00000 | – |
| Communications of the ACM | 13853 | 2.346 | 3.050 | 0.350 | 177 | >10.0 | 0.01411 | 0.866 |
| IEEE Intelligent Systems | 2214 | 3.144 | 3.594 | 0.333 | 33 | 6.5 | 0.00447 | 0.763 |
| Journal of Web Semantics | 651 | 3.412 | – | 0.107 | 28 | 4.6 | 0.00222 | – |
| BMC Bioinformatics | 10850 | 3.428 | 4.108 | 0.581 | 651 | 3.4 | 0.07335 | 1.516 |
| Journal of Molecular Biology | 69710 | 3.871 | 4.303 | 0.993 | 916 | 9.2 | 0.21679 | 2.051 |
| Journal of Chemical Information and Modeling | 8973 | 3.882 | 3.631 | 0.695 | 266 | 5.9 | 0.01943 | 0.772 |
| Journal of the American Medical Informatics Association (JAMIA) | 4183 | 3.974 | 5.199 | 0.705 | 105 | 5.7 | 0.01366 | 1.585 |
| PLoS ONE | 20466 | 4.351 | 4.383 | 0.582 | 4263 | 1.7 | 0.16373 | 1.918 |
| OUP Bioinformatics | 36932 | 4.926 | 6.271 | 0.733 | 677 | 5.2 | 0.16661 | 2.370 |
| Biochemical Journal | 50632 | 5.155 | 4.365 | 1.262 | 455 | >10.0 | 0.10896 | 1.787 |
| BMC Biology | 1152 | 5.636 | – | 0.702 | 84 | 2.7 | 0.00997 | – |
| PLoS Computational Biology | 4674 | 5.759 | 6.429 | 0.786 | 365 | 2.5 | 0.04369 | 3.080 |
| Genome Biology | 12688 | 6.626 | 7.593 | 1.075 | 186 | 4.8 | 0.08005 | 3.586 |
| Trends in Biotechnology | 8118 | 6.909 | 8.588 | 1.407 | 81 | 6.4 | 0.02402 | 2.665 |
| Briefings in Bioinformatics | 2898 | 7.329 | 16.146 | 1.109 | 55 | 5.3 | 0.01928 | 5.887 |
| Nucleic Acids Research | 95799 | 7.479 | 7.279 | 1.635 | 1070 | 6.5 | 0.37108 | 2.963 |
| PNAS | 451386 | 9.432 | 10.312 | 1.805 | 3765 | 7.6 | 1.68111 | 4.857 |
| PLoS Biology | 15699 | 12.916 | 14.798 | 2.692 | 195 | 3.5 | 0.17630 | 8.623 |
| Nature Biotechnology | 31564 | 29.495 | 27.620 | 5.408 | 103 | 5.7 | 0.14503 | 11.803 |
| Science | 444643 | 29.747 | 31.052 | 6.531 | 897 | 8.8 | 1.52580 | 16.570 |
| Cell | 153972 | 31.152 | 32.628 | 6.825 | 359 | 8.7 | 0.70117 | 20.150 |
| Nature | 483039 | 34.480 | 32.906 | 8.209 | 866 | 8.9 | 1.74951 | 18.054 |
| New England Journal of Medicine | 216752 | 47.050 | 51.410 | 14.557 | 352 | 7.5 | 0.67401 | 19.870 |

Maybe next year Thomson Reuters, who publish this data, could start attaching large government health warnings (like those on cigarette packets) and long disclaimers to it? WARNING: Abusing these figures can seriously damage your Science – you have been warned!

References

  1. Rizkallah, J., & Sin, D. (2010). Integrative Approach to Quality Assessment of Medical Journals Using Impact Factor, Eigenfactor, and Article Influence Scores PLoS ONE, 5 (4) DOI: 10.1371/journal.pone.0010204
  2. Abbott, A., Cyranoski, D., Jones, N., Maher, B., Schiermeier, Q., & Van Noorden, R. (2010). Metrics: Do metrics matter? Nature, 465 (7300), 860-862 DOI: 10.1038/465860a
  3. Van Noorden, R. (2010). Metrics: A profusion of measures Nature, 465 (7300), 864-866 DOI: 10.1038/465864a
  4. Braun, T., Osterloh, M., West, J., Rohn, J., Pendlebury, D., Bergstrom, C., & Frey, B. (2010). How to improve the use of metrics Nature, 465 (7300), 870-872 DOI: 10.1038/465870a
  5. Lane, J. (2010). Let’s make science metrics more scientific Nature, 464 (7288), 488-489 DOI: 10.1038/464488a

[Creative Commons licensed picture of Golden Gloves Prelim Bouts by Kate Gardiner ]

April 30, 2010

Daniel Cohen on The Social Life of Digital Libraries

Day 106 - I am a librarian by cindiann, on Flickr

Daniel Cohen is giving a talk in Cambridge today on The Social Life of Digital Libraries, abstract below:

The digitization of libraries had a clear initial goal: to permit anyone to read the contents of collections anywhere and anytime. But universal access is only the beginning of what may happen to libraries and researchers in the digital age. Because machines as well as humans have access to the same online collections, a complex web of interactions is emerging. Digital libraries are now engaging in online relationships with other libraries, with scholars, and with software, often without the knowledge of those who maintain the libraries, and in unexpected ways. These digital relationships open new avenues for discovery, analysis, and collaboration.

Daniel J. Cohen is an Associate Professor at George Mason University and has been involved in the development of the Zotero extension for the Firefox browser that enables users to manage bibliographic data while doing online research. Zotero [1] is one of many new tools [2] that are attempting to add a social dimension to scholarly information on the Web, so this should be an interesting talk.

If you’d like to come, the talk starts at 6pm in Clare College, Cambridge, and you need to RSVP by email via the talks.cam.ac.uk page.

References

  1. Cohen, D.J. (2008). Creating scholarly tools and resources for the digital ecosystem: Building connections in the Zotero project. First Monday 13 (8)
  2. Hull, D., Pettifer, S., & Kell, D. (2008). Defrosting the Digital Library: Bibliographic Tools for the Next Generation Web PLoS Computational Biology, 4 (10) DOI: 10.1371/journal.pcbi.1000204

April 28, 2010

Philip Campbell on Science Facts and Frictions

Philip Campbell: Will you pay for good online stuff, Dammit? (Libraries do, thankfully)

As part of the Gates Distinguished Lecture Series, Nature editor Philip Campbell is giving a public lecture at 6.30pm tonight, titled Science – facts and frictions, at Emmanuel College, Cambridge. The abstract and text below are reproduced from talks.cam.ac.uk:

‘Climategate’, MMR vaccine, GM crops, stem cells – these are examples of public debates in which science and scientists have come under attack. And yet the processes of science were no different in kind from those in calmer territories, such as cancer research, where the public not only trusts researchers but directly donates half a billion pounds every year in their support. Why are there such contrasts? And what can scientists and others do in response to such attacks? The talk will offer some suggestions.

As Editor-in-Chief of Nature, Philip Campbell heads a team of about 90 editorial staff around the world. Dr. Campbell takes direct editorial responsibility for the content of Nature editorials, writing some of them. He is the seventh [1] Editor-in-Chief since the journal was launched in 1869.

Dr. Campbell’s role as Editor-in-Chief of Nature publications (of which there are many editorially independent journals and several websites) is to ensure that the quality and integrity appropriate to the Nature name are maintained, and that appropriate individuals are appointed as chief editors. He sits on the executive board of Nature’s parent company, Nature Publishing Group.

According to the accompanying press release from the University, Campbell:

“is particularly interested in groups of scientists who regularly produce blogs in order to help the public and journalists gain access to their perspectives on scientific developments and controversies.”

So, if you’re in or near Cambridge tonight, this talk is open to the public and looks like it will be enlightening.

[Update: some interesting things were mentioned in this talk, in no particular order – see the references below.]

References

  1. Philip Campbell (1995). Postscript from a new hand Nature, 378 (6558), 649-649 DOI: 10.1038/378649b0
  2. Daniel Sarewitz (2004). How science makes environmental controversies worse Environmental Science & Policy, 7 (5), 385-403 DOI: 10.1016/j.envsci.2004.06.001
