A talk I’ll be doing at the Wikipedia Science Conference in London on 2nd September.
There is growing interest in Wikipedia, Wikidata, Commons, and other Wikimedia projects as platforms for opening up the scientific process. The first Wikipedia Science Conference will discuss activities in this area at the Wellcome Collection Conference Centre in London on the 2nd & 3rd September 2015. There will be keynote talks from Wendy Hall (@DameWendyDBE) and Peter Murray-Rust (@petermurrayrust) and many other presentations including:
- Daniel Mietchen (@EvoMRI), National Institutes of Health: Wikipedia and scholarly communication
- Alex Bateman (@AlexBateman1), European Bioinformatics Institute: Using Wikipedia to annotate scientific databases
- Geoffrey Bilder (@GBilder), CrossRef: Using DOIs in Wikipedia
- Richard Pinch (@IMAMaths), Institute of Mathematics and its Applications: Wikimedia versus academia: a clash of cultures
- Andy Mabbett (@PigsOnTheWing), Royal Society of Chemistry / ORCID: Wikipedia, Wikidata and more – how can scientists help?
- Darren Logan (@DarrenLogan), Wellcome Trust Sanger Institute: Using scientific databases to annotate Wikipedia
- Dario Taraborelli (@ReaderMeter), Wikimedia & Altmetrics: Citing as a public service
- … and many more
I’ll be doing a talk on “Improving the troubled relationship between scientists and Wikipedia” (see picture top right) with help from John Byrne, who has been a Wikipedian in Residence at the Royal Society and Cancer Research UK.
How much does finding out more about all this wiki-goodness cost? An absolute bargain at just £29 for two days – what’s not to like? Tickets are available on Eventbrite; register now, while they last.
- Teplitskiy, M., Lu, G., & Duede, E. (2015). Amplifying the Impact of Open Access: Wikipedia and the Diffusion of Science. Wikipedia Workshop at the 9th International Conference on Web and Social Media (ICWSM), Oxford, UK. arXiv: 1506.07608v1
[This post is part of an ongoing series about impact factors. See this post for the latest impact factors published in 2012.]
Roll up, roll up, ladies and gentlemen, Impact Factor Boxing is here again. As with last year (2009), the metrics used in this combat sport are already a year out of date. But this doesn’t stop many people from writing about impact factors, and it’s been an interesting year for the metrics used by many to judge the relative value of scientific work. The Public Library of Science (PLoS) launched their article-level metrics within the last year, following the example of BioMedCentral’s “most viewed” articles feature. Next to these new-style metrics, the traditional impact factors live on, despite their limitations. Critics like Harold Varmus have recently pointed out that:
“The impact factor is a completely flawed metric and it’s a source of a lot of unhappiness in the scientific community. Evaluating someone’s scientific productivity by looking at the number of papers they published in journals with impact factors over a certain level is poisonous to the system. A couple of folks are acting as gatekeepers to the distribution of information, and this is a very bad system. It really slows progress by keeping ideas and experiments out of the public domain until reviewers have been satisfied and authors are allowed to get their paper into the journal that they feel will advance their career.”
To be fair though, it’s not the metric itself that is flawed so much as the way it is used (and abused) – a subject covered in much detail in a special issue of Nature at http://nature.com/metrics [2,3,4,5]. It’s much harder than it should be to get hold of these metrics, so I’ve reproduced some data below (fair use? I don’t know, I am not a lawyer…) to minimise the considerable frustrations of using Journal Citation Reports (JCR).
Love them, loathe them, use them, abuse them, ignore them or obsess over them … here’s a small selection of the 7347 journals that are tracked in JCR ordered by increasing impact.
Maybe next year Thomson Reuters, who publish this data, could start attaching large government health warnings (like those on cigarette packets) and long disclaimers to it? WARNING: Abusing these figures can seriously damage your science – you have been warned!
- Rizkallah, J., & Sin, D. (2010). Integrative Approach to Quality Assessment of Medical Journals Using Impact Factor, Eigenfactor, and Article Influence Scores. PLoS ONE, 5(4). DOI: 10.1371/journal.pone.0010204
- Abbott, A., Cyranoski, D., Jones, N., Maher, B., Schiermeier, Q., & Van Noorden, R. (2010). Metrics: Do metrics matter? Nature, 465(7300), 860–862. DOI: 10.1038/465860a
- Van Noorden, R. (2010). Metrics: A profusion of measures. Nature, 465(7300), 864–866. DOI: 10.1038/465864a
- Braun, T., Osterloh, M., West, J., Rohn, J., Pendlebury, D., Bergstrom, C., & Frey, B. (2010). How to improve the use of metrics. Nature, 465(7300), 870–872. DOI: 10.1038/465870a
- Lane, J. (2010). Let’s make science metrics more scientific. Nature, 464(7288), 488–489. DOI: 10.1038/464488a
[Creative Commons licensed picture of Golden Gloves Prelim Bouts by Kate Gardiner]