08 October 2013

Journal sting a black eye for Thomson Reuters

I’m grateful to Brian Wood on the Human Ecology blog for analyzing some of the data from Science magazine’s journal sting. He was curious whether there was any relationship between a journal’s Impact Factor and the acceptance of Bohannon’s hoax paper.

57% of the journals without a listing in (Journal Citation Reports) Science accepted the paper, while only 7% (n=3) of those in the database did so.

That any journal with an Impact Factor accepted the paper is a black eye for Thomson Reuters (the company that owns the Impact Factor), not just for the journals themselves. Thomson Reuters reviews journals before they are added to the Web of Knowledge, and one of the things journals in the Web of Knowledge are supposed to have demonstrated is that they are peer reviewed:

Application of the peer-review process is another indication of journal standards and signifies overall quality of the research presented and the completeness of cited references.

Brian goes on:

I compared the 2012 impact factors of listed journals who accepted the paper (n=3) with those who rejected it (n=34). The median impact factor of accepting journals was 0.994 and those who rejected was 1.3075 (p < 0.05 in a two-tailed t-test).

With a sample size of 3, I wouldn’t conclude anything about the predictive power of Impact Factor.
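To see concretely why n = 3 is so fragile, here is a minimal sketch using hypothetical impact-factor values (chosen only to loosely mimic the reported medians, not Brian Wood’s actual data): with only three values in the accepting group, shifting a single one of them can swing a two-sample t statistic from significant-looking to nowhere near significance.

```python
# Rough illustration with made-up impact factors (NOT the actual data
# from the sting analysis) of how unstable a t-test is when one group
# has only three observations.
from math import sqrt
from statistics import mean, variance  # variance() is the sample variance (ddof=1)

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

# Hypothetical impact factors for the 3 accepting and 34 rejecting journals.
accepted = [0.8, 1.0, 1.2]
rejected = [1.0 + 0.02 * i for i in range(34)]

t_original = welch_t(accepted, rejected)

# Change just one of the three accepting journals' values (1.2 -> 1.6)
# and recompute.
t_perturbed = welch_t([0.8, 1.0, 1.6], rejected)

print(round(t_original, 2), round(t_perturbed, 2))
```

With these made-up numbers, perturbing a single value drags the t statistic from roughly −2.7 (comfortably past the usual significance cutoff) to about −0.8 (nowhere close), which is why a p < 0.05 result resting on three data points tells us little about the predictive power of Impact Factor.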

What I think the analysis shows is not that Impact Factor is useful, but that making the cut to be included in Web of Knowledge is useful. You might expect this: it’s a peer review system, just for journals instead of articles.

That this journal-level peer review failed to screen out every journal with inadequate peer review of articles is one more demonstration that peer review is imperfect.

This might also be a good time to remember that no journal comes into being with an Impact Factor. New journals will never establish themselves if scientists are waiting to see if Thomson Reuters thinks the journal is important enough to include in Web of Knowledge.

Related posts

Open access or vanity press, the Science “sting” edition
Aiming low

External links

Sting operation demonstrates the value of Journal Impact Factors
The Thomson Reuters journal selection process

1 comment:

Paul Vincelli said...

Peer review is certainly flawed. However, peer review is like democracy--a terrible system, but better than any of the current alternatives.

My experience as a scientific author, reviewer, and journal editor is that peer review generally helps to maintain high standards of scientific publication. Sure, some bad papers squeak through, though not too often in high-quality journals (and publishing scientists know which ones those are, because of the quality of the research we find in them). Some good manuscripts get rejected (but if they are good, they will always find a home in another journal).

The thing that scares me is science without peer review. I am active in outreach on climate change, and at least peer review by reputable journals provides a layer of protection from the junk peddled by denialists. (I use this word intentionally. A skeptic thoughtfully considers evidence. A denialist disregards evidence in order to promote a point of view.)

Paul Vincelli
University of Kentucky