While 57% of the journals without a listing in Journal Citation Reports (JCR) Science accepted the paper, only 7% (n=3) of those in the database did so.
That any journal with an Impact Factor accepted the paper is a black eye not just for the journal, but for Thomson Reuters (the company that owns the Impact Factor). Thomson Reuters reviews journals before adding them to the Web of Knowledge, and one of the things a journal is supposed to demonstrate before it is added is that it is peer reviewed:
Application of the peer-review process is another indication of journal standards and signifies overall quality of the research presented and the completeness of cited references.
Brian goes on:
I compared the 2012 impact factors of listed journals who accepted the paper (n=3) with those who rejected it (n=34). The median impact factor of accepting journals was 0.994 and those who rejected was 1.3075 (p < 0.05 in a two-tailed t-test).
With a sample size of 3, I wouldn’t conclude anything about the predictive power of Impact Factor.
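To see why three data points support so little, here is a minimal sketch in pure Python. The impact factor values are hypothetical, chosen only for illustration (not the real journals' numbers): with n=3, swapping a single journal shifts the group mean by a large fraction of itself and balloons the standard error.

```python
# Illustration only: hypothetical impact factors, not the actual data
# from the sting. Shows how fragile summary statistics are at n = 3.
import statistics

def summary(xs):
    """Mean and standard error of the mean for a sample."""
    m = statistics.mean(xs)
    se = statistics.stdev(xs) / len(xs) ** 0.5
    return m, se

# Three hypothetical accepting journals.
accepting = [0.8, 1.0, 1.2]
m, se = summary(accepting)
print(f"mean = {m:.2f}, SE = {se:.2f}")

# Replace just one journal: the mean jumps and the SE more than triples.
accepting_alt = [0.8, 1.0, 2.4]
m2, se2 = summary(accepting_alt)
print(f"mean = {m2:.2f}, SE = {se2:.2f}")
```

A significance test run on a sample this size is dominated by whichever handful of journals happened to accept, which is why the p < 0.05 above is not worth much.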
What I think the analysis shows is not that Impact Factor is useful, but that making the cut to be included in Web of Knowledge is useful. You might expect this: it’s a peer review system, just for journals instead of articles.
That journal-level peer review failed to screen out every journal with inadequate article-level peer review is another demonstration that peer review is imperfect.
This might also be a good time to remember that no journal comes into being with an Impact Factor. New journals will never establish themselves if scientists are waiting to see if Thomson Reuters thinks the journal is important enough to include in Web of Knowledge.
Open access or vanity press, the Science “sting” edition
Sting operation demonstrates the value of Journal Impact Factors
The Thomson Reuters journal selection process