In our most recent paper on nociception, one of the major points is that not all animals react to potentially nasty stimuli the same way. And it turns out that there’s a very nice demonstration of that idea on Mythbusters. In the 2008 Shark Week “Jawsome Special,” the Mythbusters did a segment called “Spicy Salsa Shark Shield.”
They showed that sharks were not deterred by the presence of capsaicin-laden material.
I included this in the references as:
Dallow, A. and Lentle, T. (2008). Mythbusters: Shark Week Special 2, Episode 102. Discovery Channel, USA: Beyond Entertainment Limited.
The journal copy editors wrote:
Only peer-reviewed references are permitted in the reference list.
This policy is nowhere to be found in the journal’s instructions to authors.
I took it out, because there were other references that made the point about how responses to noxious stimuli varied from species to species. It wasn’t worth fighting over, and I think I’d have lost.
Regardless of the esteem you hold for the work done by Mythbusters, the journal’s citation policing raises bigger issues. A current trend in academic publishing is to broaden the kinds of research products that people can get academic credit for. Why should only publications “count,” and not sharing a database, or writing useful code?
Consider figshare, the cloud data archiving service. One of its big selling points as a data repository is that it generates digital object identifiers (DOIs) for anything submitted there. The DOI itself is not the selling point, but they strongly imply that it makes whatever is archived on figshare citable in scholarly publications. Here's the top of their "about" page; making things citable is the first item on the list (my emphasis):
figshare is a repository where users can make all of their research outputs available in a citable, shareable and discoverable manner.
But all those hopes can be dashed by a single sentence from a journal. “Nope, we only take peer reviewed papers.”
Interestingly, the journal let us keep in a couple of conference abstracts. The abstracts were published in a peer-reviewed society journal, but the abstracts themselves were not peer-reviewed.
I’ve noted before that some journals have had a tradition of allowing people to cite “grey literature,” like conference abstracts, newspaper articles, web pages, or tweets. I think that is a positive thing. I worry that this sort of “journals citing (and therefore promoting) journals” policy might become more common as journals compete for scientific products. Such policies could hamper the development of academic publishing innovations.
Even worse, a “journal only” policy has the potential to force authors into intellectual dishonesty. “We got this idea from another lab’s data on figshare, but we can’t say that in the paper, so we’ll just have to say something else.”
Authors, and possibly reviewers, should be determining what is a legitimate citation on a case-by-case basis, rather than a journal setting a blanket policy favourable to itself.