13 April 2015

Why can’t I cite Mythbusters?

I’m sorry, Adam and Jamie. I tried.

In our most recent paper on nociception, one of the major points is that not all animals react to potentially nasty stimuli the same way. And it turns out that there’s a very nice demonstration of that idea on Mythbusters. In the 2008 Shark Week “Jawsome Special,” the Mythbusters did a segment called “Spicy Salsa Shark Shield.”

They showed that sharks were not deterred by the presence of capsaicin-laden material.

I included this in the references as:

Dallow, A. and Lentle, T. (2008). Mythbusters: Shark Week Special 2, Episode 102. Discovery Channel, USA: Beyond Entertainment Limited.

The journal copy editors wrote:

Only peer-reviewed references are permitted in the reference list.

This policy is nowhere to be found in the journal’s instructions to authors.

I took it out, because there were other references that made the point about how responses to noxious stimuli varied from species to species. It wasn’t worth fighting over, and I think I’d have lost.

Regardless of the esteem in which you hold the work done by Mythbusters, the journal’s citation policing raises bigger issues. A current trend in academic publishing is to broaden the kinds of research products that people can get academic credit for. Why should only publications “count,” and not sharing a database, or writing useful code?

Consider figshare, the cloud-based data archiving service. One of their big selling points as a data repository is that they generate digital object identifiers (DOIs) for material submitted there. The DOI itself is not the selling point, but they strongly imply that it makes whatever is archived on figshare citable in scholarly publications. Here’s the top of their “about” page; making things citable is the first item on the list (my emphasis):

figshare is a repository where users can make all of their research outputs available in a citable, shareable and discoverable manner.

But all those hopes can be dashed by a single sentence from a journal. “Nope, we only take peer reviewed papers.”

Interestingly, the journal let us keep in a couple of conference abstracts. The abstracts were published in a peer-reviewed society journal, but the abstracts themselves were not peer-reviewed.

I’ve noted before that some journals have had a tradition of allowing people to cite “grey literature,” like conference abstracts, newspaper articles, web pages, or tweets. I think that is a positive thing. I worry that this sort of “journals citing (and therefore promoting) journals” policy might become more common as journals compete for scientific products. Such policies could hamper the development of academic publishing innovations.

Even worse, a “journal only” policy has the potential to force authors into intellectual dishonesty. “We got this idea from another lab’s data on figshare, but we can’t say that in the paper, so we’ll just have to say something else.”

Authors, and possibly reviewers, should be determining what is a legitimate citation on a case-by-case basis, rather than a journal setting a blanket policy favourable to itself.


Mike Taylor said...

The journal copy editors wrote: "Only peer-reviewed references are permitted in the reference list."

As soon as I read that, I thought to myself "I bet that's not true". And sure enough, a few lines later you mentioned the non-peer-reviewed abstracts that they accepted.

So the journal editors were, first, lying (or let us more charitably say they were mistaken).

But if they'd been correct, they would still have been wrong. Because that policy makes no sense at all. True facts are true, wherever they're from. And false statements crop up all the time in peer-reviewed literature. Anyone who thinks they can use the peer-reviewed status of a publication as a badge of quality is at best naive, at worst outright deluded.

Pretty much every journal allows the citation of pers. comms. Perhaps you should have cited the shark finding as "J. Hyneman and A. Savage, pers. comm., 2008". Admittedly, they personally communicated it to a few million other people at the same time, but that's neither here nor there.


As an aside: my chapter on the history of sauropod research, in the Geological Society's 2010 book Dinosaurs and Other Extinct Saurians: A Historical Perspective, included the following reference, which the reviewers and editors seemed cool with:

Chapman, G. & Cleese, J. 1989. Anne Elk's Theory on Brontosauruses. In: Chapman, G., Cleese, J., Gilliam, T., Idle, E., Jones, T. & Palin, M. (eds) Just the Words, Volume 2. Methuen, London, 118-120.

Heinrich Mallison said...

Idiotically, practically any journal in the world will accept "pers. obs." or "own unpubl. data". Gargh!

Mike Taylor said...

Right. As in the infamous Note 20 of Stevens and Parrish 1999 -- the unpublished (and still unpublished 16 years later) data that is fundamental to their whole approach. Still, you can't expect high standards from Science, can you?

Heinrich Mallison said...

Mike, what do you expect of a journal that publishes extended abstracts only? ;)

Mike Taylor said...

We are after all talking about a journal that routinely relegates the description part of descriptive papers to the footnotes. See for example Note 5 of Sereno et al. 1999. In another decade we're going to look back and wonder how we ever thought publications like Science could be suitable venues for actual work.