This time, we have an article that appeared a few weeks ago from New Scientist. It’s just been picked up in The Washington Post and a Popular Science blog. The piece features the work of Robert Elwood, Robyn Crook (both of whom have appeared on the blog before) and Hans Smid.
It’s an extremely good article that captures a lot of the nuances of the issue. I was surprised, though, to read this paragraph describing Elwood’s work:
(Elwood) started with prawns. After so many years of working with them, he thought he knew what to expect, which was that he would see nothing more than reflex reactions. But to his surprise, when he brushed acetic acid on their antennae, they began grooming the treated antennae with complex, prolonged movements of both front legs. What’s more, the grooming diminished when local anesthetic was applied beforehand.
Elwood is describing work he co-authored back in 2007 (Barr et al., 2007). What this fails to mention is that this effect could not be replicated in other species (Puri and Faulkes 2010). Elwood knows this; he’s mentioned the failure to replicate in his academic papers (Elwood 2011, page 177):
One recent study, however, failed to detect nociceptors in decapod crustaceans and also noted little ability to respond to noxious stimuli (Puri and Faulkes 2010) despite organized responses to noxious chemical and electrical stimuli noted in other studies (Barr et al. 2007; Elwood et al. 2009; Elwood and Appel 2009).
I’m curious as to whether this was mentioned during the interview, or was removed to get the article to the appropriate word count.
How should scientists, and reporters, discuss work that has failed to replicate? The original Barr and colleagues article remains in the scientific literature; failed replication alone is not grounds for retraction. In that sense, the paper by Barr and company has not been definitively shown to be flawed or wrong. But being able to replicate effects is an acid test (no pun intended) for determining if those effects are real and robust.
Reporters often go after quotes from outside experts: “scientists who were not involved in the study.” This is good practice, and represents due diligence on the part of the reporter to find potentially critical viewpoints. But these external experts are usually working from their knowledge of the field, and are generating critiques from theory and first principles. It’s not common for them to have done any experiments on the question themselves.
If it’s good practice to get responses from other researchers making criticisms with no data, surely it’s even better practice to talk about criticisms that are backed up with data published in peer-reviewed journals?
The paragraph I quoted above also does not mention an interpretive problem. “Grooming” is the behaviour that is used to indicate nociception and/or pain in the shrimp. Acid causes an increase in grooming. Local anesthetic applied after acid reduces grooming. So far, so good.
What would you predict if you applied anesthetic alone?
The anesthetic should not be noxious – it is supposed to reduce sensory input, after all – so you might predict no increase in grooming compared to control.
But applying anesthetic alone causes an increase in grooming (Figure 1 in Barr et al. 2007):
I don’t know what to make of this, except that it complicates the story. The shrimp are clearly detecting the anesthetic and responding to it in the same way as the noxious stimuli: increased grooming.
Additional: Author Tamar Stelling replied on Twitter:
Elwood clear. Still: prolonged groom. @ 3 diff. decapod subject to 3 diff. treatm. Might-not-feel-pain component already obvious.
Right, it’s more complicated than story can show. Chose to focus on evidence of Elwood’s total research, not one experiment.
I disagree with Stelling’s characterization of this being “one experiment.” The entire first paper by Barr and colleagues hinges on the notion that acids and bases cause crustaceans to groom, and that this grooming is indicative of nociception.
Stelling’s answers point out the tension between a good narrative and scientific processes. For a journalistic narrative, uncertainty is uncertainty. There is no need for different gradations of uncertainty. The author of a study going, “It could be something else,” outside experts saying, “I don’t think so,” and sitting down, actually running the entire experiment again, and not getting it to work all serve the same purpose in the narrative: saying, “maybe not.”
Scientifically, those forms of uncertainty are not at all equivalent.
Additional, 18 March 2014: Bio News Texas also reports on Crook and Elwood’s work. The article is apparently inspired by, but not a duplicate of, the New Scientist article.
Additional, 25 March 2014: Mike Taylor runs with this, and suggests new kinds of citations.
Barr S, Laming PR, Dick JTA, Elwood RW. 2007. Nociception or pain in a decapod crustacean? Animal Behaviour 75: 745-751. http://dx.doi.org/10.1016/j.anbehav.2007.07.004
Elwood RW. 2011. Pain and suffering in invertebrates? ILAR Journal 52: 175-184. http://dels-old.nas.edu/ilar_n/ilarjournal/52_2/html/v5202Elwood_abstract.html
Puri S, Faulkes Z. 2010. Do decapod crustaceans have nociceptors for extreme pH? PLoS One 5: e10244. http://dx.doi.org/10.1371/journal.pone.0010244
What we know and don’t know about crustacean pain
Crustacean pain is still a complicated issue, despite the headlines
Squished squid, or: noci-ceph-tion
Tenth International Congress for Neuroethology, Day 5
Do invertebrates feel pain?
Do lobsters and other invertebrates feel pain? New research has some answers.
Squids and other invertebrates can probably feel pain
Photo by Twistiti on Flickr; used under a Creative Commons license.