What would that strong support be? According to this story, it’s whether authors have published a scientific paper about media violence.
Read that carefully. It’s not about scientific papers on media violence, it’s about the authors of scientific papers on media violence.
Here’s the deal. There’s a court case. California wants to be able to ban the sale of video games to people under 18 based on the violence content. People get to file amicus briefs to offer their opinions for the court to consider. The authors of this study decide to look for evidence that video games cause violence by examining the scientific credentials of who wrote the court briefs.
They compared whether the amicus brief authors had published papers about media violence. Again, they’re not saying anything about the papers, just whether a brief author has written any in that area. In theory, someone who published a study on media violence that showed no effect, but who argued in favour of the proposed laws, would be counted on that side, even though their research didn’t support a link. (Admittedly, that seems unlikely.)
On both sides of the case, authors with published scientific articles on media violence are in the minority. Those arguing that violent video games are not problematic have a smaller percentage.
Still, most of the people writing briefs don’t have scientific expertise on the subject, which in and of itself is worrying.
But is comparing the percentages valid at all? Who decides who gets to submit amicus briefs? If briefs are submitted voluntarily, there could be any number of biases in the generation and selection of briefs.
Another piece of evidence offered as “strong support” comes from analyzing the impact factor of the journals the brief authors have published in. Impact factors have many problems, but their use here is weird. Again, the authors are not examining the impact factors of journals that published articles on media violence (as far as I can see), but whether the amicus author had published in high-impact journals, ever.
Sorry, but that is not “Strong support for the argument that video game violence is indeed harmful.” It’s barely support at all. If they had said, “Supporters for laws limiting violent video games have more expertise than those opposing such laws,” there would be no problem.
It would have been better to look for peer-reviewed articles cited in the briefs, then rate those articles for the quality of their evidence, using basic criteria like:
- Is the paper actually about video games? (I.e., is it relevant?)
- Was it a randomized, double-blind experiment?
- How big was the sample size?
- How big was the effect size?
- Has the finding been replicated?
- How often has the paper been cited?
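A checklist like the one above could be turned into a crude quality score per cited article. Here’s a minimal sketch; every field name, threshold, and weight is invented for illustration, not taken from any actual rating scheme:

```python
# Hypothetical sketch: scoring a cited article against simple quality criteria.
# All field names, thresholds, and weights are made up for illustration.

def score_article(article):
    """Return a crude quality score for a cited article (higher = stronger evidence)."""
    score = 0
    if article.get("about_video_games"):        # relevance to video games
        score += 1
    if article.get("randomized_double_blind"):  # study design
        score += 2
    if article.get("sample_size", 0) >= 100:    # adequately powered
        score += 1
    if article.get("effect_size", 0.0) >= 0.3:  # non-trivial effect
        score += 1
    if article.get("replicated"):               # independent replication
        score += 2
    if article.get("citations", 0) >= 50:       # community uptake
        score += 1
    return score

# Example: a relevant, replicated experiment with a modest sample
paper = {
    "about_video_games": True,
    "randomized_double_blind": True,
    "sample_size": 120,
    "effect_size": 0.4,
    "replicated": True,
    "citations": 30,
}
print(score_article(paper))  # 7
```

The point isn’t the particular weights; it’s that rating the evidence itself, even crudely, says more than counting who has ever published in the area.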
Credentials are important, but they shouldn’t be a substitute for evidence.
Caveat! The paper describing this research will not be published until May. It is possible that the actual research is better than the story in Science Daily (which I’ve been baffled by before).
Hat tip to Julie Dirksen.