27 June 2021

The paradox of MDPI

One of the most puzzling trends in scientific publishing for the last couple of years has been the status of the open access publisher MDPI.

On the one hand, some people I know and respect have published their papers there. I’ve reviewed for some journals, and have seen that authors do make requested changes and there is some real peer review going on.

On the other hand, few other publishers today seem so actively engaged in pissing off the people they work with. Scientists complain about constant requests to review, particularly in areas far outside their domain expertise – an easily avoided and amateurish mistake. 

And MDPI’s boss seems like a dick.

A few people have been trying to make sense of this paradox. Dan Brockington wrote a couple of analyses over the last two years (here, here) that were broadly supportive of what MDPI has done.

Today, I stumbled across this post by Paolo Crosetto that takes a long view of MDPI’s record. It prompted another analysis by Brockington here.

Both are longish reads, but are informed by lots of data, and both are nuanced, avoiding simple “good or bad” narratives. I think one of the most interesting graphs is this one in Crosetto’s post on processing turnarounds:

Graph of time from submission to acceptance at MDPI journals.  2016 shows wide variation from journal to journal; 2020 data shows little variation.

There used to be variation in how long it took to get a paper accepted in an MDPI journal. Now there is almost no spread in how long it takes. That sort of change seems highly unlikely to happen by accident. It looks a lot like a top-down directive from the publisher, putting a thumb on the decision-making process, not the result of editors running their journals independently.

Both Crosetto and Brockington acknowledge that there is good research in some journals. 

The question seems to be whether that good reputation is being thrown away by the publisher’s pursuit of more articles, particularly in “Special Issues.” Crosetto suspects that MDPI is scared and wants to extract as much money (or “rent,” as he calls it) from as many people as fast as possible. Brockington says that this may or may not be a problem. It all depends on something rather unpredictable: scientists’ reactions.

Scientists may be super annoyed by the spammy emails, but they might be happier about fast turnaround times (which people want to an unrealistic degree) combined with a high chance of acceptance.

If the last decade or so in academic publishing has taught us anything, it’s that there seems to be no upper limit for scientists’ desire for venues in which to publish their work.

PLOS ONE blew open the doors and quickly became the world’s biggest journal by a long way. But even though it published tens of thousands of papers in a single year, PLOS ONE clones cropped up and some even managed to surpass it in the number of papers published per year.

MDPI is hardly alone in presenting bigger menus for researchers to choose where to publish. Practically every publisher is expanding its list of journals at a decent clip. I remember when Nature was one journal, not a brand slapped across the titles of over 50 journals.

MDPI is becoming a case study in graylisting. As much as we crave clear categories for journals as “real” (whitelists) or “predatory” (blacklists), the reality can be complicated.

Update, 1 July 2021: A poll I ran on Twitter indicates deep skepticism of MDPI, with lots of people saying they would not publish there.

Would you submit an article to an MDPI journal?

I have done: 9.4%
I would do: 3.9%
I would not: 50%
Show results: 36.7%

Update, 21 August 2021: A new paper by Oviedo-García analyzes MDPI’s publishing practices. It makes note of many of the features in the blog posts above: the burgeoning number of special issues, the consistently short review times across all journals. Oviedo-García basically calls MDPI a predatory publisher.

This earned a response from MDPI, which unsurprisingly disagrees.

Update, 7 March 2022: Mark Hanson lays out more issues with MDPI in this Twitter thread. A few points that he brings up that I have not seen before:

Articles in MDPI journals have unusually high numbers of self-citations.

His blog post is also worth checking out.

Update, 13 October 2023: This tweet (https://twitter.com/eggersnsf/status/1557273726571487232) shows data indicating that MDPI is shooting ahead of Frontiers, PLOS, and Hindawi in terms of articles published. So clearly they are offering some sort of service that people want.

Update, 10 November 2022: Dan Brockington has an updated analysis of MDPI journals.

External links

An open letter to MDPI publishing

MDPI journals: 2015 to 2019

Is MDPI a predatory publisher?

MDPI journals: 2015 to 2020 

Oviedo-García MÁ. 2021. Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI). Research Evaluation: in press. https://doi.org/10.1093/reseval/rvab020

Comment on: 'Journal citation reports and the definition of a predatory journal: The case of the Multidisciplinary Digital Publishing Institute (MDPI)' from Oviedo-García

Related posts 

My resolve not to shoot the hostage is tested

Graylists for academic publishing

25 June 2021

What the American furore over critical race theory means for science education

People shouting at school board meeting
 

Well, that escalated quickly.

In the last few months, “critical race theory” has gone from an academic idea mostly (but not exclusively) taught in law school to a flashpoint item that American right wingers are willing to go to the mattresses over. 

There has been ample shouting at school board meetings.

Nikole Hannah-Jones was denied tenure after winning a Pulitzer Prize and a MacArthur award for the 1619 Project. Not because her colleagues voted against her, but because the normally quiet Board of Trustees voted against her.

Critical race theory is not my area of expertise (please forgive me, social science and humanities scholars, for musing here), but watching this feels very much like something I am familiar with and have watched closely for a long time: the fight over teaching evolution.

I noted on Twitter that a lot of the same forces that traditionally mobilize against the teaching of evolution are now mobilizing against the teaching of critical race theory. Laws are successfully getting passed in state legislatures. And while one side is willing to show up in large numbers and shout at school boards, the other... has not been so active.

There are a lot of reasons for the lopsided reaction, but I can’t help but wonder if part of why people who are not freaking out over critical race theory are not bringing the same fire to the fight is that they look at the laws being passed and think, “That’s going to court and the laws will get overturned.”

A few things make this sound plausible on the face of it. 

Critical race theory isn’t taught in K-12 schools, so prohibiting the teaching of something that isn’t taught is bizarre.

But more to my point here, this is what happened with the teaching of evolution. There was often widespread political and public support for laws that tried to regulate the teaching of evolution. But time and again in the US, the courts have said, “No, those anti-evolution laws are unconstitutional.”

I do not feel at all confident that an American court would strike down the laws prohibiting the teaching of critical race theory.

While regulating or banning the teaching of evolution and banning the teaching of critical race theory are both pet projects of the political right in the US, the arguments over evolution are inextricably intertwined with religious beliefs. A government promoting particular religions (mainly fundamentalist Protestant Christianity) runs afoul of the establishment clause of the first amendment of the US constitution. That has been a key legal aspect of all the cases.

The opposition to critical race theory is not as clearly driven by religion, which makes the legal issues very different. Even if the end result – states dictating what can and cannot be taught – looks very much the same.

This is why academics of any stripe, including my fellow scientists, need to be paying very close attention to how these fights over critical race theory play out. 

University instructors have mostly taken it for granted that they can teach subjects as they see fit. The fights over critical race theory are a test case. Climate science is another area where many on the right would probably love to dictate how it is taught. Ditto anything around sexual and gender diversity.

If those laws are not smacked down fast and hard, whether in the courts or by political and public action, this could be the start of a sustained squeeze on how universities teach in the US. And America’s reputation for its universities in the world will suffer.

Picture from here.

06 June 2021

The week of IAmSciComm, 6 June 2021!

I have just taken over the reins of the @IAmSciComm rotating curator Twitter account! This is my second time hosting, and I am gratified to be asked back.

Here is a rough schedule for the week.


  • Monday, 7 June: Show me a poster, graphic, or dataviz! 
  • Tuesday, 8 June: Why streaks matter!
  • Wednesday, 9 June: From blog to book! 
  • Thursday, 10 June: Posters for everyone! 
  • Friday, 11 June: Posters reviewed!
  • Saturday, 12 June: The randomizer!

Join me, won’t you?

Related posts

The IAmSciComm threads

External links

IAmSciComm home page

04 June 2021

Experiments don’t always lead to papers

Drugmonkey tweeted

Telling academic trainees in the biomedical sciences to put their heads down, do a lot of experiments and the papers will just emerge as a side-product is the worst sort of gaslighting.

He’s right, as he often is, but it bears examining why that’s the case.

First, not all experiments should be published. Experiments can have design flaws like uncontrolled variables, small sample sizes, and all the other points of weakness that we learn to find and attack in journal club in graduate school.

Second, even if an experiment is technically sound and therefore publishable in theory, it may not be publishable in practice. In many fields, it’s almost impossible to publish a single experiment, because the bar for publication is high. People usually want to see a problem tackled by multiple experiments. The amount of data that is expected in a publishable paper has increased and probably will continue to do so.

The bar for what is considered “publishable” is malleable. We have seen that in the last two years with the advent of COVID-19. There was an explosion of scientific papers, many of which probably would not have been publishable if there wasn’t a pandemic going on. People were starved for information and researchers responded in force. You have to understand what is interesting to your research community.

Third, experimental design is a very different skill from writing and scientific publication. 

Fourth, it’s not a given that everyone feels the same drive to publish. Different people have different work priorities. For instance, I saw a lot of my colleagues who had big lab groups with a lot of students who churned through regularly. Those labs generated a lot of posters and a lot of master’s theses. According to our department guidelines, theses were supposed to represent publishable work.

But all of that didn’t turn into papers consistently. I think people got positive feedback for having lots of students (and looking “busy”) and came to view “I have a lot of students” as their personal measure of success. Or they just got into the habit of thinking, “I’ll write up that work later,” or, “Just one more experiment so we can get it into a better journal.”

I had fewer students and fewer master’s theses written than my colleagues, but I published substantially more papers. I say this not to diss my colleagues or to brag, but because it’s a fact: I made publication a priority.

Publishing papers requires very intentional, deliberate planning. It requires understanding the state of the art in the research literature. It requires setting aside time to write the papers. It requires understanding what journals publish what kinds of results. Just doing experiments in the lab will not cause papers to fall down like autumn leaves being shaken loose from trees.