Hoax Highlights Publishing Problem

Bogus research papers get accepted by scores of open-access academic journals

By Mark Rosenzweig, Editor in Chief


Scientific papers based on taxpayer-funded research are becoming more readily available, thanks to various initiatives to spur open access to such information, as I noted last month ("America Opens Up to Open Access"). However, what a Harvard researcher discovered in a "sting operation" should give readers of at least some open-access journals pause. He sent spoof papers to hundreds of such journals; far more accepted the bogus papers than rejected them.



Peer review has long served as the bulwark for maintaining the quality of papers published in academic journals. The sting showed that many open-access journals, even those claiming to peer review submissions, don't assess the quality of papers adequately.

The growing push for open-access publishing has prompted a proliferation in the number of such journals. Many of these derive their income from fees paid by authors rather than from subscriptions, so they have an incentive to accept as many papers as they can. Worse, some are put out by what Jeffrey Beall of the University of Colorado, Denver, calls "predatory publishers," firms with questionable practices. (Access his list of predatory publishers.)

In the sting, John Bohannon, a visiting researcher at Harvard and contributing correspondent for Science, sent out 304 versions of a paper on a "wonder drug," about ten a week from January through August 2013 — including 137 to publishers on Beall's list. He used a variety of made-up author and institution names that implied the papers came from Africa, and deliberately crafted the text so it appeared written by someone who's not a native English speaker.

The bogus papers all discussed a molecule X from a lichen species Y inhibiting the growth of a cancer cell Z; while the X, Y and Z differed in the papers, the scientific content was identical. "The goal was to create a credible but mundane scientific paper, one with such grave errors that a competent peer reviewer should easily identify it as flawed and unpublishable," he notes. A summary of the sting was published in early October in Science (www.sciencemag.org/content/342/6154/60.full). When Bohannon wrote that piece, 157 journals had accepted the papers, while 98 had rejected them.

"Of the 106 journals that discernibly performed any review, 70% ultimately accepted the paper. Most reviews focused exclusively on the paper's layout, formatting and language. This sting did not waste the time of many legitimate peer reviewers. Only 36 of the 304 submissions generated review comments recognizing any of the paper's scientific problems. And 16 of those papers were accepted by the editors despite the damning reviews," he explains.

Acceptance by predatory publishers wasn't a surprise. However, Bohannon notes: "The paper was accepted by journals hosted by industry titans Sage and Elsevier. The paper was accepted by journals published by prestigious academic institutions such as Kobe University in Japan. It was accepted by scholarly society journals. It was even accepted by journals for which the paper's topic was utterly inappropriate…"

Critics complain that because the sting was aimed only at open-access journals, it doesn't offer insights on the performance of such journals versus traditional ones. A more useful approach, they argue, would have been to send the spoof articles to traditional journals as well, and then to compare acceptance/rejection rates. That certainly would have provided a better perspective on whether the problem primarily afflicts open-access journals or undermines journal publishing in general.

Unfortunately, the sting does suggest a new meaning for the term "peerless" at least when it comes to journal articles.


MARK ROSENZWEIG is Chemical Processing's Editor in Chief. You can e-mail him at mrosenzweig@putman.net.




Comments

  • This might have been a good moment to share with us, your readers, how you review articles submitted for publication. Anyway, good magazine. I read several articles a month.

    Reply

  • You raise a good point.

    We do not use peer review. Rather, I make the decisions about manuscripts based on my chemical engineering background and more than 40 years of experience in technical publishing. In rare instances, I may ask a member of CP's Editorial Board for an assessment. (The Board members are chemical engineers involved in a variety of functions and in several segments of industry.)

    I judge papers on whether they provide impartial, practical guidance or, in the case of a case history, whether they cover an interesting or innovative use of technology at a particular plant -- in contrast to focusing on the originality and value of research, which is what peer reviewers typically assess.

    Reply
