Wikis have been a hot topic this year, and most of the stories about them have been positive, even optimistic. Everybody seems to like wikis, and plenty of individuals (not to mention businesses and organizations) want them. But a recent open peer review experiment at the scientific journal Nature has failed, and some people are now second-guessing the model.
Nature was surprisingly open about how poorly the trial had gone. “The intention was to explore the interest of researchers in a particular model of open peer review, whether as authors or as reviewers,” a Nature article stated. “It was also intended to provide Nature’s editors and publishers with a test of the practicalities of a potential extension to the traditional procedures of peer review.”
It turns out, however, that it might not be practical at all. Nature "sent out a total of 1,369 papers for review during the trial period," but only "the authors of 71 (or 5%) of these agreed to their papers being displayed for open comment. Of the displayed papers, 33 received no comments, while 38 (54%) received a total of 92 technical comments." Of those comments, 49 went to just 8 papers.
A Techdirt reader raised an objection that seemed valid: “It’s because NO ONE KNEW ABOUT IT!!” “Posterlogo” then wrote that “this had nothing to do with the scientific community failing to embrace open access peer review. It had EVERYTHING to do with not getting the word out well enough.”
The Nature article addressed that issue, though, saying, “The trial received a healthy volume of online traffic: an average of 5,600 html page views per week and about the same for RSS feeds. However, this reader interest did not convert into significant numbers of comments.”
The failed experiment at Nature does not mean that wikis (or the open peer review model) are "bad," of course, but it is a clear sign that this trend may not deserve the bandwagon it has attracted.
—
Doug is a staff writer for Murdok. Visit Murdok for the latest eBusiness news.