Collaborative filtering, no peer review
A recent experiment by Nature magazine in opening its peer review process to all interested parties has turned out to be somewhat of a dud. Though not conclusive, the results might be useful when thinking about other forms of open source development as well.
Like most respected scientific journals, Nature has long used the peer review process to evaluate the quality of the scientific papers submitted for publication. In this process, a select number of anonymous scientists review the papers. The reviewers, who work in or close to the field the papers cover, check the accuracy and relevance of the work. Papers that meet their approval get published; those that don't get sent back for revision.
Last June, Nature tried a new technique for review. It posted a select number of papers for public comment, in addition to sending them through the usual review process. In these cases, anyone could comment on the papers. The journal's editors wanted to see what kind of feedback the material would get. Would the public at large (or at least other trained professionals) offer insights the traditionally closed review process would not supply?
Turns out, probably not. Of the 71 papers posted on the Web, only 38 received comments. And the vast majority of comments (49 out of a total of 92) were for only eight papers. Moreover, the comments were of little value. The journal concluded that '[D]espite enthusiasm for the concept, open peer review was not widely popular, either among authors or by scientists invited to comment.'
The results, while admittedly informal, go against the popular notion that an organization can use the worldwide reach of the Web to attract extra muscle. While the Internet certainly lowers the physical barrier to participation, other factors can still impede collaboration, including the level of specialization required, organizational boundaries and plain old time constraints. Most trained personnel 'are too busy, and lack sufficient career incentive, to post public, critical assessments of their peers' work,' the Nature editors concluded in an editorial on the findings.
A cynical observer could extend this lesson to other highly touted methods of getting the public involved in your pet project. Harnessing outside help certainly is a core element in open source software, collaborative editing and other Web 2.0 efforts.
Come to think of it, proportionally speaking, the large portion of Nature papers that went uncommented on certainly resembles the large number of stagnant open source projects on SourceForge, the most popular repository for open source software. While a handful of projects, such as the Gaim instant messaging client and the most excellent FileZilla FTP client, get plenty of developer love, many if not most of the other 137,000 projects on the site age with neglect.
If nothing else, Nature's test reminds those embarking on public collaborations that they should have a pretty clear idea of who their participants would be and how likely these people are to actually contribute.
Posted by Joab Jackson on Dec 22, 2006 at 9:39 AM