Posts Tagged ‘peer review’

The Google Journal

January 29, 2010

Google wants to make the world’s knowledge readily available to everyone. They are trying to put all books that have ever been published into a freely available, searchable digital library. Google Earth is trying to capture as many details of our physical surroundings as possible. And so on.

But what about new knowledge: new discoveries by scientists and new theories created by academics in the world’s universities? Traditionally these are published in academic journals which then sit in university libraries for other academics to read, criticize or build on, and may eventually find their way into textbooks, courses and popular books, and so to a wider audience.

The academic journal system is far from ideal in many ways. The journals are usually expensive to consult for those outside the academic system, which means in practice they are only easily available to those working in universities. (It is true that there are a growing number of open access journals which are open to all, but these are still the exception rather than the rule.) There is typically a longish delay between submitting an article and getting it published, particularly if, as is often the case, the author has to submit to a number of journals one after the other – simultaneous submissions not being allowed. The system for trying to ensure that only good papers are published – so-called peer review – is fraught with problems: unreliability, the possibility that an author’s pet theories might be pinched by anonymous peer reviewers, and the hostility to new ideas that is likely when editors and reviewers are steeped in the prejudices of their discipline. And the multiplicity of different journals, each with its own format and house style, makes life complicated for both authors and readers. And mistakes are made – things are published which probably shouldn’t be, and vice versa.

This scenario is almost begging for Google to move in. It wouldn’t, of course, need to be Google. But it needs to be an organization that can think big and bold, and I will use the name Google for such an organization.

What could Google do? They could simply create the Google Journal for publishing academic and scientific papers. Authors could then submit their papers, and after a few checks to ensure that the papers are in an acceptable format, they would be published in the GJ. This is a similar system to that used in various repositories of papers such as arXiv at Cornell University (http://arxiv.org/) and the Social Science Research Network (http://www.ssrn.com/). However, these are all subject specific, and typically require authors to classify their offering – as physics, say, or philosophy or paleontology.

But what about peer review and quality control? How can readers be sure that what they read has been vetted by the gatekeepers of academia?

Google could encourage third parties to take this on. Take the Bungee Jumping Science Association. At the moment they publish the respected Annals of Bungee Jumping, but this is expensive. Furthermore, libraries are increasingly unwilling to pay the subscription, and prospective authors and readers often stray to rival publications (Suicide Studies and The Rubber Review). All the BJSA would have to do is transfer the allegiance of writers and readers to the Google Journal, and set up their own Journal Quality Certification Scheme. Authors would submit their papers to the GJ, and then apply to have them certified. If, as usually happens, the BJSA insists on revisions to the original article before awarding their stamp, the certified paper would be a new entry in the GJ – linked to the original so that interested readers can see the impact of the BJSA certification process on the original paper. Readers would then go to the GJ and search for papers with the BJSA quality stamp.

But why bother? Isn’t the result just like before? Well … yes and no. If the BJSA applies the same peer review procedures as before, and the revised versions of the papers incorporating improvements suggested by their referees are put in the GJ, then things could work as before.

Except that there would be advantages. Big advantages. The GJ would be open access, and the BJSA would only bear the costs of the reviewing and certification process – which they could cover by charging readers for the certification list, or authors for the privilege of being certified, or from their membership fees. From the perspective of the BJSA it would be cheaper and easier.

From the perspective of readers it would mean that everything – the BJSA papers and the papers certified by Suicide Studies and The Rubber Review – would all be in one place and one format, making it easier to find the latest and best work. This is exactly the principle that makes the web such a useful source of information about anything.

Authors would also like the new system. No more searching to find the best journal that is likely to accept their work. And no more reformatting it when the first journal turns it down and they try another. They would simply put it in the Google Journal and apply to a number of quality certifiers. Multiple stamps of approval would be no problem, although how the system might evolve is difficult to predict.

One possibility is that the peer review approval stamp might be a little more specific than current peer review practices. If an article is in a peer reviewed journal we don’t know if it’s been checked for the quality of the writing, the accuracy of the citations to other work, the correctness of the statistical analysis, or simply consistency with the editor’s prejudices. With the new system, there might, for example, be a statistical stamp of approval, and readers would be able to see if research on climate change or the MMR vaccine lacked such statistical certification. On the other hand, articles exploring possibilities, as yet unproven, might be subject to different, perhaps less stringent, criteria.

From the perspective of encouraging the growth and dissemination of knowledge there would also be advantages. The new system should be quicker, and opening up the possibility of multiple quality certifiers should help to encourage new ideas and enhance the diversity of the offerings available. In the language of economics, costs to producers and consumers should be reduced, and the market for ideas would become more flexible, competitive and efficient. Competition between different definitions of “good” research would be easier than under the present system. One possibility might be a certificate based on review by non-peers, or outsiders from other disciplines, as a way of countering the introspective, and often increasingly bizarre, evaluation criteria used by some academic disciplines.