Who's afraid of open peer review?
Celebrating Open Access Week 2014 affords an opportunity to study and promote all aspects of ‘Open.’
We discuss the views and experiences of our Editorial Board Members regarding open peer review at this biomedical journal.
Pros and cons of an alternative to today’s method of allocating research funds by peer review.
Welcome efforts are being made to recognize academics who give up their time to peer review.
The recent retraction of two stem-cell research papers by the journal Nature highlights weaknesses in this self-regulatory framework that scientists need to address.
The release of the 2014 Impact Factor Report is awaited, as usual, with some anticipation. Yet it comes at a time of an ever-rising tide of contestation about its value in a radically changing research environment, especially in the developing world.
Commenters on post-publication peer review sites such as PubPeer are catching errors that traditional peer reviewers have missed.
The changing nature of research evaluation in UK higher education is creating perverse and damaging consequences that reinforce an excessively narrow definition of what counts as "high-quality" research.
Anonymity of authors as well as reviewers could level the playing field for women and minorities in science.
The rate of retractions of scientific papers has been growing over the past decade, suggestive to some of a crisis of confidence in science. Can we no longer trust the scientific literature?
Nature, the pre-eminent journal for reporting scientific research, has had to retract two papers it published in January after mistakes were spotted in the figures, some of the methods descriptions were found to be plagiarised, and early attempts to replicate the work failed.
In an era of large collaborations, multi-authored papers, and enormous datasets, is there still room for the single creative idea that proves to be a gamechanger?
The Winnower is another open-access online science publishing platform that employs open post-publication peer review, aiming to revolutionize science by breaking down barriers to scientific communication through cost-effective, transparent publishing for scientists.
Scientists make much of the fact that their work is scrutinised anonymously by some of their peers before it is published. This "peer review" is supposed to spot mistakes and thus keep the whole process honest.
Most academic papers today are published only after some academic peers have had a chance to review the merits and limitations of the work. This seems like a good idea, but there is a growing movement that wants to respond to such a review process as Albert Einstein once did.
Academics have internalised research assessment to such a degree that the effects may be irreversible.
Scientists are asked to comment on static, final, published versions of papers, with virtually no potential to improve the articles. This is the state of post-publication peer review today.
Comment on a recent Nature blog entry by Richard Van Noorden
Felipe Fernández-Armesto bristles at the stifling effect of peer review.
Young scientists have grown up commenting on their friends’ pictures, silly posts on Facebook, and their favorite YouTube videos. Will this practice carry over into their scientific publishing?
Science is now able to self-correct instantly. Post-publication peer review is here to stay.
A platform comparing research journals’ performance, aiming to make the peer review process more efficient.
Scientific publishing is under the spotlight at the moment. Is it time for change?
Peer review, many boffins argue, channelling Churchill, is the worst way to ensure quality of research, except all the others. The system, which relies on papers being vetted by anonymous experts prior to publication, has underpinned scientific literature for decades.
Controversial model points to benefits of more opinionated reviews.
Abstract: A semi-supervised model of peer review is introduced that is intended to overcome the bias and incompleteness of traditional peer review. Traditional approaches are subject to human bias, while consensus decision-making is constrained by sparse information. Here, the architecture for one potential improvement (a semi-supervised, human-assisted classifier) to the traditional approach will be introduced and evaluated.
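To make the abstract's idea of a semi-supervised, human-assisted classifier more concrete, here is a minimal illustrative sketch, not the paper's implementation: it assumes scikit-learn is available, and the manuscript features, labels, confidence thresholds, and routing rule are all hypothetical.

```python
# Illustrative sketch of a semi-supervised, human-assisted screening classifier.
# Feature names, thresholds, and data below are hypothetical, for demonstration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

# Hypothetical per-manuscript features, e.g. [reference count, figure count,
# readability score]; labels: 1 = accepted by human reviewers, 0 = rejected,
# -1 = not yet reviewed (unlabeled), following scikit-learn's convention.
X = np.array([
    [42, 6, 0.71], [15, 2, 0.40], [55, 8, 0.80], [20, 3, 0.35],  # human-labeled
    [38, 5, 0.66], [18, 1, 0.30],                                 # unlabeled
])
y = np.array([1, 0, 1, 0, -1, -1])

# Self-training: the classifier learns from the human-labeled manuscripts and
# propagates its confident predictions to the unlabeled ones.
model = SelfTrainingClassifier(LogisticRegression(), threshold=0.7)
model.fit(X, y)

# "Human-assisted": low-confidence cases are routed back to human reviewers.
for probs in model.predict_proba(X[4:]):
    decision = "auto-screen" if probs.max() >= 0.9 else "send to human reviewer"
    print(probs, decision)
```

The point of the sketch is the division of labour: the model handles the cases it can classify with high confidence, while everything below the (assumed) confidence cut-off is deferred to human peer reviewers.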
Report quality is significantly higher under the open peer review model for questions relating to comments on methods and study design, the evidence supplied to substantiate comments, and constructiveness.
Research repository launches comment platform for post-publication peer review.
Peer review is one of the oldest and most respected instruments of quality control in science and research. It means that a paper is evaluated by a number of experts on the topic of the article (the peers). The criteria may vary, but most of the time they include methodological and technical soundness, scientific relevance, and presentation.