http://www.fraserinstitute.org/research-news/news/display.aspx?id=11513
"Empirical research in what are commonly called ‘peer-reviewed’ academic journals is often used as the basis for public policy decisions, in part because people think that ‘peer-review’ involves checking the accuracy of the research. That might have been the case in the distant past, but times have long since changed. Academic journals rarely, if ever, check data and calculations for accuracy during the review process, nor do they claim to. Journal editors only claim that in selecting a paper for publication they think it merits examination by the research community.
But the other dirty secret of academic research is that the data and computational methods are so seldom disclosed that independent examination and replication have become nearly impossible for most published research."
Last fall, Bayer, Amgen, and a venture capitalist all blew the whistle on the sorry state of academic research. The vast majority of published studies, even in the 'best' journals, are flawed and cannot be replicated. See, e.g., http://www.reuters.com/article/2012/03/28/us-science-cancer-idUSBRE82R12P20120328 and http://online.wsj.com/article/SB10001424052970203764804577059841672541590.html and
http://lifescivc.com/2011/03/academic-bias-biotech-failures/
Science has published a policy paper arguing that it is finally time for academic journals to require that data and code be made available for any study they publish. http://thegwpf.org/science-news/5474-at-last-the-right-lesson-from-climategate-fiasco-.html
It is 2012, a long time since the day in 2004 when prominent climate scientist Phil Jones said to Warwick Hughes, "Why should I make the data available to you, when your aim is to try and find something wrong with it?"
Given all the stonewalling in climate science, this would be a breath of fresh air. 'Science' without transparency and replication is no more scientific than voodoo.