Editorial - 27 January 2010


 



Bad Ideas Whose Time is Nye

 


 

 

Colin Macilwain's column "Wild goose chase" in the January 20 issue of Nature opens with the observation that "quantitative research assessment is a bad idea whose time has come," and suggests that "The Research Excellence Framework could simply pit discipline against discipline in a race to tell the tallest tales".

 

Like many before him, he points out that "Successive Research Assessment Exercises (RAEs) at British universities have been widely lauded, at home and abroad, for helping to raise the quality of research". The first took place in 1986, followed by exercises in 1989, 1992, 1996, 2001 and 2008, with the results of the most recent published in December 2008. However, what has not been assessed is the relative effect on research excellence of successive Research Assessment Exercises compared with that of the increased resources put into the university research sector over the same period. When one considers that the last RAE is estimated to have cost the UK £60 million (A$108 million), and that federal research funding in the United States relies on no such additional layer beyond peer review of grant proposals, something more than the assumption that the RAE was the dominant driving force for improved research is called for. So far that need has not been acknowledged.

 

 

[Figure omitted. Credit: C. Macilwain, DOI 10.1016/j.cell.2009.10.041. Note: the Blair Labour government assumed office in May 1997.]

 

In an article published in Cell this past November (DOI: 10.1016/j.cell.2009.10.041), Mr Macilwain pointed out: "A 20 year quantitative assessment of research in British universities has coincided with a renaissance in their international status. But even as other nations seek to emulate the approach, domestic critics are calling into question its dominance over university life." In his view: "The biggest lesson to be drawn from the UK RAE, which has been conducted six times since 1986, is social scientists' equivalent of the uncertainty principle: such exercises influence the behaviour of the observed, often in unforeseen ways. Whatever is measured becomes emphasized, probably at the expense of whatever is not."

 

According to Mr Macilwain, Nigel Thrift, vice-chancellor of the University of Warwick, told a conference on research assessment at the Royal Society in London on October 14, 2009 (http://www.hepi.ac.uk): “What started out as a quite sensible means of differentiation, which undoubtedly had real effects on quality, has become a damaging obsession with, I suspect, very little added value.” Michael Driscoll, vice-chancellor of Middlesex University, one of Britain’s newer universities, went further: “We’ve got a non-competitive system with pre-determined outcomes. It obstructs planning and discourages risk-taking, discourages collaboration between institutions, and drives a wedge between teaching and research.”

 

If these conclusions are credible, it follows that those who determine the methodology of measurement manipulate those being assessed, and that, when it comes to university research, governments wanting quick returns will look to the short term. And whether the concept of impact, i.e. economic impact, is stated explicitly, as it is by the Higher Education Funding Council for England (HEFCE) in its development of the REF, or not, as is the case for Australia's ERA, "such exercises influence the behaviour of the observed, often in unforeseen ways".

 

Further to the matter of impact, Mr Macilwain notes the following contrasting views from academe vs. the bureaucracy:

 

Michael Arthur, vice-chancellor of the University of Leeds, called on HEFCE to show that such assessments are reproducible (in other words, that two panels, faced with identical evidence, would come to similar conclusions) before implementing them. “If we’re going to distribute £400 million a year on the basis of this, it is incumbent on us to show that we can reproduce it.”

 

David Sweeney, head of research at HEFCE and the man chiefly responsible for the REF, says that such a demonstration is unnecessary: “We don’t assess the reproducibility of our research output assessments.”

 

Perhaps not surprisingly, considering the heavy criticism levelled at HEFCE's efforts to find a cheaper alternative to the RAE, considerable finger-pointing has ensued. As Mr Macilwain summarises it, the bureaucrats responsible for the RAE "argue that much of the problem lies with how vice-chancellors and their administrative staffs have relentlessly used the figures generated by the RAE to tighten their grip on their institutions. Whenever an activity is terminated, a contract not renewed, or a department closed, managers tend to cite RAE results as the cause."

 

And he cites Dame Julia Higgins of Imperial College London, who chaired the chemistry panel for the 2008 RAE: “The vice-chancellors want to use it [the RAE] as a management tool; that’s the problem.”

 

Unfortunately, the signs in Australia are that the federal Minister for Innovation, Industry, Science and Research, Senator Kim Carr, is pressing ahead with bulldog ferocity to force his ERA, hubs and spokes, and constrictive compacts on the nation's universities, while the Prime Minister continues to give every sign of remaining uninterested.

 

 

Alex Reisner

The Funneled Web