News & Views item - November 2007

 

Bibliometrics and the RAE Are Again Making News in the UK. (November 10, 2007)

It's patently obvious that Geoffrey Alderman, the Michael Gross Professor of Politics & Contemporary History at the University of Buckingham, is thoroughly fed up with the UK's Research Assessment Exercise.

 

He writes in the November 6, 2007 Guardian: "I was once a supporter of the research assessment exercise, but in the long-term it has created an obsession with performance, resulting in many universities no longer appointing faculty on teaching ability...[but] there is a sane alternative, and that is to abolish the RAE altogether."

 

The RAE was forced on the sector by the Thatcher government in the mid-1980s. I supported its introduction, and I shall never apologise for so doing. The culture in most taxpayer-funded higher education institutions (HEIs) at that time militated against the academic researcher. Faculty were appointed as often as not without regard, or much regard, for their research profile - and with no regard, incidentally, for their potential as teachers. Promotion was by Buggins's turn. As a young academic it was made explicit to me that irrespective of my publications, and of their favourable reception by my peers, I would have to wait for my promotion until those who had been with my department longer had had theirs.

 

But [now] the pendulum has swung far too far in the opposite direction. Performance in the RAE has become an institutionalised obsession. When writing references for my former students who apply for faculty positions, I am now asked to speculate on 'their likely contribution' to the RAE outcome.

 

Bibliometrics, or citation-analysis, [as an alternative to peer review] is the quick, unthinking fix that is supposed to cut this cost. It may well be that in the STEM subjects (science, technology, engineering and mathematics) the number of times one article is cited in others is a true measure of its academic worth - though even here my scientific colleagues have their doubts. In the humanities and social sciences, citation analysis is scholastic madness.

 

As for the RAE, it has run its course. Reputable vice-chancellors should boycott what has become a corrosive influence in the lives of our universities. I for my part will have nothing further to do with it.

 

From the Australian viewpoint, does our university sector currently really suffer from the "Buggins's turn" syndrome that Professor Alderman says was prevalent at the beginning of the Thatcher era in the UK, and does it therefore require RQF therapy?

 

Now, less than a week after Professor Alderman's opinion piece, Universities UK (UUK) has published a forty-page report, The use of bibliometrics to measure research quality in UK higher education institutions, which argues that the use of bibliometrics could end up skewing the data used to judge research quality.

 

The report analyses bibliometrics in both STEM and non-STEM subjects, and the differences in citation behaviour among the different disciplines.
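
Those differences in citation behaviour are the reason raw citation counts cannot simply be compared across fields. As a purely illustrative sketch - the records and values below are invented for the example, not drawn from the UUK report - field normalisation divides each paper's citation count by the mean count for its own discipline, so that a history monograph and a molecular-biology article are each judged against their own baseline.

```python
from collections import defaultdict

# Hypothetical records: (paper_id, discipline, citation_count).
# Values are invented for illustration only.
papers = [
    ("p1", "molecular biology", 42),
    ("p2", "molecular biology", 18),
    ("p3", "history", 3),
    ("p4", "history", 1),
]

# Mean citation count per discipline (the field "baseline").
by_field = defaultdict(list)
for _, field, cites in papers:
    by_field[field].append(cites)
field_mean = {field: sum(v) / len(v) for field, v in by_field.items()}

# Field-normalised impact: raw citations divided by the field mean.
# A value of 1.0 means "average for its own discipline".
for pid, field, cites in papers:
    print(pid, field, round(cites / field_mean[field], 2))
```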

 

The UUK report agrees that bibliometrics are, as Anthea Lipsett wrote in yesterday's Guardian, "probably the most useful of a number of variables that could feasibly be used to measure research performance, and there is evidence that bibliometric indices do correlate with other, quasi-independent measures of research quality - such as RAE grades - across a range of fields in science and engineering". Nevertheless, "there are strong arguments against the use of output volume, citation volume, journal impact and frequency of uncited papers, found the report, conducted by research analysts Evidence."

 

The report finds that metrics do not take into account contextual information about individuals, which may be relevant, and that they do not always capture research drawn from across a number of disciplines.

"Bibliometric indicators will need to be linked to other metrics on research funding and on research postgraduate training."

 

And it states what should be the "bleedin' obvious": "...there are data limitations where (often younger) researchers' outputs are not comprehensively catalogued in bibliometrics databases."

 

Ms Lipsett concludes:

 

Eric Thomas, chairman of UUK's research policy committee and vice-chancellor of the University of Bristol, said: "It is widely anticipated that bibliometrics will be central to the new system, but we need to ensure it is technically correct and able to inspire confidence among the research community.

"This report doesn't set out a preferred approach, but does identify a number of issues for consideration. It's important that the sector fully engages with the changes to the research assessment process and we hope this report provides those involved with a basis for discussion."

The president of UUK, Rick Trainor, added: "There are a great number of factors to be taken into consideration when developing the new research assessment framework, so it's essential that all those involved get it right."

 

On scrutinising the report, it becomes clear that UUK offers no clear path to success for the assessment exercise that is to follow the RAE being conducted in 2008.

 

For example, it cautions:

 

"A number of potentially emergent behavioural effects will need to be addressed, although experience suggests both that many behavioural responses cannot be anticipated and that some of these responses could jeopardise the validity of the metrics themselves in the medium term," and then, "There are no simple or unique answers. It is acknowledged that Thomson databases necessarily represent only a proportion of the global literature. This means that they account for only part of the citations to and from the catalogued research articles, and coverage is better in science than in engineering. The problems of obtaining accurate citation counts may be increasing as internet publication diversifies. There are also technical issues concerning fractional citation assignment to multiple authors, relative value of citations from different sources and the significance of self-citation."
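
For readers unfamiliar with that jargon, the sketch below shows what "fractional citation assignment" and the handling of self-citation can amount to in practice: each citation is split equally among the cited paper's authors, and citations from a paper that shares an author with the cited paper are discarded. The record layout and the policy choices are assumptions made for this example; the report itself does not prescribe any implementation.

```python
from collections import defaultdict

# Hypothetical records: each paper lists its authors and the papers it cites.
papers = {
    "A": {"authors": {"smith", "jones"}, "cites": []},
    "B": {"authors": {"lee"},            "cites": ["A"]},
    "C": {"authors": {"smith"},          "cites": ["A"]},  # shares an author with A
}

credit = defaultdict(float)  # fractional citation credit per author

for citing in papers.values():
    for cited_id in citing["cites"]:
        cited = papers[cited_id]
        # One possible self-citation policy: drop citations where the
        # citing and cited papers share at least one author.
        if citing["authors"] & cited["authors"]:
            continue
        # Fractional assignment: split the single citation equally
        # across all authors of the cited paper.
        share = 1.0 / len(cited["authors"])
        for author in cited["authors"]:
            credit[author] += share

# B's citation of A is split between smith and jones (0.5 each);
# C's citation of A is excluded as a self-citation.
print(dict(credit))
```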

 

And then there is this extraordinary conclusion in the report's summary:

 

...is the assessment to be of individuals and their research activity or is it of units and of the research activity of individuals working in them? How will this affect data gathering? ...It is unlikely that bibliometrics will exacerbate existing deficiencies in this regard, except insofar as research managers perceive a sharper degree of differentiation, but metrics have an inability to respond to contextual information about individuals.

 

One of the caveats regarding bibliometrics is the timeframe for the analysis.

 

It can be argued that although papers in a short, fixed-time window will not accumulate very high citation counts, they could demonstrate sufficient differentiation to categorise quality. We have shown (figure below) that early citation rates are a good predictor of longer-term performance. But, while this may be valid for large samples, it is not readily acceptable for individuals. [Our emphasis]
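
The distinction the report is drawing - an early citation window tracks longer-term performance across a large sample, but not reliably for any one paper - can be made concrete with a small calculation. In the sketch below the per-paper counts are invented for illustration (they are not the data behind the report's figure): the sample-level Pearson correlation between a short window and a long horizon can be respectable even while the per-paper ratios vary so widely that the early count says little about any individual paper.

```python
# Hypothetical (citations in first two years, citations after ten years) per paper.
papers = [(5, 40), (2, 9), (8, 70), (1, 30), (6, 45), (0, 2), (3, 12), (4, 50)]

early = [e for e, _ in papers]
late = [l for _, l in papers]
n = len(papers)

# Sample-level view: Pearson correlation between early and long-term counts.
mean_e, mean_l = sum(early) / n, sum(late) / n
cov = sum((e - mean_e) * (l - mean_l) for e, l in papers) / n
sd_e = (sum((e - mean_e) ** 2 for e in early) / n) ** 0.5
sd_l = (sum((l - mean_l) ** 2 for l in late) / n) ** 0.5
print(f"sample-level correlation: {cov / (sd_e * sd_l):.2f}")

# Paper-level view: the late/early ratio varies widely from paper to paper,
# which is why the same indicator is far less trustworthy for an individual.
ratios = [round(l / e, 1) for e, l in papers if e > 0]
print("per-paper late/early ratios:", ratios)
```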

 

If Australia's Coalition retains power and insists on its version of the RQF, the overall wastage will mount into the hundreds of millions; if Labor gets in and institutes its ill-thought-out concept of metrics, well, it'll be a different disaster, but it'll probably cost less in immediate dollars.