News & Views item - February 2005


Development of Australia's Research Quality Framework Gets Some Media Attention. (February 14, 2005)

    On December 21st last year TFW reported that, when Backing Australia's Ability II was announced, the Coalition Government committed itself to developing a framework for assessing the quality of publicly funded research. According to a media release by the Department of Education, Science and Training, "The Government is committed to ensuring that ...resources [it puts into research] are directed to areas of research excellence and public benefit."


The government went on to appoint an Expert Advisory Group of thirteen, chaired by Sir Gareth Roberts, President of Wolfson College, Oxford, who led the committee responsible for reporting to the UK's Blair government on SET for success: The supply of people with science, technology, engineering and mathematics skills. The Minister for Education, Science and Training, Brendan Nelson, has also promised to hold "wide-ranging consultations, release an issues paper and hold a major stakeholder forum."


Today the Australian Financial Review features two articles: "Research review heralds a whole new ball game" by Peter Roberts, and "Try minimum 'fiddle' option" by Don Aitkin, former vice-chancellor of the University of Canberra, member of the Australian Research Grants Committee (1981-1985), its chairman (1986-1987), and foundation chair of the Australian Research Council (1988-1990).


Peter Roberts says, "The way research is performed in Australia's 37 universities is likely to be fundamentally changed by a year-long review launched recently by the federal government," and suggests that unless great care is taken over which changes are instituted, and how, significant and undesirable distortions of Australian research could follow. As an example of unintended consequences Roberts cites the partial linking of Research Training and Institutional Grants funding to numbers of peer-reviewed publications.


In 2001 the Australian National University's Linda Butler published a time-trend study of Australian research publications, and in a submission for the Australian Academy of Science to the Department of Education, Science and Training's "Crossroads" review, Michael Barber, then Secretary, Science Policy of the AAS, referred to Butler's analysis:

Last October the Academy published a report which showed that Australia's share of scientific publications had increased markedly over the 1990s but the relative impact of Australia's publications, as measured by citations, had declined and continues to fall behind most other OECD countries. Even more disquieting are the findings reported by Linda Butler in her submission to this review that this increase in university output has occurred disproportionately in journals of lower impact. ...The Academy concurs with Butler.

Roberts now reports: "There was this huge increase in output [of publications], some of which found its way into the top international journals," says Butler. "But a lot more of it was ending up in the 'easier to get into' journals, if I can call them that."


It appears that as a starting point the Expert Advisory Group will use the UK's Research Assessment Exercise (RAE), which has taken place every four to five years and has been used by the UK government "to enable the higher education funding bodies to distribute public funds for research selectively on the basis of quality." It is considered to have been reasonably successful in achieving its aim but is now in need of revision because of real and perceived distortions that occur. Roberts reports, "UK universities have been accused of distorting their staff recruitment policies to fit in with the RAE's quality criteria. UK universities concentrate their recruitment in the year leading up to the RAE, paying big salaries to poach the most successful researchers in an attempt to improve their institution's rating." It works for football clubs, so why not for universities' research standings?


The Expert Advisory Group is due to report to Dr Nelson by the end of the year. It's not clear whether its report will be made public or how much notice the minister will take of its recommendations.


It is also worth noting that the president of the Australian Industrial Research Group, Roy Rose, told Roberts that any quality-assessment exercise needs to take into account researchers' ability to make linkages with business where appropriate. Certainly there have been clear indications that the government favours tighter linkages; witness the changes in emphasis regarding work by the nation's Cooperative Research Centres.


On the other hand Roberts also reports Australia's chief scientist, Robin Batterham, who is also Rio Tinto's chief technologist, as saying, "The best underpinning of [business] development comes from basic and applied research that is at the cutting edge and is being recognised as world class." Whether that can be achieved by a higher education system surviving on short commons is at best debatable.


Professor Aitkin takes a rather different tack regarding what to do for, and with, Australian research:

Education Minister Brendan Nelson is an industrious worker, and his industry seems focused at the moment on what to do in the area of research. Just before Christmas he set up an expert advisory group, chaired by a Briton but otherwise consisting of what are now called "stakeholders", to look at "quality" in research and how to measure it.

    Presumably the future flow of public money to research is to be affected by the research quality framework that will be the major outcome of the group's work. Presumably, too, some people want the flow changed (this is usually the case). And presumably the minister thinks this whole process is going to improve the public good.

    Nelson is also questioning the extent to which public money supports so-called "pure research" and seems to be suggesting that the balance should be weighted more towards so-called "applied research". Presumably the expert advisory group will have a go at this conundrum, too.

    ...

[R]esearch is almost a code word for the way we in the western world construct knowledge, all of us, not just academics. Consider how we make purchases, organise travel, insurance, investment, education, almost everything. We have a problem, we consider alternative solutions by finding out as much as we can, and we come to a decision after testing possibilities against the evidence. We do it every day. It is what makes the western world successful.    

    The mother church of that outlook on life is the university. Nearly all academics are bound up in the process of research, because it is what makes their working life exciting. My advice is to fiddle with it as little as possible. The mechanisms for allocating money are decently robust. Keep them competitive, and ensure that new players always have an opportunity to get in.

    Forget about "pure" and "applied"; they are just distractors. Just ensure that those who get the money have to explain properly why their research is important, and what happened. The outcomes will be messy, but that's because they are uncertain.

His final paragraph is interesting: it suggests that science isn't really in Professor Aitkin's comfort zone, despite his having been a member of the Australian Research Grants Committee (1981-1985), its chairman (1986-1987), and chair of the ARC for three years. In fact his academic interests centre on history and political science.


And perhaps it's useful to point out once again that within Australia's universities less than 30% of research is basic "blue sky" research, while almost 80% of all basic research in Australia is done by the universities.