Peter Hall Reports on a DEST Workshop on Evaluating Governmental Reforms in Support of Research and Research Training
Notes on a Workshop in connection with "evaluation of the reforms for research and research training introduced by 'Knowledge and Innovation: A Policy Statement on Research and Research Training,'" held at the ANU on 27 August.
The
Department of Education, Science and Training chose Chris Fell, President of the
Federation of Australian Scientific and Technological Societies (FASTS), to chair
its Workshops, which are being held in Canberra and the state capitals through
2 September. Professor Fell is being assisted by representatives
from the Australian Research Council (ARC) and the Department of Education,
Science & Training (DEST). The conveners began with a brief survey of the
government's current procedures for funding, and evaluating, research in
Australian universities. Professor Fell stated that the government has "no
hidden agenda" for the process, which is designed to inform the Minister for
Education, Science & Training, Brendan Nelson, in time to prepare proposals for
the 2004 budget round.
The government is indeed genuinely interested in acquiring information, although
the academic community (and the research community more widely) is becoming
disenchanted, not to say jaded, by pressures to respond to repeated requests to
make submissions to government enquiries about funding various aspects of higher
education. The community does not see its lot improving significantly as a
result. However, researchers are characteristically optimistic, so we press on.
DEST block funds for research are distributed through research training
programs, in particular the Research Training Scheme (RTS), direct scholarships
(for Australian and international students) and programs for more general
research purposes, in particular the Institutional Grants Scheme (IGS) and
Research Infrastructure Block Grants (RIBG). By and large, project-based
funding is provided by the ARC, the National Health and Medical Research
Council (NHMRC), and the Cooperative Research Centre (CRC) scheme.
Reflecting the fact that the theme on which the government is seeking proposals
is subtitled Reviewing Australia's Block Research Funding Schemes, the
Workshop addressed primarily RTS, IGS and RIBG programs. However, discussion
touched on the other aspects of funding. A great deal of attention was given to
ways in which research performance is measured; of course, this plays a dominant
role in determining the amount of money that any given university receives
through the RTS, IGS and RIBG programs. Since success in competitive grant
schemes, as well as in attracting non-federal government money for research, is
a major determiner of success in the RTS, IGS and RIBG programs, all these
aspects of research funding are closely linked.
The introductory part of the Workshop is probably common to its incarnations in
the five other Australian capital cities, where it is being held between 20
August and 3 September. My subsequent account of discussion will focus on the
discussion at the Canberra Workshop, which was overwhelmingly attended by
Australian National University (ANU) staff, with some representation from
the Australian Defence Force Academy (ADFA), the University of Canberra and
Charles Sturt University, and some onlookers from the ARC and DEST. One could
not help but think that some of the latter souls, like many of the university
people among whom they sat, were attending in order to try to fathom the logic
of DEST's funding formulae.
Substantial concern was expressed at the manner in which research performance
was currently measured, and at the way in which this altered the behaviour of
university managers, often towards ends that were diametrically opposed to the
aims of the government's programs. For example, formulae for rewarding
universities in the RTS scheme encouraged institutions to retain their Honours
students ("as long as they still had a pulse," to quote one participant) rather
than permit them to move to other institutions, even when such a move was
clearly in the students' interests. DEST's four-year completion deadline for PhD
students came in for significant criticism. One of the ANU's managers argued
that completion times of about five years, for PhDs in experimental sciences,
were "about average" in international terms.
The use of numbers of publications, one of DEST's indices of "performance" which
worries many university researchers, came in for its due share of criticism.
More generally, it was argued by participants that crude count data (numbers of
dollars earned through competitive grants, numbers of PhD student completions or
numbers of papers published -- all are extensively used by DEST) are so
inadequate as measures of research performance that they do at least as much
damage as good. The dangers of using citation count data, in place of the
existing measures, were also noted by one participant.
Some of the consequences of these naive approaches to assessing performance were
expounded upon. They included directing university research away from "public
good" activities, towards work which earns quick dollar incomes; and turning
researchers' attention from important long-term projects, towards those which
fit neatly into three-year ARC Discovery grants. (Some of the critical problems
which face Australia, for example related to land and water usage, are not
adequately addressed for this reason.) The use of commercial and state funding
to lever federal funding, through the federal government's block grant schemes,
which reward most forms of external earnings, was lamented for skewing research programs
in ways that were markedly detrimental to the nation's interest.
The fact that the formulae which drive the government's block research funding
schemes are especially detrimental to some research fields, for example the
mathematical sciences, was stressed. In the past, some university managers have
chosen not to notice these "little details", and (for example) have attempted to
force mathematical scientists to achieve the levels of research funding that are
usually associated with experimental scientists who purchase large, expensive
items of equipment.
This widely perceived inadequacy of the federal government's way of apportioning
its block-grant cake for research led inexorably to discussion of the "British
model." Rightly or wrongly, the UK (and now Austria, too) have appropriated
Australia's scheme of student loans for university fees, and in return we may
adopt a version of the British Research Assessment Exercise (RAE). No other
proposal for salvaging Australia's block grant programs was given much attention
at the Canberra workshop, except for a brief flirtation with the possibility of
doing away with it altogether (and putting the funding entirely up for
competitive bidding -- let's hear no more of that).
The RAE is a department-based, nationwide assessment of performance by
specialist panels of experts in discipline areas. By and large it is based on
assessing actual research outputs; one paper per year, for the period since the
last review, is nominated by each researcher who is up for consideration. This
distinguishes the RAE from the Australian approach, which relies heavily on
research inputs (such as numbers of dollars earned from competitive grants).
However, research inputs do tend to feature in the RAE, at least in recent
years; indeed, since the assessment is made by human beings, who often know the
work of the people they are assessing, a great many things are unavoidably
addressed which do not contribute to the formulaic approach currently adopted in
Australia.
The discussion of the RAE approach, during the Workshop in Canberra, focused on
its potential advantages as well as on its drawbacks, of which perhaps the main
one is cost. For example, in a relatively small research community, such as our
own, getting the RAE program to work might require significant involvement of
overseas experts, which would add to the expense. However, it was argued that
one would not have to operate the RAE continually into the future. The British
experience is that it can now be wound back somewhat, after a little more than a
decade of operation, having largely achieved the goals which were originally
set out for it. These include ensuring a strong public perception that
university research has risen to a higher level, and is being maintained there.
The importance of public perceptions of university research performance was
raised by several participants in the Canberra Workshop.
It was also noted that one needs to spend money in order to improve the
Australian research enterprise. I was encouraged to find that the Workshop's
conveners did not turn deaf ears to this postulate.
More generally, the Workshop was a successful event, enabling researchers to air
many of their grievances about the current block-funding system, in the presence
of government representatives who genuinely wanted to learn. However, I'd urge
the research community to make concrete submissions to the enquiry; don't leave
this crucial matter in the hands of those few of us who attend 150-minute
Workshops. You can find out more about the review process, and how to make a
contribution, on the website:
http://www.dest.gov.au//highered/ki_reforms/default.htm
Peter Hall, Canberra, 27 August 2003