Opinion


June 11, 2007

Address to the Group of Eight Universities

Research Quality Framework (RQF) Forum

8 June 2007

Professor Tom Spurling, President FASTS


This contribution is reprinted with the permission of the Federation of Australian Scientific and Technological Societies (FASTS)

I want to take this opportunity to look at impact in the RQF in a broader context.

However impact is defined, there is no doubt that the ‘Polymer Banknote’ project has had an impact, both nationally and internationally.

It commenced in 1968, when H C (Nugget) Coombs asked a group of scientists if they could produce a forgery-proof banknote.

One idea that emerged from those discussions was to have optically variable devices on a polymer substrate.

A project to do this commenced in the CSIRO Division of Applied Chemistry around 1970 and culminated in the release of the commemorative note in 1988.

In 1970 the Division had expertise in natural product and synthetic chemistry and in physical chemistry. It had no polymer research capability and no expertise in optically variable devices, so we had to build these up.

Dr Bob Lee, who had a PhD in nuclear physics from the University of Melbourne but was driving trams at the time, was employed to work on the optically variable devices, or OVDs. The Captain Cook diffraction grating in the 1988 note resulted from his work, but no OVD was included in the subsequent notes.

Much of his work was published in the open literature and has been widely cited, and his OVDs have found other applications.

Bob’s work was of high quality; it had to be done and it was included in the 1988 note, but it had little impact on the final notes.

The Division employed a number of chemists from 1976 who did, and are still doing, high-quality, highly cited polymer research. None of this research is included in the polymer banknotes. The polymer research relevant to the ‘bank’ project was never published.

In a study of the successor Division published in 2004 by Gläser, Spurling and Butler [1], we found that the work that had commenced in 1976 was of very high quality (as measured by citations) and also had high impact through its adoption by DuPont, Ciba Vision and Aortech Biomaterials. This impact started to appear in the early 1990s, more than 15 years after the work had commenced.

This story illustrates two points: first, that research quality, as measured by publications and citations, is a poor guide to impact, since the polymer research that mattered to the banknote was never published while highly cited work had little effect on the final notes; and second, that impact can take more than 15 years to appear, often in applications far from the original project.

In thinking about the RQF, it is a good idea to also think about the sort of characteristics we might want to see in the total fabric of Australian research. That will then enable us to ask whether the RQF supports or undermines desirable system-wide characteristics.


The recent Productivity Commission report on public support for Science and Innovation made a good case that:

In setting out a more complex picture of the range and scope of public research, the Commission explicitly incorporated preparedness, an enhanced capacity for dealing with risks in an uncertain world, into its working definition of innovation.


Preparedness goes to the value of risk minimisation, developing options and, critically, capacity building. Indeed, I suspect that most people would regard R&D that minimised risks in climate change, energy futures, water management and public health as precisely the sort of outcome they expect from taxpayer-funded research.


The Commission also made it clear that all future evaluation of public support for science and innovation “should embrace social and environmental outcomes, reduction of risk, preparedness to meet uncertainty, and the maintenance of strategic capability and infrastructure”.


The idea of ‘Preparedness R&D’ was central to FASTS submissions throughout the review, so we obviously welcome this part of the Commission’s findings.

However, preparedness involves some complex and somewhat intangible capabilities. If it is to be taken seriously, then some of the characteristics that we need to see in our institutions and research programs are:

And there is a fair bit of work to do with all of these.

The pertinent question is: will the RQF undermine, enhance or be neutral with respect to preparedness?

Until we see a resource allocation model, it is difficult to do anything but speculate about perverse incentives, including the consequences of gaming the system. However, if and when the RQF is operationalised, we will need to look really closely at whether it undermines ‘preparedness’ by producing:

If we are going to assess impact, then I think we have to have some knowledge about demand.

My experience is that projects that are defined by a user, say the Reserve Bank, may have an easily identified impact but seldom fit into the disciplinary groupings of universities, although their success may be critically dependent on discipline-based fundamental knowledge.

Here I think the politics and policy intent of the RQF are strange.

We could describe the RQF as an attempt by Government to impose “supply-side justification”. Government is sceptical about bang for buck and insists that universities and some public sector agencies demonstrate greater focus on achieving outcomes.

That is fair enough, but we really need to start thinking a great deal more seriously about the demand-side of the issues the RQF purports to address.

As the Productivity Commission stated in the first line of its report: “Innovation is critical to Australia’s growth and its preparedness for emerging economic, social and environmental challenges”.

There is an extensive literature examining innovation in terms of economic growth.

However, there is less consideration of the processes and constraints on public sector innovation, in terms of the capacity for Government Departments to use science and R&D to identify and respond to the very social and environmental risks and challenges addressed by preparedness.

Rather than just a supply-side focus on how science and research feed into public policy, it’s time we started to examine constraints and challenges on the demand side.

That is:

So in conclusion, I think it is important that the RQF be understood as contributing to a broader picture.

Specifically, I think we need to ensure that the RQF does not undermine the high-level ‘goods’ we expect of public sector research, and it is about time we really started to focus on demand-side capability.


[1] Jochen Gläser, Thomas H Spurling and Linda Butler, ‘Intraorganisational evaluation: are there “least evaluable units”?’, Research Evaluation, volume 13, number 1, April 2004, pages 19–32.