News & Views item - October 2011

 

 

Like It or Not, Metrics-Based Assessments Are Here to Stay (October 21, 2011)

In a contribution to this week's Nature, Maria Pavlou and Eleftherios P. Diamandis agree that while the use of metrics for evaluating scientific contributions has flaws, such quantitative bibliometric indices are "popular, in some cases increasingly so".

 

Pavlou and Diamandis point out that Eugene Garfield, founder of the Science Citation Index, found in 1992 that Nobel laureates publish five times the number of papers of most researchers, and that their work is cited 30–50 times more often, a finding that "helped to popularize metrics." And of course today we have added journal impact factors, the h-index and other indices derived from it.

 

The authors, while arguing that bibliometrics are useful, note that they can lead to some marvellous anomalies.

 

So, they note, it is not unusual to find technology specialists (often middle authors who make technical contributions) who have more citations than some department chiefs. One of their own technicians, they report, has 1,734 lifetime citations and an h-index of 26 (that is, 26 papers with at least 26 citations each), scores comparable to those of a 50-year-old university professor.
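For readers unfamiliar with the metric, here is a minimal sketch of how an h-index can be computed from per-paper citation counts; the function name and the example numbers are illustrative, not drawn from the article.

```python
def h_index(citations):
    """Return the h-index: the largest h such that there are
    at least h papers with at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    # Walk the descending counts; h is the last rank whose count
    # still meets or exceeds that rank.
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for five papers.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```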

 

They also cite Kary Mullis (Nobel Prize in Chemistry, 1993, for PCR) and Fred Faloona, a supporting staff member with five papers (all with Mullis) and no publications since 1992. Faloona has more than 10,000 lifetime citations from just two of those papers.

 

Pavlou and Diamandis caution that "no single index, formula or description will capture the diverse contributions of scientists to society", but that nevertheless "bibliometric analyses are here to stay. And as long as their shortcomings are taken into account, they can be valuable, allowing observers to draw conclusions about a scientist's productivity, quality of research and impact in science."

 

However, they also quote Albert Einstein: "Many of the things you can count don't count. Many of the things you can't count do count."

 

And when it comes to evaluating research proposals for funding, there really is no substitute for critically evaluating the material put before you and considering: 1) is it worth doing, 2) is the guy proposing it qualified to do it, and 3) does he have the wherewithal to undertake it? Assuming, of course, that you're competent to judge.