Tuesday 1 February 2011

ERA: Damned Lies and Bibliometrics

The Australian has a series of articles on the ERA report that was released yesterday. The titles tell the story: "Most unis below par on research", "Scattergun sector needs some diversity", "Elite eight head uni research ratings", "No link between discovery and teaching", "Uni research report a blow to big-noters", and so on.

The first of these articles is the most useful, because of the chart of ERA results it includes. The second one starts: "The release of the Excellence in Research for Australia report yesterday will give a few university bosses cause for introspection."

Not just university bosses. And this is why. The same article continues:

While talk of teaching-only universities gets silenced as soon as it's raised (the teaching-research nexus is sacrosanct to the idea of a university), there is definitely a rising swell of voices suggesting the ERA results will be used to compel institutions into focusing their research efforts.

That is, if you are research-active and you have the misfortune to be in an institution that has not performed well in your broad area of research, Canberra or your own administrators may decide that it is pointless for you to continue doing research. You should just give up on research and only do teaching.

Of course, your broad area of research may not have performed well in the ERA exercise because the ERA has been so poorly conceived, so poorly adapted to your particular broad area of research, that it has utterly failed to take an accurate measure of the performance of that area.

Julie Hare writes (in Uni research report a blow to big-noters):

… performance in the arts, humanities and social sciences was overall pretty dismal. One reason may be the amount of research funding these areas attract … Another reason may be that because the HASS disciplines were subject to peer review: human judgment can tend to be harsher than the clear, crisp picture thrown up by bibliometrics, publications and the like.

To put this another way: one reason is the amount of research funding these areas are allocated (which is determined by Canberra); another is that no allowance has been made for the fact that the peer-review process is faulty/biased/poorly conceived/poorly executed.

As for "the clear, crisp picture thrown up by bibliometrics"—hardly. Bibliometrics offers a distorted and incomplete picture, based on a fundamentally flawed logic. So, no …

Undoubtedly, for the first time we have a clear picture of not just where research activity is, but how good it is.

… we don't. And this post is an example of why:

Julie Hare's Uni research report a blow to big-noters is wrong. Julie Hare's Uni research report a blow to big-noters is misconceived. Don't, under any circumstances, read Julie Hare's Uni research report a blow to big-noters.

Right, that is three times I have mentioned the author and title: according to the present practice of bibliometrics, this is identical in value to my writing

Julie Hare's Uni research report a blow to big-noters is a work of unsurpassed genius. Julie Hare's Uni research report a blow to big-noters will remain the foundation-stone for thinking in this area forever. What would we do without Julie Hare's Uni research report a blow to big-noters?

Which is another three citations of the author and title.
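To make the mechanism concrete, here is a minimal sketch, in Python, of the kind of naive counting being criticised. The counter is purely hypothetical, an illustration rather than any real bibliometric tool; the two paragraphs are the ones above.

    # A hypothetical, naive citation counter: it tallies mentions of a
    # title and is completely blind to whether the mention praises or
    # damns the work.

    TITLE = "Uni research report a blow to big-noters"

    damning = (
        "Julie Hare's Uni research report a blow to big-noters is wrong. "
        "Julie Hare's Uni research report a blow to big-noters is misconceived. "
        "Don't, under any circumstances, read Julie Hare's "
        "Uni research report a blow to big-noters."
    )

    glowing = (
        "Julie Hare's Uni research report a blow to big-noters is a work of "
        "unsurpassed genius. Julie Hare's Uni research report a blow to "
        "big-noters will remain the foundation-stone for thinking in this "
        "area forever. What would we do without Julie Hare's "
        "Uni research report a blow to big-noters?"
    )

    def citation_count(text: str, title: str) -> int:
        """Count mentions of the title; sentiment is invisible to it."""
        return text.count(title)

    print(citation_count(damning, TITLE))  # 3
    print(citation_count(glowing, TITLE))  # 3 -- identical "value"

Both paragraphs score three apiece; the counter cannot tell condemnation from praise.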

Only a poorly-programmed computer thinks that these two paragraphs are identical; and only a poorly-informed administrator (or reporter) thinks that these citation counts can tell you anything about research excellence.

Which is why it is unfortunate that …

Innovation Minister Kim Carr said the results of the evaluation … would inform government about resource allocation and would be a key measure of performance in funding agreements with universities.

[UPDATE 20 May 2011: Check out this very amusing response to journal ranking madness: increase your journal's ranking by rejecting everything!]
