Why it’s crucial that universities collaborate on metrics fit for the future

06.05.2024

If universities want to escape the malign influence of league tables, they need to work together on new ways of measuring what they do. So says Professor Paul Wouters of Universiteit Leiden, Chair of the LERU experts writing group.

LERU has just published a report on Next Generation Metrics for Scientific and Scholarly Research in Europe. Why now?

There are a lot of reasons. First of all, the debate about reforming the research assessment system is gaining momentum in a number of countries, so the role of metrics is again on the table; as they stand, current metrics are not up to date and cannot play their part in that reform process. Second, there is increasing recognition of the perverse effects of standardised, one-size-fits-all indicators. Often, indicators that are perfectly valid at a higher level of aggregation, for example for universities as a whole, do not work well once they trickle down to individual funding decisions or career assessments. Third, we need to create alternatives to commercially packaged metrics. I'm not saying that companies can't play a role, but it's important that the criteria and design of metrics are not shaped solely by commercial interests. And fourth, we increasingly see the need to create open metadata. This means that the precise calculations and the raw data used to create metrics – anonymised, of course – need to be available, and this is often not the case with the current indicators.

That sounds like a lot for new metrics to achieve…

Next generation metrics are not the solution in themselves; they must be a supporting mechanism. The most important task is to reform the whole system of evaluation. So, if a university wants to know how it's doing, it first has to know what it wants to achieve. Currently, this is often defined in terms of competition, for example its performance in global university rankings. But this means the university’s role is defined by what other institutions are doing, and by standardised metrics that it can't control or even analyse properly, because they are closed.

Do alternative metrics already exist?

When it comes to research, one alternative is the CWTS Leiden Ranking, which has open metadata. If a university wants to know its position in internationally oriented research, and also in open research activities, it can simply grab the data from the ranking and check its position itself.
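To make the idea concrete, here is a minimal sketch of what "grabbing the data and checking your position yourself" could look like once ranking metadata is open. The institutions, indicator names, and figures below are invented for illustration; the real Leiden Ranking data files have their own structure.

```python
# Hypothetical sketch: with open metadata, a university can compute its own
# position on an indicator instead of relying on closed, commercial rankings.
# All data below is invented for illustration purposes only.
import csv
import io

# Invented sample of open ranking data: institution, share of internationally
# co-authored publications, share of open-access publications.
open_data = """institution,intl_collab_share,open_access_share
University A,0.62,0.71
University B,0.55,0.80
University C,0.70,0.65
"""

rows = list(csv.DictReader(io.StringIO(open_data)))

def rank_on(indicator, university):
    """Return the 1-based rank of `university` on `indicator` (higher is better)."""
    ordered = sorted(rows, key=lambda r: float(r[indicator]), reverse=True)
    return 1 + [r["institution"] for r in ordered].index(university)

# A university checks its own position on open research activity and on
# internationally oriented research, using nothing but the open data.
print(rank_on("open_access_share", "University B"))   # -> 1
print(rank_on("intl_collab_share", "University B"))   # -> 3
```

The point is not the few lines of code but the fact that, with open data and a published method, any institution can reproduce and scrutinise its own position.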

Can universities help themselves in other ways?

Universities have a lot of data about their own processes, but these are rarely used to measure performance or progress towards a university’s goals. In our report, we make a plea for universities not to give their data away, and then have to buy it back from the companies who produce indicators, but to start to produce indicators themselves. Universities should also work together on this, pooling the expertise needed in computer science, library science, metrics, and so on.

What are the most significant barriers to the adoption of next generation metrics?

University assessment has been going on for many decades, and the old indicators are now built into the infrastructure. So, the most important barriers are the implicit and unrecognised ways in which these indicators have influenced our whole way of thinking. That's why we need to take time and begin to experiment with new approaches.

What kind of experiments would help?

I could imagine, for example, that three or four LERU universities might collaborate on redesigning the way they do the annual performance interview with their researchers. Then another three or four universities might work together on implementing the CoARA [Coalition for Advancing Research Assessment] principles with respect to assessment of research groups. And another four universities could bring together their computer and library science people to see what data could most easily be harvested for meaningful metrics with respect to, for example, open science practices or the promotion of gender equality. Experiments like these would help people recognise that they can still use existing metrics, because the experiments put those metrics in a completely different context, harmonising and standardising them in a different way.

Prof. Paul Wouters ©Universiteit Leiden

Will it be hard to convince researchers to switch to an unfamiliar new system?

One reason we are asking for experiments is so that the new metrics will not be unfamiliar! And we are not advocating a complete revolution, where we suddenly do away with everything. In chemistry, for example, where the present indicators do say something useful, the steps would be perhaps a bit more gradual than in fields like history or philosophy, where counting citations makes no sense at all.

Will better metrics lead to better science and innovation?

Not directly, but they might lead to more liveable universities and universities that simply function better, and that would help create the conditions for better science. But whether or not a researcher is then able to be creative is not directly influenced by the metrics.

How can the university sector as a whole help improve the use of metrics?

The most important step now is for universities to join the CoARA initiative, because that will create the psychological and political circumstances necessary for researchers to start revising processes. Everybody then knows that they are not taking a risk by being the first to change, or by investing time and money in something that leaves them exposed.

How would you like to see your group’s new report used?

The report can be used in different ways by different parts of the university community. Some sections are very relevant for principal investigators and research leaders. Others are most relevant for the computer science and library science departments. Others again are written especially for the university board, the faculty deans, and other leaders. Taken together, the report has a lot of recommendations, but not everything has to be done at the same moment, and not everything has to be done by everybody. That's why we want to see small-scale experiments to create alternatives based on these principles. Then it's important that university leaders make it clear that they’re going to support this effort, with time and resources. And the authors of the paper have also expressed their willingness to act as a helpdesk, to see how this work can be implemented.

What do we risk if the use of metrics does not evolve?

It depends on which indicators remain dominant, but at the university level the present obsession with global university rankings is totally misplaced and really leads universities astray. We also run the risk of missing important breakthroughs, because researchers don't have enough time to spend on developing completely new fields. The counterargument is that researchers are highly motivated and will want to do this work anyway, but it's also true that lots of them are leaving universities because they no longer find them inspiring environments. And that's very problematic.

Your report says that next generation metrics for academic teaching will require an expert group. What are some of the challenges?

Very often student satisfaction is used and then fed back into either individual or group assessments, but this does not really measure teaching quality. Someone can be very dissatisfied with a course, but still learn a lot and find it very valuable in the long term. The alternative is some kind of peer review of teaching, but this can be intensive, since it requires people to sit in on the teaching and assess it. That's not easy to capture in metrics. And then how do you measure the quality of teaching materials? These are difficult questions, but I would like to see LERU take them on.

©LERU: Text by Ian Mundell.

Contact

  • Prof. Kurt Deketelaere, LERU Secretary-General, +32 499 80 89 99
  • Dr Alain Smolders, LERU Senior Policy Officer Open Science & Innovation, +32 479 98 38 32

Media contact:

  • Prof. Paul Wouters, Emeritus Professor of Scientometrics, Universiteit Leiden, & Chair of the LERU experts writing group
  • Bart Valkenaers, LERU Senior Policy Officer Strategic Communication & Public Affairs, +32 498 08 43 49