South African universities need to rethink how they evaluate research: social impact counts too

17th September 2024


There are many ways university researchers can share their work. Some of these methods are well-established in academia: people write journal articles, book chapters and entire books; they present papers at academic conferences. Other approaches are less traditional – writing blogs and opinion pieces, or producing creative works like theatre performances, documentaries and more.

Earlier in my career, while working as a research librarian at the University of Cape Town (UCT) in South Africa, I noticed that researchers who preferred those less traditional methods were viewed by university administrators and other academics as “less productive” than their peers.

That’s because universities and funding agencies depend on quantitative approaches and traditional outputs to measure research impact. For instance, they want to know how many journal articles you’ve published and how many times an article has been cited or referred to by other scholars. A journal’s impact factor is particularly important. This is the number of citations received in a year by the articles the journal published over the preceding two years, divided by the number of articles published in those two years. It is viewed as an indicator of the journal’s relative importance when compared with others in the same field.
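
To make the arithmetic concrete, here is a minimal sketch of that two-year calculation in Python, using invented figures for a hypothetical journal (not data from any real title):

    # Two-year journal impact factor, sketched with made-up numbers.
    # Citations received in 2024 to articles published in 2022-2023:
    citations_to_recent_articles = 1200
    # Citable items (research articles, reviews) published in 2022-2023:
    citable_items = 400
    # 2024 impact factor = citations / citable items
    impact_factor_2024 = citations_to_recent_articles / citable_items
    print(impact_factor_2024)  # 3.0

An impact factor of 3.0 means recent articles were cited three times each on average – and, being an average, it says nothing about how those citations are spread across individual articles.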

The greater a researcher’s impact, the better their chances are of securing research grants, being promoted or receiving institutional excellence awards. Universities’ own funding also relies to some extent on publications – they receive research publication subsidies from the Department of Higher Education and Training. The more “productive” a university’s academics, the higher its subsidy allocations.

But these quantitative indicators fall short of recognising and rewarding the many aspects on which a healthy scholarly ecosystem depends. They don’t, for instance, capture how a researcher’s work is producing change at local, community or societal levels.

Globally, there has been a drive in the past decade to prioritise the societal impact of research. Several global scholarly initiatives, among them the San Francisco Declaration on Research Assessment (DORA), advocate for the elimination of journal-based measures in favour of evaluating research qualitatively.

However, in South Africa and a number of other African countries, such as Egypt, Kenya and Nigeria, universities tend to rely on indirect, quantitative measures. There are some measures of, and awards for, research with societal impact, but these are few and far between. Overall, assessments are still heavily weighted towards the more traditional outputs.

In a recent paper with my PhD supervisor Professor Jaya Raju, I explored the shortcomings of quantitative assessment measures. We distributed questionnaires and interviewed researchers and staff at UCT to see what they believed should be done differently.

Based on our findings, we argue that universities cannot meaningfully contribute to social issues unless they rethink how they evaluate researchers. In the current system, researchers are pushed to prioritise research publications over anything related to societal impact. Yet public institutions and their researchers are funded by taxpayers’ money, and they have a responsibility to contribute to addressing societal problems. Part of the way this can happen is by embracing responsible, responsive, socially focused research evaluation systems.

Key challenges

We identified three key challenges with quantitative measures.

The first relates to how these measures affect researchers’ behaviour. By focusing on certain metrics like the number of publications or journals’ impact factors, universities push researchers to “publish or perish”. This leaves them with less time for more socially focused research.

The second challenge is discipline coverage and global south representation. Academia is dominated by publishers located in wealthier countries, particularly in the northern hemisphere. Journals also tend to focus on natural science disciplines. Research has shown that the humanities, arts and other social science disciplines are less represented in academic databases than the natural and hard sciences.

Moreover, publishers tend to show a bias towards global north journals and English-language publications. This means that African scholarship remains largely hidden: a 2023 study showed that, of the 2 229 African journals that exist, only 7.4% were indexed in Web of Science and 7.8% in Scopus. This is a problem when research assessment systems prioritise quantitative approaches and traditional outputs – a lack of visibility in big journals and databases means researchers may be seen as “unproductive” when they are anything but.

The third challenge is that the data underlying these quantitative measures is open to misinterpretation and misuse. It has been shown that journal impact factors can be manipulated or inflated.

Shifting paradigms

I spoke with researchers in various faculties and at different stages in their careers who expressed their frustrations with the current evaluation model. One told me:

Current evaluation systems privilege researchers who have no responsibility outside of themselves and their institution … it privileges researchers and not people (who are also researchers) trying to change unjust systems or think about alternative systems.

Many of my interviewees felt there was no room in current research evaluation practices for paradigm-shifting thinking – the sort that would give researchers the space to show the impact of their research activities through qualitative indicators. Impact case studies are one example. These are evidence-based stories about the difference the research has made to the real world.

What can be done

We’re not suggesting that all quantitative measures should be set aside. Instead, they should be combined with both established and emerging qualitative indicators. Doing so can contribute to more inclusive, equitable research evaluation systems.

For instance, we recommend that universities in South Africa adopt context-sensitive and responsible indicators. An example would be using median or normalised citation scores rather than journal impact factors to measure researchers’ output.
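
Citation counts are typically highly skewed, so a mean-based figure like the impact factor can be dominated by one highly cited paper, while a median reflects the typical article. A minimal sketch, using invented citation counts rather than data from our study:

    import statistics

    # Hypothetical citation counts for ten articles in one journal.
    # A single outlier paper inflates the mean; the median resists it.
    citations = [0, 1, 1, 2, 2, 3, 3, 4, 5, 250]

    print(statistics.mean(citations))    # 27.1 - pulled up by the outlier
    print(statistics.median(citations))  # 2.5  - the "typical" article

Normalised citation scores go a step further, comparing each article’s citations with the average for its field and year, so that disciplines with very different citation habits can be compared fairly.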

Similarly, higher education leaders, policymakers and funders need to advocate for and support context-sensitive, responsible indicators in research evaluation. By collaborating at a national level, they can also initiate and support the broader reform of research evaluation.

Written by Andiswa Mfengu, Lecturer, University of Cape Town

This article is republished from The Conversation under a Creative Commons license. Read the original article.
