REF’s experiment with research culture was always doomed

Pretending research environments could be measured by metrics or policies ignored how scholarship actually relies on peer-to-peer relations unique to academic cultures, say Martin Holbraad, Dan Nightingale and Aeron O’Connor

Published on
April 13, 2026
Last updated
April 13, 2026
Researchers discuss an experiment
Source: gorodenkoff/iStock

Just over four months ago universities woke up to a reversal of one of the Research Excellence Framework (REF)’s most ambitious proposals for 2029: to assess UK universities partly on the basis of their research cultures.

This about-turn, announced on 10 December, followed a Research England pilot study of the ways research culture would be assessed and measured. The pilot revealed that while assessing research culture at scale still has merit, how to do this consistently within and across institutions is a major stumbling block. Accordingly, the “people, culture and environment” element of REF 2029 had been rejigged and renamed “strategy, people and research environment”.

As anthropologists, we have been conducting ethnographic research since 2022 on how research culture is understood and experienced within the intricate ecosystem of universities. The REF’s reversal does not come as a surprise to us because, as we have found, formal efforts to foster positive research cultures focus heavily on institution-wide processes and policies, which sit awkwardly alongside – and are sometimes in friction with – the everyday realities of research culture.

These everyday realities – in essence, the cultures that shape the experience and outputs of research – are made up of a varied tapestry of relations and local dynamics between researchers and other university staff. But this “relational” take on research culture is quite distinct from the “processual” take, which focuses on the institutional processes through which research culture is formally imagined, managed and evaluated.


Of course, clear and transparent processes and structures are indispensable to fostering a positive research culture. However, the fact remains that the relationships that actually make up these cultures are the lifeblood of research.

Take the experience of Andrew, a junior lecturer in a particular department of a Russell Group university. Before taking up this post, Andrew gave a seminar presentation at the same department. The paper was well received and when a lectureship opened several months later, he applied and eventually secured the job. On reflection, Andrew believes his memorable paper helped him stand out among the longlisted applicants (chosen with reference to essential criteria in the job description) and make it on to the shortlist.


Now, a process-focused understanding of research culture would likely raise concerns about this case. How would the hiring committee’s putative perceptions of Andrew’s seminar paper align with principles of fair recruitment? If Andrew’s surmise is correct, is this not an example of undue bias? These are legitimate and important concerns.

However, what this way of viewing research culture fails to take into account is the inherently relational quality of research, which seminars express and foster as focal social events in departmental life. For early-career researchers in particular, they offer opportunities to break free from institutional hierarchies, be recognised beyond immediate networks, and leave impressions that may, indeed, later influence career trajectories. Alongside rigorous, fair and transparent processes, academic hiring unfolds within a social landscape shaped by these relational encounters.

How, then, might the distance between these on-the-ground social dynamics and the often standardising processes that are used to manage “Research Culture” (emphasis on upper-case R and C) be bridged?

As an example of the complexities involved, take the “Research Culture Report” (RCR), an interactive data visualisation platform created by one university in our study in order to make research culture within the institution visible to university management. The idea was to present “the word on the street in a structured way with colours”, according to someone involved in its design. It presents highly condensed summaries of academics’ “lived experience” (taken from survey data) alongside various quantitative indicators to provide a snapshot of local research cultures across the institution and to support institution-wide interventions.

In our ethnography of the RCR’s creation and implementation, however, we found divergent expectations between the tool’s users and producers. Some of the staff (including academics) who fed into it felt it condensed complex qualitative findings into generic-sounding policy points lacking sufficient detail to be actionable, such as “there are concerns over fairness and transparency of promotions”. On the other hand, staff developing the RCR felt that this condensation was necessary for it to be effective as a tool for gauging research culture “at a glance” and driving improvement in a strategic, institutionally consistent way.


This exemplifies a productive tension between relational and processual perspectives. Both parties recognise and are responding to “relational” (social, cultural) issues (such as recruitment practices, team dynamics, staff well-being and collegiality). But processual logics rest on the assumption that everything of importance should be measured (and measurable). In reality, the relational characteristics of research culture are remarkably complex to measure (if not at least partly immeasurable).

So how might the desire to improve research culture be heeded while avoiding the impasse that the REF’s focus on processes has led to? Our suggestion is threefold.

First, effective interventions in research culture need to recognise that the problem is not solely one of devising processes, strategies or measurements. Crucially, the task is also to understand how those interventions articulate with the relational dynamics of cultures of research on the ground. This involves iterative, context-specific engagement with researchers. Institution- or sector-wide frameworks or measurements are a poor substitute for this.


Second, bridging the gap between processes and relations requires institutions to work at the problem from both ends. The work on developing road maps, metrics, training programmes, best practices and so on should be complemented by the work of resourcing grassroots and localised initiatives led by researchers themselves that foster the relationships that make research communities work best.

Last, as anthropologists we would point out that ethnographic research methods – available to many universities through their staff and students – are a powerful tool for engaging with the relational fabric of research culture. What makes ethnography unique is precisely its relational quality – understanding social and cultural dynamics by engaging with them. Ethnographic methods can provide institutions with a robust evidence base that probes the context-specific relationships that make research work.

The difficulties REF ran into with its processual approach to research culture should be a wake-up call. Research culture still matters, but to foster it, universities must take heed of its inherently relational character and avail themselves of the relational tools needed to cultivate and improve it.  

Martin Holbraad, Dan Nightingale and Aeron O’Connor are social anthropologists at UCL.


Reader's comments (8)

Interesting article, but also please note that many of these research cultures are not institutional at all, or even national, but are international. Many academics think they receive little help or support from their home institutions (in fact the reverse), and much of their stimulus comes from their global epistemic networks...
I take the point about the limitations of the “processual” approach, but if the “relational characteristics of research culture” are “remarkably complex to measure (if not at least partly immeasurable)”, then why try to assess research culture at all?
Ok but as far as REF is concerned the point is not to achieve a rich and detailed understanding of localised research cultures via some vast ethnographic project investigating them. It’s simply to incentivise ‘good enough’ practices and institutional behaviour, to keep universities somewhat honest in how they actually support and resource their research systems. So it’s right that process and system are where the focus sits; if we were evaluating the safety practices of an airline we would look at their procedures and record-keeping more than how employees relate to each other.
In light of the Professoriate’s vote of no confidence in the Vice-Chancellor Katie Normington among four no confidence votes and student protests, it is necessary to articulate the depth of concern surrounding the recent imposition of unilateral Key Performance Indicators (KPIs) on all professors across Arts and Humanities and other faculties at De Montfort University. These contractual variations — redefining professorial roles as 20 percent research-only positions conditional on meeting two-year KPI targets — amount to an unlawful breach of contract and employment law. They were introduced without consultation or consent and operate through coercive conditions that have no grounding in sector norms or in legitimate measures of academic performance. The KPIs themselves have been modelled, in effect, on the performance patterns of the top 10 percent of researchers in STEM disciplines at elite Russell Group universities, and then applied wholesale across all disciplines. This methodological transposition is entirely inappropriate. It fails to acknowledge the structural, epistemic, and economic distinctions between the Arts and Humanities and STEM research. In STEM, large-scale external funding and annual grant cycles are integral to laboratory-based research environments; outputs are generated through collaborative projects supported by external sponsors. By contrast, Arts and Humanities scholarship depends primarily on critical inquiry, independent authorship, archival and library research, and small-scale grant schemes — most of which are modest and sporadic, not annual multimillion-pound bids. Applying STEM-derived KPI metrics to humanities professors thus produces targets that are unattainable in practice and conceptually incoherent. For example, the demand that professors produce REFable 3* outputs annually and secure grant applications of £100,000–250,000 per year effectively conflates two incompatible models of scholarship. 
The framework treats every arts academic as a STEM principal investigator operating within an industrial research economy. In English Literature and similar fields, where research is funded through QR allocations and sustained by published monographs, editing, or critical essays, forcing annual large-scale grant submissions not only distorts the purpose of the discipline but risks inviting unethical behaviour. Professors could feel obliged to invent inflated or unnecessary costs merely to simulate compliance — a practice that would constitute research misconduct under UKRI’s integrity standards. The requirement to “generate income” on this model misrepresents the nature of Arts and Humanities research, which does not require laboratory budgets or commercial partner funding for its advancement. These KPIs, when tied to conditions of contract renewal or research time allocation, have created a workplace culture of fear, intimidation, and silence. They operate not as developmental benchmarks but as punitive instruments of control, used to threaten academic staff with the withdrawal of research hours or demotion through enforced workload variation. Faculty members are coerced into metric compliance under the constant implication that failure to meet unattainable STEM-modeled targets constitutes underperformance. Such conditions fundamentally breach the implied term of trust and confidence in the employment relationship and undermine academic freedom, collegiality, and the right to pursue research without managerial duress. The use of top-tier STEM criteria to judge Arts and Humanities scholarship is therefore not a matter of poor calibration but of structural injustice. It erases disciplinary difference, disregards lawful contractual protections, and replaces collegial evaluation with numerical surveillance. 
Professors are being required to meet targets that would only be feasible for the most heavily subsidised experimental researchers at the richest universities in the country. Within a post-92 institution, this approach is not simply unrealistic — it is institutionally destructive. It sets academics up to fail, erodes morale, and threatens the university’s capacity to produce genuine scholarship. If unchallenged, this regime will cement a managerial culture rooted in fear rather than trust — a climate where intimidation replaces integrity and compliance replaces creativity. The necessary response is immediate rollback of the unilaterally varied contracts, suspension of all conditional KPIs, and restoration of lawful, discipline-sensitive standards for academic research and performance measurement. Any serious commitment to quality in higher education must recognise diversity of scholarly practice and affirm that excellence cannot be enforced through fear or quantified through borrowed STEM metrics.
Yes these came in directly after the 4 no confidence votes.
Katie Normington's DMU:
  • Unilateral contract variation – The university has altered all professorial contracts without consent, breaching both contract and employment law. Professors formerly on 40:40:20 contracts (teaching : research : admin) have been moved to 80 percent teaching and 20 percent research, with the 20 percent now made conditional upon meeting new Key Performance Indicators (KPIs).
  • Invented concept of contingent research time – Management has introduced a wholly new and contractually unsupported idea that research hours are only granted to academics who are members of a university research institute. Professors must now join or remain in institute membership to access their contractual research allowance, regardless of whether that institute is relevant to their discipline.
  • Two-year renewal cycle – Professorial institute membership and associated research time are subject to renewal every two years, conditional on meeting KPI targets. This effectively subjects long-term employment rights to periodic managerial review and erodes the contractual security normally attached to academic posts.
  • Unrealistic and non-sectoral KPIs – The KPI model copies the performance patterns of the top 10 percent of STEM researchers at elite Russell Group universities and applies them indiscriminately to all academics, including those in Arts and Humanities. Targets such as annual grant applications worth £100k–£250k with industry partners and annual REFable 3* outputs far exceed disciplinary norms and the funding ecology.
  • Disciplinary incoherence and ethical risk – In fields such as English Literature and related humanities, research is library- and archive-based, funded via QR allocations, and does not typically require large external budgets. Forcing annual commercialised grant-seeking risks encouraging invented costs or unnecessary applications, behaviour potentially qualifying as research misconduct under UKRI integrity codes.
  • Use of coercion and intimidation – This framework operates through managerial coercion. Professors are made to fear loss of their remaining research time or effective demotion if they fail to meet arbitrary KPIs. The result is a culture of intimidation and silence, incompatible with academic freedom and collegial discourse.
  • Erosion of lawful employment protections – Linking the continuation of a contractual term (research time) to KPI performance and institute membership is a fundamental breach of express and implied contractual rights. It amounts to an attempt to override employment contract terms through managerial practice rather than lawful negotiation.
  • Distortion of institutional priorities – By prioritising metrics designed for elite STEM laboratories, the policy undermines the university’s arts, humanities, and social research base. It substitutes numerical compliance for intellectual merit, producing performative activity instead of genuine scholarship.
  • Consequences for research culture – The new framework replaces collegial academic governance with surveillance and fear. Professors are subjected to constant conditionality, making De Montfort’s research environment punitive and unstable rather than developmental or creative.
  • Required corrective action – This policy must be suspended pending legal and contractual review, institute membership should not be tied to workload entitlement, and research hour allocation should return to the contractual 40:40:20 proportionality, recognising legitimate disciplinary diversity and academic autonomy.
There is now a complete breakdown of relations between the Vice-Chancellor, Professor Katie Normington, and the wider professoriate and academic staff. The situation has deteriorated to the point where senior academics report being unable or unwilling to engage collaboratively with her, creating an atmosphere of profound disengagement and mistrust.
Despite clear evidence of this collapse in confidence, the Board of Governors appears institutionally captured and inert, failing to acknowledge or act upon the governance crisis. Leadership at the university is effectively paralysed, with decision-making stalled and morale at its lowest point in recent memory.
Gosh, how would they meet those KPIs on 80 percent teaching and admin? Glad I am at UCL: we were only expected to produce 3 x 3* outputs in the entire REF cycle, and no grants were needed in Eng lit and other arts and humanities subjects, let alone every year.
The professors affected have, in my assessment, strong claims across multiple heads: wrongful dismissal, unfair constructive dismissal, breach of the Education Reform Act 1988's academic freedom provisions, and — if collective bargaining agreements were bypassed — TULRCA violations. The institutional and personal exposure of the VC is, on these facts, substantial.
