Impact Lab

Researchers in Urban Studies take an outward-looking approach to developing research ideas and communicating findings. The purpose of our Impact Lab is to provide a forum for discussion and reflection on how knowledge exchange and impact can be developed within the subject area.

The lab supports deliberation on two fronts: first, the strategic positions on impact set out by the sector as a whole, the University and the College, and how these can be adopted and pursued; and second, broader debates on how the research and knowledge exchange interface in higher education is evolving.

The work of the Impact Lab includes an ongoing blog to provide reflections on established and emerging issues, and a series of events and workshops centred around questions within five themes.

Themes

1. Planning for impact

How do we design impact into our research approaches, be it in developing an academic article or a research grant bid? From recent attempts, what has worked well and what have been the challenges in meshing an impact agenda into a research proposal? What are the new expectations emerging from funding bodies (and others) on how impact will integrate with research design? How is challenge-led research re-shaping approaches to impact?

2. Understanding the demand for research evidence from users

What evidence are users looking for and at what points in a research project can and should they be engaged? How and in what forms is evidence put to work in policymaking? What are the unintended uses of research that researchers should be aware of, and how might that shape approaches to research design and dissemination?

3. Public engagement

How should we work with wider publics, beyond policymakers, on our research, and how should we communicate to non-specialist audiences about our research findings and why they matter? Here, public engagement may refer to (but is not limited to) connections with community groups and the third sector more widely. What are the implications of the UKRI’s “Concordat for Engaging the Public with Research”?

4. Impact and career development

How does the heightened importance of impact interface with career paths and day-to-day considerations such as workload? As the impact agenda takes a more central position within REF preparations, most notably through the preparation of impact case studies, what are the time demands on academics and how can these be managed and acknowledged? Also, how do we understand what impact implies, and needs to look like, for early career researchers through to senior staff (and what are the range of training demands that may stem from this)?

5. KE and impact in an international context

Urban Studies exhibits a global reach in its research activities and foci. See our International Research page.

A clear implication from this is that we need to understand impact in different contexts, where - compared with the UK - different norms, expectations and practices may prevail. What do different international contexts mean for how we approach impact activities and adjudicate what and when impact has been achieved?

Activities

Events and workshops are scheduled across the year, and these vary from internal discussions to public-facing events.  

Impact Lab activities include:

  • Practice reflections from staff in Urban Studies (reporting on impact successes and challenges, for example)
  • External speakers discussing the evolution of KE and impact agendas
  • External speakers discussing the use of research and evidence

Approaching the impact agenda – negotiating corruption and procedural restrictions

The impact agenda is now a key talking point yet remains an area that is somewhat ambiguous and hard to nail down. Once a peripheral concern, impact – defined in REF 2021 as “an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia” – takes an increasingly prominent place in numerous spheres of academic life, from developing research grants through to teaching. In this note, we draw attention to three recent contributions to impact debates (as listed above), all of which were published in 2021.

At a workshop in the autumn of 2022, involving Rebecca Madgin, David McArthur and Rob Richardson, and provoked by the papers above, we will consider the following questions:

  • Does (or can) impact corrupt academic work?
  • What are the institutional restrictions placed on communicating impact and how does this gloss over the “messy realities” of undertaking impact case studies?
  • To what extent do colleagues’ experiences of impact reflect “linear”, “relational” and “systems” strategies?

The idea that the impact agenda distorts academic work is taken up by Kidd et al. The authors are concerned with what they call “epistemic corruption”: the notion that impact leads academic work into dark corners in which it does not belong. Corruption, for Kidd and colleagues, is: “… damage done to people’s epistemic character by their subjection to conditions or processes that erode epistemic virtues such as curiosity and thoughtfulness and facilitate the epistemic vices like dogmatism or closedmindedness”.

Drawing on interview data from the UK and Australia, Kidd et al’s framing is binary in its portrayal of epistemic “vices” and “virtues”, yet finds resonance in the notion of “corrupting structures”. Here the authors flesh out three terms: “incentive structures” – which reward “false, inflated or exaggerated” claims to impact; “increased exercise costs” – “removing or weakening social norms of truthfulness or by introducing formal and informal penalties or punishments for those who are truthful, or by changing incentive structures in ways that militate against acts of truthfulness”; and “structural constraints” – where academics are “discourage[d] or prevent[ed]” from giving truthful accounts of their research practices.

Though Kidd et al’s call for closer “scrutiny” and awareness of the impact agenda, and its corrupting potential, is well made, there is a sense in the piece of academic ‘goods’ and ‘bads’ that may not be altogether straightforward to identify (and that must surely differ across disciplinary domains). In other words, what Kidd et al read as a slide toward untruthfulness may be, for someone else, simply a matter of presentational emphasis. However, wherever one is positioned on the spectrum from academic purity to pragmatism, the paper stands as an important corrective to the idea that impact and research are always or necessarily self-reinforcing, and offers conceptual language through which to consider the potential erosion of academic principles through impact activities or engagements.

Kidd et al’s account of “structural constraints” and of what the impact system sometimes requires of participants - “… the interviewee is pointing to the fact that no scientist could genuinely endorse the smooth, tidy accounts of the process of scientific research required by the impact system – ones that present an untenably neat account of research as a process that inevitably delivers guaranteed outcomes” – marries with the concern of Bandola-Gill and Smith (2021) for narratives of impact.

For Bandola-Gill and Smith, narratives of impact case studies are the core concern, and so the empirical focus in this work is on the processes and procedures of filling out impact case studies in the UK REF context. Moving away from the philosophical language of virtues and vices, the objects of enquiry here are the negotiations and tricks deployed by impact case study writers to fit within REF requirements. As with Kidd et al, the paper is enlivened by carefully selected quotes from interviewees.

For the authors, the impact case study resembles a “technology of governance” in which writers of the case studies are disciplined and coaxed into responses in various ways. Universities are positioned as localised adaptors to “external performance assessment systems”.

A key issue alighted on by the authors is that the case study format shapes storytelling in a “restrictive” way. That is, universities take a view on what a “convincing” impact argument looks like, with “best practice” examples rolled out to reinforce this. This nevertheless creates tensions between the “messy realities of impact and the much neater accounts required for impact case studies”. From here, the authors flesh out the three component parts of impact narratives: the “plot”, the “moral” and the “heroes”. Starting with the plot, the authors point to the “linearity” of the case study format and structure, which belies the often messy and complex nature of impact itself; that is, a veneer of cohesiveness layered over a much more haphazard and idiosyncratic reality. The next component, the “moral”, refers to the “lesson” that stems from the case study. Here the requirement for a specific outcome and discernible change leads to quantification and metrics. It is also striking to see the quest for monetisation in Bandola-Gill and Smith’s analysis; public value, it seems, equals economic value. Finally, the “heroes” are the academics who feature as the “main characters” of the story, while others are rendered in a somewhat “passive” though “present” way; that is, the heroes lead the charge, working with others to achieve change.

Altogether, Bandola-Gill and Smith’s portrayal of impact narrative construction shows how the complex worlds of impact, as actually experienced, are bounded and governed in order to achieve an auditable output.

In the work of Hopkins et al, there is little concern given to the ethics or politics of impact; attention is focused, rather, on “maximising” the impact of research through a focus on organisations working to improve “research-policy engagement”. That is, the impact agenda is taken as a given and the aim is to consider how to make the best of it. The authors, based on a systematic mapping of the research-evidence landscape (coupled with interviews), identified a series of impact approaches (derived from Best and Holmes, 2010) – “linear”, “relational” and “systems” – to characterise the different relationships between evidence and policy. The authors concisely define the three approaches as follows: “Linear strategies treat knowledge as a product to be made by researchers and supplied to policymakers, focusing on the communication of research. Relational strategies foster collaboration between researchers and policymakers to produce usable knowledge. Systems strategies seek to maximize evidence use by understanding and responding to dynamic policymaking systems” [italics in the original].

Notably, Hopkins et al’s paper resembles, in form, a policy output, and this is particularly true of the “seven general recommendations” set out for “designers, implementers and users of strategies to increase research impact”. The seven recommendations are:

1. focus less on the abstract and shift to what policymakers “actually do”;
2. consider the balance of support between individuals and institutions;
3. clarify the aim of engagement;
4. arm researchers with an understanding of policy processes;
5. evaluate effectiveness in new ways;
6. evaluate initiatives based on expectations;
7. set out the nature of the relationship between academics and policymakers (client or independent).

So, despite our somewhat crude attempt at article sampling, the three articles summarised above show a lively and reflexive scholarship on research impact agendas, ranging from the thorny value-based trade-offs highlighted by some to the more procedural and efficacy-based concerns elaborated by others. Moreover, these literatures provide a useful basis for reflecting on our own impact practices and concerns. From Kidd et al’s concern for the principles at stake to Bandola-Gill and Smith’s and Hopkins et al’s attention to the procedural level, the impact agenda can be seen to steer academic work (and experiences) at various layers of abstraction.

We will discuss these questions at a workshop on August 22 at 2pm, where colleagues in Urban Studies will offer reflections on each of the papers. Colleagues across the school who are interested in attending should contact David Waite - david.waite@glasgow.ac.uk.