Universities must promote AI augmentation, not automation

When technology is deployed to augment individuals’ capacities, they retain the power to shape social and political outcomes, says Stephanie Marshall

April 7, 2025
A robot helps a man write something, illustrating AI augmentation
Source: Valerii Apetroaiei/iStock

Since the launch of ChatGPT in November 2022, most academics have been grappling with how to maintain the integrity of learning and assessment. For university leaders, however, the focus has been on anticipating and mitigating the existential risks posed to their institutions by AI, especially in an era of multiple financial challenges.

Those threats include shifting student expectations, restructuring of the workforce and changing government priorities. Indeed, in many countries, the last of those has already happened – and we need to respond to it.

That is certainly true of the UK. Earlier this year, the government affirmed its ambitions to make the country an AI superpower and, to that end, integrate AI into education.

At global edtech exhibition the Bett Show in January, for instance, education secretary Bridget Phillipson stressed the need to “take up this great new technological era to modernise our education system, back our teachers and deliver for our children”. Her comments came just a week after science secretary Peter Kyle launched the government’s AI Opportunities Action Plan, which encourages higher education institutions to expand the provision of AI-literate graduates and help “shape the AI revolution based on principles of shared economic prosperity, improved public services, and increased personal opportunities”.

But how should universities respond to such mandates – both as teaching institutions and as large employers?

Erik Brynjolfsson, director of the Digital Economy Lab at the Stanford Institute for Human-Centered AI, argues that shared economic growth is closely tied to whether technology augments or automates human labour.

When technology is used merely to automate and replace human capacities, individuals lose both income and political power, risking destabilisation of democratic institutions. However, when technology is deployed to augment individuals’ capacities, they not only gain control over immediate outcomes but retain the power to shape social and political outcomes. It is this that universities need to promote.

Both private and public organisations are increasingly considering which tasks may be automated and to what extent. Within universities, departments and functions are reviewing how AI-powered tools can enhance efficiency and effectiveness in areas such as admissions, human resources, marketing, grading or the delivery of education and research programmes.

An example from the private sector is JP Morgan’s efforts to develop an automated recruitment system. The tool assigns a “confidence score” to candidates based on publicly available information about them, such as the proximity of their connections to current employees and their fit for the role.

Putting aside for a moment the potential ethical risks of such systems – which could perpetuate existing and inequitable patterns of hiring or assessment – the crucial point to note in the debates on automation versus augmentation is that even the most automated systems require continuous rounds of iteration – that is, augmentation: human input.

For instance, while JP Morgan developed an automated recruitment system, its development was made possible only through the contributions of a team of HR employees over the course of a year. And although the system will allow automation for a period, it will need to be updated as the organisation’s operational environment changes.

The managerial lesson here can be applied across university functions and beyond. Since, at this point, we are far from achieving full automation of tasks, the efficiencies we seek as universities ought to be pursued through a collaborative and augmentative relationship between technology and individuals.

Without rushing into the logic of automation, university leaders must encourage and empower individuals to identify principles, rules and objectives to shape and reshape technology, while closely observing how individual and collective behaviours and functions evolve through the interaction between humans and new tools. Our plans should continually focus on augmenting our human resources, or what I refer to as “empowerment” in my 4E leadership model (Engaging, Energising, Empowering and Engaged).

We should ask: How can our students become better lawyers, researchers, writers, engineers, strategists and entrepreneurs by leveraging AI? How can our academic staff enhance teaching with emerging edtech tools? How can we reach broader audiences, teach more effectively, instruct more efficiently, and foster deeper learning?

In the debate on automation versus augmentation, universities have a unique incentive, function and responsibility – not merely to replace or substitute human input and intelligence but to augment individual and collective capacities.

While the primary mandate for universities is to train AI professionals, institutional AI education strategies must go further. They must recognise that the sustainability of the institutions we envision will ultimately depend on aligning with the broader ambitions of the AI Opportunities Plan: shared economic prosperity, improved public services and increased personal opportunities.

Stephanie Marshall is vice-principal (education) at Queen Mary University of London. The third edition of her book, Strategic Leadership of Change in Higher Education, is forthcoming.

Reader's comments (1)

The theme of augmentation is all well and good. However, the first step in any information-system process should be change analysis: what is the need for change, and based on what evidence? Only then does one consider possible solutions. You seem to be jumping to one solution, augmenting with “AI”, just because it is there, without following due intellectual process. If the aim is to make cost savings, then that needs to be quantified and offset against the risk that “AI” services might increase in cost. What about the institutional risk due to hallucination or poor training data? Yes, staff are more than machines for ticking boxes and making recruitment decisions; their experience matters. However, to rephrase Mencken: “No one ever went broke underestimating the desperation of universities for quick and easy solutions”. Introducing “AI” as a tool to augment judgement risks increasing reliance on that tool in the longer term, on the grounds that “we’ve made this investment, we need to get full value from it”, without considering what the original problem actually was.
