ChatGPT in education: how much work should we outsource to AI?


By Dr Adam Matthews
School of Education

OpenAI’s ChatGPT uses machine learning, trained on vast swathes of internet text, to generate human-like responses to questions and follow-up dialogue. Artificial Intelligence (AI) in this form is generative; it can create content such as audio, code, images, text and video – many of the creative endeavours that students and academics undertake in research, teaching and learning. Headlines proclaiming the death of the lecture or the essay at the hands of technological disruption are easy to find, and ChatGPT has many potential victims.

Universities are already warning students not to offload and outsource their assignment tasks to ChatGPT. Banning and policing such technologies feels both authoritarian and very difficult to achieve in practice. Reflectively considering how we accept, reject or offload tasks to new automated technologies is important for learners and educators.

ChatGPT has been described as an evolution of technology rather than a revolution. Technologies such as the pencil, the printing press, the calculator, radio, TV and the internet have all ‘disrupted’ education, knowledge creation and access. Search engines, for the first quarter of the 21st century, have thrown back links to websites – generative AI can now pull the content of those links together to answer a specific question in two-way dialogue. The spell checkers, autocomplete and chatbots that we see today are based on similar large language models.

ChatGPT is a continuation of big data, affording opportunities for the large-scale analysis of behavioural data, or in this case text, to identify patterns and trends. In education, generative AI continues long-running debates about plagiarism and academic integrity. It also prompts the wider question of how we go about teaching, learning and particularly assessment when ChatGPT can ‘get an MBA’. ChatGPT can essentially paraphrase the web – a skill we ask students to practise in order to avoid plagiarism. As with many other technologies, this development has prompted institutions to think about the future of education and assessment.

In 2008, Wired magazine proclaimed The End of Theory: The Data Deluge Makes the Scientific Method Obsolete, arguing that models, ideas and hypotheses are no longer needed when we can collect data on everything. This position does, however, strip out context and nuance. In a political era when politicians play fast and loose with facts, universities, and education more broadly, are places to call out facts, lies and the nuanced positions in between. In a recent creative writing workshop, I was inspired by a fellow writer who, in response to a question about knowledge and knowing, produced the line “we need bus timetables and poetry, and to tell the difference between them”.

This technological disruption does give us the opportunity to think about writing and other creative processes beyond the final, graded ‘product’. It might be a cliché to say that it’s more about the journey than the destination, but writing can have psychological benefits, and developing an argument or an artefact from a blank page, although it might not feel like it at the time, is a rewarding and fulfilling task.

Writing in the first person is a convention that can divide academic opinion. Maybe we now need the writer or maker’s voice and practice rather than the generic third person at which ChatGPT excels. Writing from a more personal perspective, or in different styles, is becoming more mainstream in academia. I have recently taken part in a story circles and comic production project, and have used story and dialogue to contrast different conceptualisations of the university.

ChatGPT draws upon texts that have already been written (currently everything before 2021). Teaching, learning and assessment should look to the unique contexts of individuals, different societies and their futures. This can still draw upon existing knowledge, evidence and ideas, but each context is unique. There are many ideas and approaches that we can use to work with, rather than against, rapid developments in generative AI. We do need to be careful not to fall into the trap of proclaiming the death of ‘traditional’ writing, and instead think about what is being written, why, in what medium, and how it is unique to the individual and to different contexts.

“…writing is inherently an act of connection. What emerges as we write in the moment is a multifaceted sense of self that is connected, through language, to other selves and to the world we share”



The views and opinions expressed in this article are those of the author and do not necessarily reflect the official policy or position of the University of Birmingham.
