Never combat intelligence

even if it is artificial

Let’s start with a bold statement:
I think that artificial intelligence will soon be able to research and write academic essays as well as humans can.

No value judgement attached: I am not saying this is a good or a bad thing, but it is surely an interesting one. For certain colleagues, it means that genuine education will be swept away by a tsunami of cheating. This explains their urge to combat these trends with ever more rules and regulations and tighter controls. For others, like myself, artificial intelligence, amongst other innovations, is yet another technical tool or aid. Schools and faculties will need to evolve their teaching and assessment strategies to take account of it. Helping them to do that is a big part of my job.

Panic about technologies impacting learning dates back 2,500 years

Similar outcries were voiced by Socrates, and again after the invention of the printing press in the 15th century. They were heard in the last century's seventies, eighties and nineties, as well as in this millennium's noughties, for different reasons each time. At the risk of forgetting something, just think of the arrival in classrooms of personal computers and word processors, and later portable laptops; the introduction of pocket calculators that could plot mathematical graphs; dumb and now smart mobile phones; research tools such as Google and Wikipedia; social media in general; cloud computing; et cetera.
Warnings about unintended consequences for students (and tutors) and their skills have never really gone away. And now, as technology powered by artificial intelligence becomes ubiquitous, the debate has turned up the volume for another round.

From personal experience, the members of staff charged with enforcing academic integrity are already struggling to keep up with advancing technology. Recently emerged Word Spinners (Lee, 2021), which help students disguise plagiarised work by changing some of the words and phrases, will not make the enforcers' task any easier. Detecting plagiarism that has been disguised through such automated paraphrasing tools is incredibly computationally hard (Ross, 2021).

On the other hand, the same AI approach could be of great help to tutors during assessment by saving them from tedious and repetitive labour. "AI assistance" may offer "suggested answer groups" for questions requiring one-line textual or mathematical answers, allowing academics to mark and give feedback to everyone who gave a similar answer simultaneously (ibid.).

An academic cold war of technological armament is not a viable solution

Point 7 of last week’s NO-School Manifesto mentioned that we should value the use of the senses, the intellect, and imagination, rather than systems of punishment, control, and constriction.
I do not believe in the idea of fighting tech with more tech, as it is clearly only a tiny and rather ineffective part of what is really needed. The difficulty of detecting cheating (whatever that means; just think of the Kobayashi Maru) only underlines the importance of education. Schools and tutors must not rely purely on technological tools. They need to work more closely with students and the real world. What we need to do is build positive relationships where we can have smart conversations about work, about citation, about what plagiarism is and what plagiarism looks like. Ultimately, all of the ed-tech companies, the cheating and the anti-cheating alike, frustrate those positive relationships (Stommel, 2021).



Changing the paradigm of assessment

Rather than obsessively checking for possibly plagiarised results, shouldn't we think in depth about how assessment looks, could look and should look?

Let’s discuss each new tool, but this debate should never be dominated by “assessment conservatism”. The idea that we should cling to the things we used to do out of familiarity or trust in old practices has to be thrown out of the window.
Let’s learn from history and consider how AI will influence the concept of authentic assessment:
Students must always be allowed to use “real world” tools in assessment exercises. Our think tank helps schools and companies to rethink current education and training. We develop and deliver new, contemporary forms of learning that work exclusively from a position of equality. We are noticing that increasing numbers of mature students at universities further sharpen the imperative to replicate real-world conditions in programmes and especially in assessments (not forgetting the learning outcomes). Students, like professionals, are looking for ways to get things done more efficiently and quickly. They want and need to learn the most, the fastest, the best, the most affordably (Ross, 2021).

Those working in higher education today had better ask themselves:
What is a professional going to have access to in the future? How will she or he acquire knowledge and get shit done?

A good school therefore prepares students for the world they are going to be active in: not just the now, but the unknown future. If we don’t give students the opportunity to decide when it is appropriate to use AI tools and when it is not, and to make the best use of them, we’re not really giving them the sort of education they’re going to need.
