Artificial Ignorance

| Niels ter Meer

Contrary to the hype, this column is not written by an AI. But student columnist Niels ter Meer thinks letting computers do the work of writing and studying is the antithesis of why we’re here.


Artificial Intelligence, and ChatGPT in particular, is all the rage at the moment — I can’t avoid it. According to some, these systems represent a massive leap in technological progress. They can converse with you, (re)write your reports, or even write them for you entirely! But I think that’s missing the point, especially here at uni.

The point is that we attend the UT to further our own and our collective understanding of the universe around us. To me, the keywords are ‘own’ and ‘collective’. We study to further our own understanding, which we then project outwards through our writing. Outsourcing either to an AI is the antithesis of the academic pursuit; it’s ignorant of why we are here.

It becomes even worse when you think about how AIs actually work. Machine learning is, in essence, throwing a big pile of data into a cauldron of linear algebra and stirring until it looks right. We have to wonder where all that data came from; I don’t think we can ethically do science when the data’s provenance is dubious. But even then, when our model is done cooking, have we actually furthered our understanding, or are we just looking at a highly entropic pile of parameters, the meaning of which is anything but clear?

And then let’s talk about the string of words that pile of parameters produces. I’ll go out on a limb and assert that’s all ChatGPT does: it produces the most likely sequence of ‘tokens’ given an input. No inferences, no thoughts, just very confident bluffing. When I write, by contrast, I am reaching out to you, conveying my thoughts and knowledge. I understand my own arguments (or at least mostly), I can give examples and point out the pitfalls, because I ran into them or at least thought about them myself. And I can try to understand you, something AIs can’t.

That’s not to say machine learning is completely worthless. It may be a stretch, but surrogate modelling, image classification, or a ‘fancy search engine’ come to mind. Machine learning is a tool, and we humans make tools to — in principle — make our lives easier. But you wouldn’t (or at least shouldn’t) use a circular saw without critically thinking about the many ways it can make your life significantly harder — think missing fingers or even limbs. Some skip that thinking step when it comes to machine learning, ogling only the pile of essays and reports they no longer have to write, without realising how sidestepping that writing will make their lives significantly harder later on. They never actually understood what they purported to have written.

Again, that’s what we’re here for. We’re here to think, to ponder, to understand — and then to communicate and apply. Not to generate Artificial Ignorance.
