The study, titled Educators’ perceptions of generative AI: Investigating attitudes, barriers and learning needs in higher education, found that 81.5 per cent of respondents recognised the potential of generative AI. Over 70 per cent reported positive experiences using tools such as ChatGPT and Copilot for teaching-related tasks.
However, many are unsure how to apply AI effectively in their own teaching practice. As a result, actual use remains limited: only 2.8 per cent of lecturers reported using AI daily, and 14.3 per cent weekly—mostly for creating supplementary materials such as assignments or quizzes.
Growing gap
This knowledge gap may eventually create a disconnect between students’ needs and lecturers’ skills, the researchers warn. Such a gap could leave students inadequately prepared for a future in which AI plays an increasingly significant role.
Concerns over reliability
Many lecturers expressed concerns about the reliability of AI-generated content. The most frequently cited need (22 per cent) was the ability to better assess AI’s outputs. Additionally, 19 per cent highlighted the need to better understand ethical concerns and bias.
Researcher Mohammadreza Farrokhnia understands these concerns, though he believes they will diminish over time. 'You could see, for example, that OpenAI’s GPT-3.5 often hallucinated and confidently produced nonsense, but newer versions are already much more reliable. The challenge is that generative AI is still evolving rapidly, so you can’t design a fixed four-year curriculum around it. You need to continuously adapt.'
Impact on critical thinking
Many lecturers also fear that AI may hinder the development of students’ critical thinking skills. This concern echoes a recent open letter signed by numerous professors. A study conducted by the Massachusetts Institute of Technology (MIT) in June likewise found that students who wrote essays using ChatGPT showed significantly fewer neural connections and were less engaged with their work than those who wrote essays independently.
Burying one’s head in the sand
Farrokhnia is less worried about this. 'It frees up room for other types of thinking. Of course, as with any new technology, I think it's a trade-off. A common comparison is with the calculator. We've used them for years without much thought. They’re convenient, but they’ve made us less adept at mental arithmetic. At the same time, calculators help solve complex problems by freeing up our brains for other tasks. AI can do the same. If used properly in education, it can actually enhance thinking skills.'
He adds: 'You can’t just bury your head in the sand. Students are using AI already, and there’s no stopping that. It’s better to provide clear guidelines on how they may use it.'
To do that, lecturers need to invest some time in understanding the technology. According to the study, those who do are better equipped to guide students in using AI responsibly and in understanding its ethical implications. This also helps prevent over-reliance on AI tools and keeps students emotionally engaged with their own learning.
In Farrokhnia’s view, lecturers should therefore take the initiative to deepen their understanding. 'This is simply a development within education. As educators, we need to embrace it. In the end, it’s about the students, not about personal preferences. You only need to explore the aspects relevant to your field.' The UT lecturers surveyed indicated that they would welcome clear frameworks for this.
No time-saving measure
Farrokhnia is an outspoken advocate for using AI to support lesson planning and marking. 'I’m very open and transparent about it. My students know exactly where they stand and what I consider acceptable.' At the same time, he notes that he still reviews every AI-generated output carefully before sending feedback to students. 'All in all, it takes just as much time. So no, it’s not a time-saving tool. But it does help you provide more comprehensive feedback. Human minds sometimes forget to mention a particular point for improvement. AI fills in those gaps flawlessly. That makes the feedback more complete.'
Farrokhnia does not expect AI to replace lecturers. 'Ideally, you want the best of both worlds: the human touch of a lecturer who truly knows their students, combined with the precision of AI. That’s what leads to high-quality education.'
€750K for Group Work AI
In collaboration with partners from Eindhoven and the Open University, Farrokhnia recently received a €750,000 grant to develop a social AI agent to support group work.