The measure does not come out of the blue. The number of cyber risks and digital threats is increasing rapidly, and the UT cannot escape this either, says Henk Swaters, Chief Information Security Officer at the LISA service. 'We recently received reports from UT employees who saw an AI bot appear during a meeting', says Swaters. 'We took immediate action, but it shows that extra security is needed. The presence of such bots is dangerous: they can participate without you noticing.'
According to the security officer, the risk lies mainly in access to sensitive information. 'Once an AI bot enters a meeting, it may be able to access your files, both documents in the cloud and those stored on your own computer,' he warns.
Dangers of the AI bot
AI bots bring a variety of threats with them. But what exactly do these bots do when they join a meeting, and where do they come from? 'We see that they mainly come from outside Europe. During a meeting, you can recognise them by certain names, such as Notetaker, Fireflies, Otter.ai, Bot, VA or Assistant. Sometimes they even use a name resembling that of a real participant, or appear in that participant's place.' The advice is therefore to refuse suspicious accounts. If they are admitted, AI bots - like any other participant - can be removed.
As soon as bots gain access to a Teams meeting, they are capable of a surprising amount, according to Swaters. 'They listen in, record conversations, automatically convert them into text, and then send the minutes to all participants by e-mail. The problem is that those minutes can also end up with external recipients.'
AI models and analyses
According to him, the danger does not pass when a meeting is over. 'Anyone who opens an e-mail with such minutes unintentionally gives the tool permission to access files and documents.' The collected data is then used to train AI models and perform analyses. 'This data may eventually fall into the hands of third parties.' Swaters emphasises that UT policy states that employees may only use private applications for UT work under certain conditions.
The list of risks is only getting longer. Some bots can even automatically sign up for future meetings. 'That happens with Fireflies', Swaters mentions as an example. 'When you install this tool, it poses a significant risk to stored files. If many people use such tools, UT may eventually lose control over its (research) information.'
Danger over?
The access rules in Teams will apply to all new calendar events as of 3 November, but they do not yet apply to meetings that are already scheduled. 'For those, UT employees have to adjust the settings manually. Only then do you have control over who can and cannot participate in the meeting.'
The new adjustments should counter the dangers mentioned. But are the risks actually off the table? 'We actually wanted to make the settings even stricter,' says Swaters. 'Still, we hope that this approach is sufficient, so that it remains user-friendly.' If that is not the case, he does not rule out further steps. 'It is not inconceivable that the Teams settings will become even stricter in the future.'