Killer robots: ‘Be afraid, be a little afraid’

| Bas Leijser

Elon Musk and Stephen Hawking have already warned us: beware of artificial intelligence and ban lethal autonomous weapons. At a symposium organised by Studium Generale yesterday, three experts gave their views.

The lecture focused on autonomous weapons, or more specifically on drones and drone swarms employed by the military. With a speaker from the military, one from research organisation TNO, and a professor from the UT itself, there was certainly no lack of different perspectives.

‘This is leading-edge technology’

Frans Osinga, commodore in the Royal Netherlands Air Force and former F-16 pilot, described how the Dutch Ministry of Defence is mostly watching from the sidelines, observing how the ethical and legal debate is resolved before autonomous weapons are implemented on a large scale.

Osinga: ‘What we are discussing here is essentially happening in an empirical vacuum. This is leading-edge technology and the societal debate is still ongoing. It also gives me a sense of déjà vu, since many aspects of this debate are similar to the debate about drone warfare from 2007 to 2013.’

Osinga stated that autonomous weapons can and should be used in a controlled manner: ‘Be afraid, be a little afraid.’ This view was shared by the second speaker, Pieter Elands from TNO. Both argued that this technology can be controlled as long as humans are ‘kept in the loop’, the input programming is sufficient and ethical, and a ‘big red’ kill switch is in place. Accountability then simply lies with the person who controls the weapon or authorised its use.
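For readers wondering what ‘keeping humans in the loop’ means in practice, it boils down to a gating pattern: the system may only propose an action, a named human must explicitly approve it, and an override can halt everything at any time. The minimal Python sketch below is purely illustrative; the names (`HumanInTheLoopController`, `EngagementRequest`) are hypothetical and were not part of the symposium.

```python
from dataclasses import dataclass


@dataclass
class EngagementRequest:
    """An action proposed by the autonomous system, not yet approved."""
    target_id: str
    confidence: float  # the system's own certainty, 0.0 to 1.0


class HumanInTheLoopController:
    """Gate every engagement behind an explicit human decision."""

    def __init__(self, operator: str):
        self.operator = operator          # the accountable human
        self.kill_switch_engaged = False  # the 'big red' switch

    def engage_kill_switch(self) -> None:
        """Immediately and unconditionally disable all engagements."""
        self.kill_switch_engaged = True

    def authorise(self, request: EngagementRequest, human_approves: bool) -> bool:
        """Allow an engagement only if the kill switch is off and the
        operator explicitly approved; record who decided, for accountability."""
        if self.kill_switch_engaged:
            print(f"{request.target_id}: blocked, kill switch engaged")
            return False
        verdict = "approved" if human_approves else "denied"
        print(f"{request.target_id}: {verdict} by {self.operator}")
        return human_approves


controller = HumanInTheLoopController(operator="operator-001")
request = EngagementRequest(target_id="T-01", confidence=0.93)
controller.authorise(request, human_approves=True)  # the human says yes
controller.engage_kill_switch()
controller.authorise(request, human_approves=True)  # blocked regardless
```

The key design point, echoing the speakers, is that the system itself never decides: it can only ask, and the human’s name travels with every decision.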

‘Guide development rather than seek control’

Peter-Paul Verbeek, UT professor of Philosophy of Technology, offered a slightly different viewpoint. He argued that we humans have a tendency to try to control technology, but that history shows technology often escapes our grasp. As an example, he pointed to early fears that the steam engine would make many jobs redundant. ‘However,’ Verbeek said, ‘we simply adapted and changed as a society. While we cannot stop autonomous weapons from being developed, we can guide their development rather than seek total control.’

He proceeded to make another analogy: ‘When a child becomes an adult, at some point we no longer hold the parents accountable for his or her actions. It should be the same with autonomous weapons, especially when they have an AI capable of self-learning and self-awareness. At some point, you have an autonomous system making decisions on life and death, which means that we should strive to help it grow up.’
