‘Technology develops at the cost of humanity’

| Marie Klinge

Three experts discussed the topic of ‘safety versus privacy’ at yesterday evening’s Lightbulb Chat at the DesignLab. They turned the event into a lively encounter, discussing technology and humanity, big technology companies and risks. ‘You need a security expert when designing a product.’

Photo by: Suyash Sharma

Every quartile, the DesignLab organizes a Lightbulb Chat for students, employees, and experts, building a community around a shared interest in a certain topic. This time, the invited speakers were Philip Zimmermann, a cryptographer, Ringo Ossewaarde, an associate professor in Public Administration, and Adam Henschke, professor in Ethics and Technology.

The value of technology

Zimmermann started off with the statement that ‘security and privacy are not opposite sides of the coin’. In his speech he gave examples from banking, the military, business, and government to support his standpoint that ‘strong encryption enhances national security’. Taking the opposing view, Henschke opened his speech with ‘Stop information sharing to stop the bad guys’. ‘There is a push from national security to cut back on privacy’, he emphasized. ‘Encryption technologies are used by terrorist groups to recruit and to get their message out.’

Furthermore, Henschke reasoned that ‘technology develops at the cost of humanity. The most advanced AI technology emerges from ministries of defense. The superior technology wins the battle.’ Chiara Poli, a student who attended the event, saw a rationale behind this: ‘Our society has developed technologies but failed in humanity. Putting such a high value on technology means missing out on the wisdom from history.’ Ossewaarde agreed. ‘Big tech firms are anti-democratic. Their totalitarianism limits historical consciousness. Technological development is highly connected to the development of war.’

Digital surveillance and democracy

Ossewaarde opened his own speech with a quote: ‘Digital surveillance is worse than Orwell’. This encapsulated Ossewaarde’s view that digital surveillance is violating democracy: ‘In seemingly democratic states we also have totalitarianism. The West is not controlled by the government but by populism and money’. ‘It is almost like mind control’, remarked Roussi Roussev from the DesignLab DreamTeam.

‘The appeal of totalitarianism is not just fear, but comfort: knowing what to believe while not having to think for yourself’, concluded Chiara Poli. Ossewaarde agreed with her. ‘It boils down to our inability to live with chaos and our desire to have clear rules, morals, and directions.’

Considering the increase in digital surveillance and technologies, Roussev asked the experts: ‘The government wants to surveil so that people act according to its laws, simply because people feel they can be caught. So, can the government be the next god?’ Henschke waved that away. ‘There is a significant decline in the power of the state. Citizens think it is not the solution but the problem. They feel betrayed by broken promises.’ Ossewaarde agreed with him. ‘The authority of the government is limited, as there is widespread mistrust.’

Cyberphysical interaction: Who is responsible?

Henschke moved on to the next topic: the interaction between humans and computers. ‘Increased integration of cyberphysical innovations creates real risks for individual users. Hacking can cause serious damage to individuals, but also on a larger scale. Who do we hold responsible when something goes wrong?’ asked Henschke, citing the example of the Strava app fiasco. ‘You need a security expert when designing a product’, replied Zimmermann.

‘What happens in digital surveillance?’, asked a student with work experience in security. ‘How do we deal with profiling and judgement to find out if someone is actually malicious?’ ‘With digital surveillance it is no longer a human but a programmed algorithm that judges, based on statistics’, responded Ossewaarde. The student disagreed: ‘Humans program these algorithms, so does the system judge or the human behind it? Nothing can ever be completely digitally judged, because it is created by humans.’