‘Diversity is a good scientific practice’

| Michaela Nesvarova

‘Machines that we interact with have power over us,’ says UT researcher Cristina Zaga. ‘Just think of facial recognition systems that have trouble recognizing dark-skinned faces and the inherent bias they carry. That is why we need technology that represents the entire population.’ The scientist is leading the project DEI4EmbodiedAI (Diversity, Equity, and Inclusion for Embodied Artificial Intelligence), which is officially kicking off today.

As its name suggests, the 4TU consortium is working towards more diverse and inclusive embodied AI, meaning any system that displays intelligent behaviour, including smart appliances, robots and home systems such as Alexa.

Diversity and AI

DEI4EmbodiedAI, funded by 4TU (the federation of the four Dutch universities of technology), is a research collective of academics from the University of Twente, the Delft University of Technology, the Eindhoven University of Technology and Leiden University.

It is led by Cristina Zaga, professor at the Human-Centred Design Group (Design and Production Management department) and a researcher at the DesignLab of the University of Twente.

The project aims to critique the existing problematic norms and provide tangible resources for practicing diversity, equity and inclusion (DEI) in developing embodied AI. It will involve a series of workshops, each focused on a different aspect of DEI, such as gender, race and ableism. The first findings and practical guidelines should be presented by the end of 2021, after which the scientists hope to continue the research within a bigger project.

Power of design

‘All scientists and designers bring their own perspectives, values and biases into their work,’ says Zaga. ‘Yet the technology we use reflects its designers. When you interact with your home system, for example, the interaction is almost humanlike. It makes you think there is some reciprocity of intelligence. And so if the system misgenders or mislabels you, it can make you feel excluded or even stigmatized. The choices of data we use and the behaviours we train these machines to exhibit therefore carry a lot of power. Designers essentially have the power to decide what the norm is and who can flourish as a human being.’

Biases

There are many examples of technology failing to represent the general population, explains Gijs Huisman, UT alumnus and one of the involved researchers from the Delft University of Technology. ‘Embodied AI is based on models fed by specific data – chosen by the designers. These models are based on the lived experience of the researchers, but not necessarily that of the users. There are well-known examples, such as robots modelled on a submissive woman, which perpetuates biases against women, or facial recognition that doesn’t recognize dark skin as well as lighter skin.’ The examples don’t end there, though.

‘Technology is usually designed for fully abled bodies,’ continues Huisman. ‘It assumes we can all hear, see and move the same way; but that is of course not the reality. Devices are also often designed for a specific group, but not with them. For instance, there is a glove that can translate sign language into spoken words. This piece of technology has not been embraced by the deaf community. They already have a language that they can use very well. Who is to say that we – the ones who can hear – shouldn’t learn sign language instead?’

Hands-on tools

The 4TU project was set up to reflect on these design practices and improve them. ‘We have two goals,’ says Zaga. ‘Firstly, we want to understand what the current situation is. We will work with all relevant stakeholders, practitioners and designers, as well as marginalized groups, and document all the perspectives we find. Secondly, we want to design hands-on tools that can help researchers create more diverse and inclusive AI.’

‘For us, the project is not only about social justice,’ stresses the UT researcher. ‘It is about providing science that represents the lived experience of the general population – for society, with society. We need to bring knowledge that reflects a diversity of races, gender identities, bodies, ages, cultural practices and so on. The focus on diversity, equity and inclusion is actually also a good scientific practice.’
