‘Turn any movement into music’

| Michaela Nesvarova

UT student Robin van Soelen (25) has created an embodied music controller, which uses motion sensors and machine learning to let you make music through movement. ‘I wanted to make electronic music concerts more engaging.’

‘As part of my Master’s thesis, I created a music controller that you operate by moving your body,’ says Van Soelen, who graduated in Interaction Technology at the UT a couple of weeks ago. ‘The original idea was to find a more engaging and novel way to produce music.’

The controller allows you to make music using any chosen movement or gesture. ‘You press a button, you make a movement of your choosing and, thanks to machine learning, the program recognizes the movement as one of the pre-trained gestures,’ explains Van Soelen. ‘Once you have made the gesture, my program sends a signal to the music software, which plays the pre-recorded sound linked to that gesture. This way you can turn any movement into music.’
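The article does not describe Van Soelen’s implementation in detail, but the pipeline he outlines — record gesture templates, match a new movement against them, then trigger the corresponding sound — can be sketched in a few lines. The sketch below is purely illustrative: the template traces, gesture names, and the note mapping are all hypothetical, and a simple nearest-template classifier stands in for whatever machine-learning model the actual controller uses.

```python
import math

def resample(trace, n=16):
    """Linearly resample a 1-D sensor trace to n points so traces
    of different lengths can be compared point-by-point."""
    if len(trace) == 1:
        return [trace[0]] * n
    out = []
    for i in range(n):
        pos = i * (len(trace) - 1) / (n - 1)
        lo = int(pos)
        hi = min(lo + 1, len(trace) - 1)
        frac = pos - lo
        out.append(trace[lo] * (1 - frac) + trace[hi] * frac)
    return out

def classify(trace, templates):
    """Return the name of the pre-trained gesture whose template
    is closest (Euclidean distance) to the incoming trace."""
    probe = resample(trace)
    best, best_dist = None, math.inf
    for name, template in templates.items():
        d = math.dist(probe, resample(template))
        if d < best_dist:
            best, best_dist = name, d
    return best

# Hypothetical pre-trained gesture templates (one sensor axis each).
templates = {
    "wave":  [0, 1, 0, -1, 0, 1, 0, -1, 0],  # oscillating motion
    "punch": [0, 0, 1, 3, 6, 9, 9, 9, 9],    # sharp sustained rise
}

# Hypothetical mapping from recognized gesture to a MIDI note number
# that would be sent to the music software.
GESTURE_TO_NOTE = {"wave": 60, "punch": 64}

incoming = [0, 1.1, 0.1, -0.9, 0, 0.9, 0.1, -1.0, 0]
gesture = classify(incoming, templates)
note = GESTURE_TO_NOTE[gesture]
```

In a real controller the final step would send `note` to music software, typically as a MIDI message; here it is left as a plain variable to keep the sketch self-contained.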

Robin van Soelen demonstrates the embodied music controller in this video. 

The controller is designed for electronic music performances. ‘I was looking at how electronic music concerts lack transparency. Artists hide behind buttons and tables, not allowing the audience to see what they are doing,’ says the recent graduate. ‘I think this embodied controller increases engagement from the audience and makes performing more fun for the artists. On top of all that, I really like music. I play many instruments and I’m into music production as well. So this project was a lot of fun for me.’

With his studies now completed, Van Soelen doesn’t plan to continue working on the music controller. ‘I’m not sure if there is much room for it on the market,’ he says. ‘Now I’m looking for the next challenge – ideally something musical and creative.’
