Touch MI

A Touchless Musical Interface

More often than not, making music involves physical touch. While the connection between artist and instrument can be very special, it can also hinder one's expression. Touch MI is an exploration of TOUCHless Musical Interfaces, inspired by an application of machine learning called pose estimation.

Pose estimation is the task of estimating the positions of a person's body parts from a photo or video. The resulting locations of key points (e.g., wrists, shoulders, nose, or eyes) are used to control musical parameters. For example, the vertical position of the left hand can be assigned to control the pitch of a synthesizer, while the horizontal position controls its volume. The possible mappings are endless and not limited to instruments.
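To make the mapping idea concrete, here is a minimal sketch in TypeScript using the Web Audio API. It assumes a pose-estimation model (such as MoveNet or BlazePose) that returns key points normalized to [0, 1]; the `Keypoint` shape and the `applyLeftHandMapping` helper are illustrative, not part of the Touch MI codebase.

```ts
// Illustrative sketch: map one key point to two synth parameters.
// Assumes a pose model that yields keypoints normalized to [0, 1].

interface Keypoint {
  name: string;   // e.g., "left_wrist", "right_wrist", "nose"
  x: number;      // horizontal position, normalized to [0, 1]
  y: number;      // vertical position, normalized to [0, 1]
  score: number;  // detection confidence
}

const ctx = new AudioContext(); // in a real page, resume this on a user gesture
const osc = ctx.createOscillator();
const gain = ctx.createGain();
osc.connect(gain).connect(ctx.destination);
osc.start();

// Left wrist: vertical position -> pitch, horizontal position -> volume.
function applyLeftHandMapping(keypoints: Keypoint[]): void {
  const wrist = keypoints.find((k) => k.name === "left_wrist");
  if (!wrist || wrist.score < 0.5) return; // skip low-confidence detections

  // y = 0 is the top of the frame, so invert it: raising the hand raises pitch.
  const pitchHz = 110 * Math.pow(2, (1 - wrist.y) * 3); // ~3 octaves above A2
  const volume = wrist.x;                                // left edge = silent

  // setTargetAtTime smooths the jitter inherent in per-frame estimates.
  osc.frequency.setTargetAtTime(pitchHz, ctx.currentTime, 0.05);
  gain.gain.setTargetAtTime(volume, ctx.currentTime, 0.05);
}
```

Calling `applyLeftHandMapping` once per video frame is enough; the short time constant on `setTargetAtTime` keeps the sound from stepping audibly between frames.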

Touch MI works with any camera (e.g., a webcam or phone), which eliminates the need for additional hardware. Furthermore, it is designed to communicate wirelessly over the web, allowing a whole orchestra of players from around the world to perform together simultaneously.
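The write-up does not specify the transport, but a WebSocket relay is one plausible way to share pose data between remote players in near real time. The server URL and message shape below are assumptions for illustration only.

```ts
// Hypothetical networking sketch: broadcast local poses, receive remote ones.

interface PoseMessage {
  playerId: string;
  timestamp: number;
  keypoints: { name: string; x: number; y: number; score: number }[];
}

const socket = new WebSocket("wss://example.com/touch-mi"); // placeholder URL

// Broadcast this player's latest pose, e.g., once per video frame.
function sendPose(playerId: string, keypoints: PoseMessage["keypoints"]): void {
  if (socket.readyState !== WebSocket.OPEN) return;
  const msg: PoseMessage = { playerId, timestamp: Date.now(), keypoints };
  socket.send(JSON.stringify(msg));
}

// Drive a local synth voice for each remote player as messages arrive.
socket.addEventListener("message", (event) => {
  const msg: PoseMessage = JSON.parse(event.data);
  // e.g., route msg.keypoints through the same mapping shown above,
  // keeping one synth voice per msg.playerId.
});
```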

This video is a short demonstration of the first prototype. The left hand controls the chords (y-axis) and the filter (x-axis). The right hand plays the melody (y-axis) and controls the reverb (x-axis).
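For reference, the prototype's mapping can be restated as a small declarative table. The `Mapping` type and target names are illustrative; only the hand/axis assignments come from the demo itself.

```ts
// The demo mapping as data: each entry binds one key-point axis to a target.

interface Mapping {
  keypoint: "left_wrist" | "right_wrist";
  axis: "x" | "y";
  target: string; // musical parameter driven by this axis
}

const prototypeMappings: Mapping[] = [
  { keypoint: "left_wrist",  axis: "y", target: "chords" },
  { keypoint: "left_wrist",  axis: "x", target: "filterCutoff" },
  { keypoint: "right_wrist", axis: "y", target: "melodyPitch" },
  { keypoint: "right_wrist", axis: "x", target: "reverbMix" },
];
```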