Motion/Gesture tool for virtual foley and sound design

– First ideas and concepts

Researching ideas for my personal project took a little longer than expected. For this project I was planning to learn new skills, as well as create something valuable to present and use for my career aspirations after the master's course. As one of my main interests lies in sound design and foley for linear and interactive media, the project should be situated in that area.

When thinking about foley work, it seems to me that the actual recording of foley has not changed much since the craft's invention in the 1920s. Of course, equipment and technology shaped its development, but in essence the performance and recording of props is still the same. This sparked the idea of bringing the foley performance into the digital realm, while keeping the haptic feeling and the body performance in the real world.

I came across an interesting project, an application to trigger sounds with a Leap Motion controller, which appears to have been abandoned at some point and is no longer supported. I was not aware of the Leap Motion being used to trigger sounds, but I had already stumbled upon gesture control via glove controllers. A well-known example would be the MiMU Gloves demonstrated by Imogen Heap in her NPR Music Tiny Desk Concert from 2019. In that performance she controls parameters of a sampler and multi-effect application with hand gestures live while singing over it. The main function of the gloves in this performance is to create soundscapes, pads and rhythm; they seem to be used more like a backing instrument in combination with the artist's voice.

As there are already applications that control foley and sound effects playback via audio input, for example the human voice, I had the idea to use sensors of some kind to trigger and control foley sounds via motions and gestures. This could happen through gloves or camera-based motion capture, but is not bound to those alone. Bringing this idea into my project, there could be an option to control the amplitude, pitch and playback speed of multiple samples simultaneously, as well as seamlessly blend between them; a rough sketch of such a mapping follows below. That would be one way to get different textures from multiple recordings and create something new out of them. There could also be a granular synthesis mode to obtain different sounds, depending on the situation and sonic vision. With motion and gesture control, this seems like an interesting and interactive way to play a sampler like a physical instrument.
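To make this more concrete, here is a minimal sketch in Python of how normalized gesture values could drive such playback parameters. Everything in it is a placeholder assumption for illustration (which gesture drives which parameter, the pitch range, the crossfade law), not a finished design.

```python
import math

def gesture_to_playback(hand_height, hand_openness, hand_x):
    """Map normalized gesture values (0.0-1.0) to playback parameters.

    The mappings are hypothetical: hand height drives amplitude,
    hand openness drives pitch, and the horizontal position blends
    between two loaded samples.
    """
    amplitude = hand_height  # 0.0 = silent, 1.0 = full level

    # Map openness to a pitch offset in semitones (+/- one octave here),
    # then convert it to a playback-speed ratio (resampling-style pitch).
    semitones = (hand_openness - 0.5) * 24.0
    playback_speed = 2.0 ** (semitones / 12.0)

    # Equal-power crossfade between two samples based on horizontal
    # position, so the combined level stays roughly constant.
    gain_a = math.cos(hand_x * math.pi / 2.0)
    gain_b = math.sin(hand_x * math.pi / 2.0)

    return {
        "playback_speed": playback_speed,
        "gain_a": amplitude * gain_a,
        "gain_b": amplitude * gain_b,
    }

# Example: hand halfway up, slightly open, leaning towards sample B.
print(gesture_to_playback(0.5, 0.6, 0.7))
```

The equal-power crossfade is one simple way to realise the seamless blend between recordings described above, since it keeps the summed level roughly constant while moving between the two samples.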

For my project, the main goal would be to develop a piece of software that takes gesture or motion control inputs and maps them to playback parameters and effects, resulting in a virtual foley stage controlled via physical movement. For the controller input I will research the controllers available at FH Joanneum or IEM, as developing and building such a unit would go beyond the scope of this project. I still need to figure out many key factors of this project, but this seems like the direction I want to head in.
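Whichever controller turns out to be available, many motion tracking tools can emit their data as OSC messages over the network, which would decouple the controller from the audio application. As an illustration of that data path, here is a minimal receiver sketch using the python-osc package; the address /foley/hand, its argument layout and the port number are my own assumptions, not part of any existing controller's output.

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def handle_hand(address, height, openness, x):
    """Receive one gesture frame from the motion controller.

    The OSC address "/foley/hand" and its three float arguments are
    assumptions for illustration; a real controller or tracking tool
    would define its own message format.
    """
    # Stand-in for updating the sampler engine, e.g. with the
    # gesture-to-parameter mapping sketched above.
    print(f"{address}: height={height:.2f} openness={openness:.2f} x={x:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/foley/hand", handle_hand)

# Listen for gesture data on the local machine; port 9000 is arbitrary.
server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
print("listening for OSC gesture data on port 9000 ...")
server.serve_forever()
```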

The next steps involve researching a suitable development platform, as well as how to incorporate the data output of the motion tool to realise the controls in the application. It would also be necessary to determine the scope of the application and what the prototype should be able to do. As foley and sound effects cover a broad sonic landscape, concentrating on one specific type would make the goals more approachable and lead to a quicker implementation of specific targets.
