Group project with Bill Barham and Khalid Rafique.
Create a sound performance tool that uses motion detection to control different instruments and sound files. As a person walks in front of the camera they are assigned an individual instrument; as they dance and move, they control different elements of the sound.
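As a rough illustration of how the per-person assignment could work, here is a minimal Python sketch. It assumes a tracker that reports a stable integer ID for each person in frame; the instrument names and the `assign_instruments` helper are hypothetical placeholders, not the final implementation.

```python
import itertools

INSTRUMENTS = ["pads", "bass", "percussion", "lead"]  # hypothetical sample banks
instrument_cycle = itertools.cycle(INSTRUMENTS)
assignments = {}  # blob ID -> instrument name

def assign_instruments(tracked_ids):
    """Give each new blob ID the next instrument; keep existing assignments stable."""
    for blob_id in tracked_ids:
        if blob_id not in assignments:
            assignments[blob_id] = next(instrument_cycle)
    # Forget people who have left the frame so their instrument falls silent
    for blob_id in list(assignments):
        if blob_id not in tracked_ids:
            del assignments[blob_id]
    return assignments
```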
The overhead blob detection will map each blob's position to the sample location and the playback speed/pitch of the audio. Dividing the camera view into a grid of sections will let each area trigger short sound files. For the front camera we will use face detection to control other elements of the audio, experimenting with tempo, volume and reverb. To allow fluidity with an element of randomness, we will program certain triggers into the installation that activate specific functions, such as jumping to new sample locations.
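A minimal sketch of the overhead mapping, assuming the tracker reports normalised blob coordinates in the range 0–1. The grid resolution, the 0.5x–2x speed range, and the `trigger_sound` stub are illustrative assumptions rather than the final design.

```python
GRID_COLS, GRID_ROWS = 4, 3   # hypothetical grid resolution
SAMPLE_LENGTH_S = 8.0         # hypothetical sample length in seconds

def trigger_sound(cell):
    print(f"one-shot for grid cell {cell}")  # stand-in for the audio engine

def map_blob(x, y):
    """Map a normalised blob position to a playback position and speed."""
    sample_pos = x * SAMPLE_LENGTH_S  # left-right scrubs through the sample
    play_rate = 0.5 + y * 1.5         # front-back: 0.5x..2.0x (pitch follows speed)
    return sample_pos, play_rate

def grid_cell(x, y):
    """Which grid section the blob currently occupies."""
    col = min(int(x * GRID_COLS), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS), GRID_ROWS - 1)
    return col, row

last_cell = None

def update(x, y):
    """Called once per frame with the blob's normalised (x, y)."""
    global last_cell
    cell = grid_cell(x, y)
    if cell != last_cell:             # fire only when entering a new section
        trigger_sound(cell)
        last_cell = cell
    return map_blob(x, y)
```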
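For the front camera, something like OpenCV's Haar-cascade face detector could drive the other parameters. The specific mappings below (face size to volume, horizontal position to reverb mix, vertical position to tempo) are illustrative guesses at what we might experiment with, not settled choices.

```python
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_controls(frame_bgr):
    """Derive volume, reverb mix and tempo from the largest detected face."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
    frame_h, frame_w = gray.shape
    volume = min(1.0, w / (frame_w * 0.5))              # closer face -> louder
    reverb_mix = (x + w / 2) / frame_w                  # left..right -> dry..wet
    tempo_bpm = 80 + (1 - (y + h / 2) / frame_h) * 80   # higher face -> faster
    return volume, reverb_mix, tempo_bpm
```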
We hope that people without any specific notion of what the different tracking areas do will have an intuitive feel for the instrument. We aim to elicit play, encouraging participation and interaction between the users and the instrument, and between the users themselves.
The final piece won’t include the videos beside the camera; they are shown here only to illustrate what’s happening.