Interactive session with Don Sinclair
A recent session with Don Sinclair at the York University Digital Media lab, trying out an interface for interactive performance. We were wearing hats, since the winter deep freeze was keeping that large space chilly.
Don had made a Max patch to collect data from a Myo armband controller (https://www.thalmic.com/en/myo/) so that, as movements are made along the yaw, pitch, and roll axes, three associated sound files can be mixed. With a band on each arm, it would then be possible to mix six files.
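Not the Max patch itself, but a minimal Python sketch of the kind of mapping it performs, assuming Euler angles in radians in the range [-π, π] (the actual ranges and scaling in Don's patch may differ):

```python
import math

def angle_to_gain(angle, lo=-math.pi, hi=math.pi):
    """Linearly map an Euler angle (radians) to a 0..1 playback gain."""
    return min(1.0, max(0.0, (angle - lo) / (hi - lo)))

def mix_gains(yaw, pitch, roll):
    """One gain per sound file: yaw, pitch, and roll each drive one file."""
    return [angle_to_gain(a) for a in (yaw, pitch, roll)]
```

With a band on each arm, the same mapping applied to a second stream of angles would give the second set of three gains, for six files in total.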
I had brought several sounds to try with the system. Shorter, more abstract electroacoustic sounds were unsatisfying in terms of gesture, feeling choppy. Two sets of sounds worked well together: in each case, one was a track of environmental sounds accompanied by improvised vocalizations, an approach influenced by Viv Corringham's Shadow Walks as well as the vocal work of Kathy Kennedy and Kok Siew Wai. This vocal-environmental track was effective when mixed with complex, active environmental recordings such as ice in a river at spring breakup or a spring field with abundant wildlife.
We developed performance strategies by getting feedback from two projections. One was a ball with yaw, pitch, and roll mapped to x, y, and z, so the sonic effect of gestures could be understood intuitively by watching the ball move. The second showed the values of yaw, pitch, and roll as dynamic vertical bar graphs, one per stream of data.
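The ball projection can be sketched in the same spirit: each angle is normalized and drives one spatial axis. This is an illustrative guess at the mapping, assuming radians and an arbitrary screen scale, not the actual patch:

```python
import math

def orientation_to_position(yaw, pitch, roll, scale=100.0):
    """Map each Euler angle (radians) to one axis of the ball's position."""
    norm = lambda a: a / math.pi  # roughly -1..1 over a full rotation
    return (norm(yaw) * scale, norm(pitch) * scale, norm(roll) * scale)
```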
We decided that files in the range of 30-40 seconds are most effective gesturally given our aims, so we will think about combinations for next time. It also seems that this setup will work well for vocal performance, including spoken word and more abstract vocalizations. We plan to practise further next time, perhaps with two armbands, to articulate the range of motion more clearly and to work with different sets of sounds.