Pamela Z is a composer/performer and media artist who works primarily with voice, live electronic processing, sampled sound, and video. A pioneer of live digital looping techniques, she processes her voice in real-time to create dense, complex sonic layers. Her solo works combine experimental extended vocal techniques, operatic bel canto, found objects, text, digital processing, and wireless MIDI controllers that allow her to manipulate sound with physical gestures. In addition to her solo work, she has been commissioned to compose scores for dance, theatre, film, and chamber ensembles including Kronos Quartet, Eighth Blackbird, the Bang on a Can All-Stars, Ethel, and San Francisco Contemporary Music Players. Her interdisciplinary performance works have been presented at venues including The Kitchen (NY), Yerba Buena Center for the Arts (SF), REDCAT (LA), and MCA (Chicago), and her installations have been presented at such exhibition spaces as the Whitney (NY), the Diözesanmuseum (Cologne), and the Krannert (IL). Pamela Z has toured extensively throughout the US, Europe, and Japan. She’s a recipient of numerous awards including the Rome Prize, United States Artists, a Robert Rauschenberg Foundation residency, the Guggenheim, the Doris Duke Artist Impact Award, Herb Alpert Award, an Ars Electronica honourable mention, and the NEA Japan/US Friendship Commission Fellowship. She holds a music degree from the University of Colorado, Boulder.
I recently got my hands on a Quest 2 VR headset and find that hand-tracking techniques similar to those Pamela Z uses could be applied here, along with a more flexible way of interacting with objects in virtual or augmented reality. I believe the technology is more versatile than the controllers Pamela uses, because the headset's cameras can track individual fingers and therefore recognise gestures.
Perhaps it is possible to assign a 'track'/instrument to each finger, which could then be selected by touching that finger and thumb together.
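The selection logic for this idea could be sketched roughly as follows. This is an illustrative sketch only, not Oculus/Meta SDK code: the track names, the `select_track` function, and the pinch-strength threshold are all hypothetical, standing in for whatever values the headset's hand-tracking API would supply.

```python
# Hypothetical sketch: map each finger to a 'track'/instrument and
# select a track when that finger pinches against the thumb.
# The track names and threshold below are made up for illustration.

FINGER_TRACKS = {
    "index": "vocal loop",
    "middle": "granular texture",
    "ring": "bass drone",
    "pinky": "percussion",
}

PINCH_THRESHOLD = 0.8  # normalised pinch strength at which a pinch registers


def select_track(pinch_strengths):
    """Return the track mapped to the most strongly pinching finger,
    or None if no finger crosses the threshold."""
    finger, strength = max(pinch_strengths.items(), key=lambda kv: kv[1])
    if strength >= PINCH_THRESHOLD:
        return FINGER_TRACKS[finger]
    return None


# Example: a firm index-finger pinch selects the track assigned to that finger.
print(select_track({"index": 0.95, "middle": 0.1, "ring": 0.0, "pinky": 0.2}))
```

In a real Quest 2 implementation the per-finger pinch strengths would come from the hand-tracking API each frame rather than from a hard-coded dictionary.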
I have enough experience in Unity to experiment with this in terms of spatialisation: creating simple objects that emit sound and being able to move them around in a space.
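The core of that spatialisation idea is distance-based attenuation: a sound-emitting object gets quieter as the listener moves away from it. A minimal sketch of one common approach, an inverse-distance rolloff similar in spirit to Unity's logarithmic rolloff curve (the function name and `min_distance` default here are my own, not Unity's API):

```python
import math


def inverse_distance_gain(listener, source, min_distance=1.0):
    """Simple inverse-distance rolloff for a sound-emitting object:
    full volume within min_distance of the source, then gain falls
    off as 1/distance beyond it."""
    d = math.dist(listener, source)  # Euclidean distance between 3D points
    return 1.0 if d <= min_distance else min_distance / d
```

For example, a listener two units from a source with the default `min_distance` would hear it at half gain; in Unity itself this curve is normally configured on the `AudioSource` component rather than computed by hand.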
I have recently been trying to learn Max/MSP, and have come across a package that works with VR (created by Graham Wakefield). I will experiment with this.