Controllerism

  • Using phones, tablets, and computers to manipulate recorded elements in DAWs 
  • Investigating the hardware potential of iOS and Android devices for sound manipulation, e.g. accelerometer, gyroscope 
  • Apps, such as LK and TouchOSC, for controlling Ableton (iOS and Android) 
  • Mapping phones to computer DAWs. Routing iOS keyboards into desktop synths (expressionPad MIDI/Synth – freeware) 
  • Creating custom layouts for iOS and Android (e.g. TouchOSC). Using the iPhone/Android gyroscope to generate MIDI data (see the sketch below this list)
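
To make that last point concrete, here is a minimal sketch of getting phone motion data into a DAW, assuming the python-osc and mido libraries and an app such as TouchOSC sending gyroscope values as OSC. The /gyro/x address, port 8000, and CC 20 are placeholders rather than anything fixed by a particular app.

```python
# Minimal sketch: phone gyroscope data (sent as OSC from an app such as
# TouchOSC) rescaled to a MIDI CC that a DAW like Live can then MIDI-map
# to any parameter. Assumes: pip install python-osc mido python-rtmidi
# The OSC address "/gyro/x" and CC 20 are placeholders, not app defaults.

import mido
from pythonosc import dispatcher, osc_server

midi_out = mido.open_output()  # default MIDI output port

def gyro_to_cc(address, *values):
    # Incoming value assumed to be roughly -1.0 .. 1.0; rescale to 0..127.
    value = values[0]
    cc_value = max(0, min(127, int((value + 1.0) * 63.5)))
    midi_out.send(mido.Message('control_change', control=20, value=cc_value))

disp = dispatcher.Dispatcher()
disp.map("/gyro/x", gyro_to_cc)

# Listen on all interfaces, port 8000 (whatever the phone app sends to).
server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 8000), disp)
server.serve_forever()
```

In Live, the incoming CC can then be MIDI-mapped to a parameter exactly as if it came from a hardware knob.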

Today we looked at controllerism, focusing on Ableton Live’s MIDI mapping.

Since getting an Ableton Push 2, I seldom find myself using Live without it. I enjoy the Simpler controls for sampling, as well as the grid layout, which makes it very accessible for someone like me who doesn’t play keys and would otherwise get stuck in the same few scales.

The Push 2 has polyphonic aftertouch (pressure per pad) but does not have full MPE features such as slide. The Wavetable synth inside Ableton is the best device for exploring this, as it fully supports MPE: you can change the wavetable position of each note you’re playing depending on the pressure of each finger, allowing for much finer control over sounds.
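
To illustrate the MPE idea itself (a hedged sketch, not how Push or Wavetable are actually implemented), the snippet below assumes the mido library and a synth listening in MPE mode: each note goes out on its own channel, so channel pressure can move something like wavetable position for that note alone.

```python
# Sketch of the MPE idea: each note gets its own MIDI channel, so per-channel
# pressure (aftertouch) only affects that note - e.g. its wavetable position.
# Assumes the mido library and a receiving synth set to MPE mode.
import time
import mido

out = mido.open_output()  # default MIDI output port

notes = [60, 64, 67]      # a C major triad, one channel per note
channels = [1, 2, 3]      # MPE member channels (channel 0 acts as manager)

for note, ch in zip(notes, channels):
    out.send(mido.Message('note_on', note=note, velocity=90, channel=ch))

# Press harder on just the middle note: only channel 2's pressure changes,
# so only that note's wavetable position would move on an MPE synth.
for pressure in range(0, 128, 16):
    out.send(mido.Message('aftertouch', value=pressure, channel=2))
    time.sleep(0.05)

for note, ch in zip(notes, channels):
    out.send(mido.Message('note_off', note=note, channel=ch))
```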

Instead of using the MIDI Mapping feature, I often just make device racks that automatically come up on the Push, allowing me to make my own instruments and presets.

Earlier in the year, I experimented with a VR headset and found it an exciting field to explore because of the spatialisation of sound and the possibilities for interactivity and controllerism. Using Unity, I created objects that emitted sound and that I could physically interact with, moving them closer or further away. Some of these objects had physics and would fall to the ground, while others froze in space when released. I would like to return to this in the future, looking at how techniques made accessible through VR could be used artistically rather than as a novelty.

My laptop also has a touchscreen, something I rarely use but did once while experimenting with a Max patch (as seen below). For my previous project, I mapped a foot pedal to Max so that I could control the length of the loops while interacting with a microphone, allowing me to be away from the computer itself.

https://photos.app.goo.gl/9GRgPLZzrnmLe1Er6
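
The actual setup was a Max patch, but here is a reduced Python sketch of the same idea, assuming the mido library; the CC number 64 and the 0.25 to 8 second loop range are placeholders for whatever the pedal and patch really used.

```python
# Reduced sketch of the foot-pedal idea (the real version was a Max patch):
# read a MIDI CC from a pedal and rescale it to a loop length in seconds.
# Assumes the mido library; CC 64 and the 0.25-8 s range are placeholders.
import mido

MIN_LEN, MAX_LEN = 0.25, 8.0   # loop length range in seconds (arbitrary)

def cc_to_loop_length(cc_value):
    """Map a 0-127 CC value onto the loop-length range."""
    return MIN_LEN + (cc_value / 127.0) * (MAX_LEN - MIN_LEN)

with mido.open_input() as pedal:   # default MIDI input port
    for msg in pedal:
        if msg.type == 'control_change' and msg.control == 64:
            loop_length = cc_to_loop_length(msg.value)
            print(f"loop length set to {loop_length:.2f} s")
```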

I have a small groovebox/sampler/MIDI sequencer/controller called the OP-Z, which has a built-in gyroscope for modulation, yet I seldom use the feature. I enjoy using the device as a sequencer and a tool for making music; however, I still have mixed feelings about it as a whole, as for the same price I could have bought a hardware synth that would work alongside Ableton and Max.

At the moment, I find it unlikely that I will use my phone as a controller.

For my hand-in, I aim to create a generative piece, exploring what happens when there is no interaction on my part other than pressing play/record.
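
As a hedged sketch of one possible starting point (not the piece itself), the snippet below assumes the mido library and simply draws random notes from a fixed scale with random velocity and spacing, so the only interaction is starting and stopping the script; the scale, ranges, and output routing are all arbitrary choices.

```python
# One possible starting point for a generative piece: random notes drawn from
# a fixed scale, with randomised velocity and spacing, so the only interaction
# is starting the script. Assumes the mido library; everything else is arbitrary.
import random
import time
import mido

SCALE = [60, 62, 63, 65, 67, 68, 70, 72]   # C natural minor, one octave

out = mido.open_output()  # default MIDI output port (e.g. routed into Live)

try:
    while True:
        note = random.choice(SCALE)
        velocity = random.randint(50, 110)
        length = random.uniform(0.2, 1.5)    # seconds the note is held
        gap = random.uniform(0.1, 2.0)       # silence before the next note

        out.send(mido.Message('note_on', note=note, velocity=velocity))
        time.sleep(length)
        out.send(mido.Message('note_off', note=note))
        time.sleep(gap)
except KeyboardInterrupt:
    out.reset()   # silence any hanging notes when stopping
```

Routed into a Live instrument, something like this could be recorded straight into a clip with no further input from me.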

Homework:

1. Develop one of the pieces begun in today’s session to be played in next week’s session.
