Hand
Hand is a structured-improvisational piece using machine learning, generative algorithms, and live performance. It was created as part of a university group project by four members: myself, Sam R, Jamie, and Mati.
About the project
Our primary artistic goal was to explore the possibilities of combining AI and live performance. We decided to centre the piece on the theme of hands and the ways they have historically been used in relation to music. Hands are a universal symbol of creativity and expression, used to create, destroy, and communicate, so there are plenty of ways to interpret the theme.
This is the final post in a series about this project. Here I’ll be discussing my final contributions and changes, and a reflection on the overall outcome. For important context, and for the methods and key decisions made throughout the process, read the previous posts first.
Listen
You can listen to the final performance of Hand here. Although it is a visual piece, to protect the privacy of my group members I have chosen to include only the audio. (Examiners will have a copy of the video, of course.) The piece was performed live on the 13th of November, 2024.
Final changes
I wanted my part to be more performative. I was leading the piece visually with the on-screen text box and AI letter detector, but I wanted to contribute more to the actual music than just a plain drum loop.
So, I added some effects to the drums: a low-pass filter, reverb, and distortion. I controlled these parameters with the knobs on a MIDI controller; in slower sections, for example, I cut the high-frequency content and increased the reverb. This made the drums more dynamic and interesting to listen to.
The Ableton track looked like this. You can see that I prepared five knobs to be mapped to MIDI controls, using Macro Controls on the Instrument Rack.
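For the curious, here’s a minimal sketch of what that knob-to-parameter mapping amounts to in code. I used Ableton’s built-in MIDI mapping rather than writing anything like this myself, so treat it purely as an illustration: the mido library is a stand-in, and the CC numbers and parameter ranges are assumptions, not the actual values from my setup.

```python
# Illustrative sketch only: map MIDI controller knobs (CC messages) onto
# effect parameters, the way Ableton's MIDI mapping did in the real setup.
import mido

# CC numbers assigned to each knob on the controller (assumed values)
CUTOFF_CC = 21
REVERB_CC = 22
DISTORTION_CC = 23

def scale(value, lo, hi):
    """Map a 0-127 MIDI value onto the range [lo, hi]."""
    return lo + (value / 127.0) * (hi - lo)

with mido.open_input() as port:  # open the default MIDI input port
    for msg in port:
        if msg.type != 'control_change':
            continue
        if msg.control == CUTOFF_CC:
            cutoff_hz = scale(msg.value, 100, 18000)   # low-pass cutoff
            print(f'cutoff -> {cutoff_hz:.0f} Hz')
        elif msg.control == REVERB_CC:
            wet = scale(msg.value, 0.0, 1.0)           # reverb wet/dry
            print(f'reverb wet -> {wet:.2f}')
        elif msg.control == DISTORTION_CC:
            drive = scale(msg.value, 0.0, 1.0)         # distortion drive
            print(f'drive -> {drive:.2f}')
```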
See inside the Max patch:
Breakdown of the piece
Here’s a breakdown of the structure, with timestamps:
- Introduction (0:00): Mati introduced the piece by playing a melody on a synthesiser whose parameters were controlled by his hand via HandPose-OSC (a rough sketch of this kind of OSC control follows the breakdown). Around 45 seconds in, I added drums, and the others slowly added their parts too.
- Give me your HAND (1:20): I typed the phrase “Give me your…” into a text box, which was sent to Sam R’s program to be converted into notes. Then, in my AI letter detector’s interface, I drew ‘H’ very slowly and meticulously.
Every second, the model sent back what it thought I had drawn, and I kept going until it got it right. When it was finally correct, everybody slowed down what they were doing for a moment. I cleared the screen and continued with the next letter while everyone sped up again. This process repeated until the phrase ‘HAND’ was complete (a simplified sketch of this loop follows the screenshot below).
While this was all happening, the drum loop was being generated based on what the AI was seeing, and my mouse movements were triggering notes on the Guidonian Hand. Look, it’s me!
Screenshot taken from 3:24 of the recorded piece.
Left: the AI letter detector’s interface. Right: the Guidonian Hand.
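To make the pacing of that section concrete, here’s a simplified sketch of the loop I was performing: poll the model once a second, and hold on each letter until it’s recognised. The classify_canvas function here is a placeholder I’ve invented for this post; the real version ran my drawing through the trained letter-detection model.

```python
# Simplified sketch of the 'HAND' section: poll the classifier once a
# second and stay on each letter until the model agrees with me.
import random
import time

def classify_canvas():
    # Placeholder: the real version classified the on-screen drawing with
    # the trained model. Here it just guesses, to make the loop runnable.
    return random.choice('ABCDEFGHIJKLMNOPQRSTUVWXYZ')

target = 'HAND'
for letter in target:
    while True:
        guess = classify_canvas()          # the model reports every second
        print(f'drawing {letter!r}, model sees {guess!r}')
        if guess == letter:
            print(f'{letter!r} recognised - clearing the screen')
            break                          # move on to the next letter
        time.sleep(1)
print('phrase complete:', target)
```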
- Interlude (5:15): After that main section was complete, we returned to a quiet section where Mati again played a melody on the synthesiser. This time, though, I was controlling the drums.
- I put my hand on my heart <3 (6:10): While Mati continued playing, I returned to the text box to subtly modify the sounds that Sam R’s program was making. I sent “I”, “put”, “my”, “hand”, “on”, “my”, “heart<33”, then finally “hand” over and over.
- Build-up (8:25): Sam R updated his program to speak the messages I sent through a text-to-speech program. All four of us began to play our parts as loud and fast as we could.
In this final build-up to the climax, I removed the filter and reverb from the drums and slowly increased the tempo.
Screenshot taken from 8:53 of the recorded piece.
From left to right: Jamie, Sam R and Mati playing their parts during the build-up.
- Climax (9:10): When the drums couldn’t go any faster, everybody quickly stopped playing their parts and I slowly brought the filter and reverb back in to ‘fade away’ the drums. In the end, all that was playing was the text-to-speech program saying “hand”, and a single note on a piano.
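Finally, since HandPose-OSC drove the synthesiser in the introduction, here’s a rough sketch of how that kind of control works: hand-tracking data arrives as OSC messages, which you map onto a synth parameter. I’m using the python-osc library here, and the address ‘/hand/y’, the port, and the 0–1 value range are assumptions for illustration, not HandPose-OSC’s actual message layout.

```python
# Rough sketch: receive hand-tracking data over OSC and map it onto a
# synth parameter (here, a filter cutoff). Address and port are assumed.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_hand_y(address, y):
    # Scale the normalised hand height (assumed 0-1) onto a cutoff in Hz
    cutoff_hz = 100 + y * 10000
    print(f'{address}: y={y:.2f} -> cutoff {cutoff_hz:.0f} Hz')

dispatcher = Dispatcher()
dispatcher.map('/hand/y', on_hand_y)  # assumed OSC address

server = BlockingOSCUDPServer(('127.0.0.1', 9000), dispatcher)
server.serve_forever()  # listen for incoming hand positions
```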
Reflection
I felt a little uneasy about the group project at first because I was unsure how we would all fit together. In the end, I particularly like how much Sam R’s part and mine interacted with each other, and how we were all able to improvise while still following a structure that I led on the screen. I think the piece succeeded in delivering what we set out to do, and I’m happy with it.
That being said, I definitely could have done more with the drums. If I were to do another project like this, I would instead focus entirely on working with synthesisers and effects, and let someone else take the lead on the visuals.
As always, thanks for reading.