Is code synesthetic? On Quayola’s Transient by Marcus du Sautoy
Is code synesthetic? It is a question that emerged for me after experiencing Quayola’s new work Transient. The idea behind the piece is to exploit the fluidity that code provides between three worlds: the visual canvas, the sonic realm of the piano and the functional manipulation of data. Encountering the piece, one discovers multiple ways for the human body to experience the dynamics of the algorithms being manipulated: the digital data can be both seen and heard as it is played.
Our embodiment means that we are engaging with these two worlds via different parts of our bodies: our eyes and ears. And these inputs are similarly being processed in different parts of the brain. But what is curious to me is that the source of both these outputs is a world of numbers. It raises the question of whether an AI, encountering only the numbers, could tell if the output is meant to be heard or seen.
Several composers have talked about experiencing synaesthesia, a neurological condition in which stimulation of one sense triggers sensations in another. For example, the French composer Olivier Messiaen would experience a rush of different colours connected with particular combinations of notes. He called these colour chords. Combinations of chords would create a rainbow effect or “stained glass window”. His chromatic chord, consisting of all 12 notes of the chromatic scale, he described as follows: “two overlapping areas: below, a white diamond with glints of light blue, and a purple moon at the top; the four additional notes add a thin brown leather band, degrading to white.”
It seems that the way the sonic world is encoded in Messiaen’s brain shares something in common with his visual encoding of colour, so much so that the one triggers the sensation of the other. Given that in the digital world we turn both our pictures and our music into 0s and 1s, might the same question arise for code? By looking at digital data alone, can you tell whether it is a picture or music?
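The question can be made concrete with a small sketch. The numbers below are illustrative, not taken from the piece: the point is simply that the very same stream of bytes can be read, with equal legitimacy, as a picture or as a fragment of sound.

```python
import random

# A stream of digital data: just numbers, with no inherent modality.
random.seed(0)
data = [random.randrange(256) for _ in range(64 * 64)]

# Read as an image: 64 rows of 64 greyscale pixel intensities.
image = [data[r * 64:(r + 1) * 64] for r in range(64)]

# Read as sound: the same bytes rescaled to the [-1.0, 1.0] range
# of audio samples a sound card expects.
audio = [b / 127.5 - 1.0 for b in data]
```

Nothing in `data` itself announces which interpretation is the intended one; that meaning lives entirely in the code that renders it.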
What Quayola Studio have successfully achieved is the development of custom software that allows them to control both images and sound in real time, as if the software were creating a synesthetic code. It is the same code that drives both the visual output of the performance and the music coming from the pianos.
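One way to picture such a “synesthetic code” is a single control signal feeding two renderers at once. This is only a sketch of the general idea, not Quayola’s actual software: the mappings to brightness and to MIDI pitch are my own illustrative choices.

```python
import math

# One stream of numbers produced by an algorithm, normalised to [0, 1].
signal = [(math.sin(2 * math.pi * i / 50) + 1) / 2 for i in range(200)]

def to_brightness(x):
    """Visual channel: map the value to an 8-bit greyscale level."""
    return round(x * 255)

def to_midi_note(x):
    """Sonic channel: map the same value onto a piano key (MIDI 21-108)."""
    return 21 + round(x * 87)

# The same data, rendered twice: once for the eye, once for the ear.
frames = [to_brightness(x) for x in signal]
notes = [to_midi_note(x) for x in signal]
```

The eye and the ear receive different surfaces, but both trace the same underlying curve.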
One fascinating question that this raises is how each sensory realm provides access to the abstract world that it is interpreting. Do visuals provide a better insight into how the algorithms are behaving? Or is a two-dimensional canvas not as rich as the multi-dimensional universe of sound to capture the complexities at play?
But the question sparks an exciting opportunity that Transient is exploring. Because a common language of mathematics underpins both music and image, one has the intriguing prospect of using that language as a way for sound and sight to communicate with each other, learn from each other, and stimulate new ideas. Machine learning algorithms that create new visual art are most often trained on visual data. But what if you trained the AI on music and asked it to produce visuals from that learning?
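Even before any learning happens, the translation from the sonic to the visual domain is mathematically routine. A toy sketch, with illustrative numbers of my own choosing: a short tone is sliced into windows, and each window’s frequency content becomes one column of a spectrogram, which is exactly the kind of picture-like data an AI could be trained on.

```python
import cmath
import math

# A one-second, 16 Hz tone at a (toy) sample rate of 256 samples/second.
SAMPLE_RATE = 256
samples = [math.sin(2 * math.pi * 16 * n / SAMPLE_RATE)
           for n in range(SAMPLE_RATE)]

def dft_magnitudes(window):
    """Naive discrete Fourier transform: magnitude of each frequency bin."""
    N = len(window)
    return [abs(sum(window[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N)))
            for k in range(N // 2)]

# Slice the sound into windows; each becomes one row of intensities.
WINDOW = 32
spectrogram = [dft_magnitudes(samples[i:i + WINDOW])
               for i in range(0, len(samples), WINDOW)]
# spectrogram is now an 8 x 16 grid of numbers: music rendered as image.
```

With the tone at 16 Hz and 32-sample windows, the energy lands in frequency bin 2 of every window, so the resulting “image” shows one bright horizontal stripe.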
What we are seeing and hearing in Transient is also the opportunity that code provides to democratise art. Neither performer is a concert pianist. But they can manipulate and transform algorithms and data like digital maestros. They are playing the numbers. By translating this into the robotic playing of the pianos, they are able to perform on the instruments via the medium in which they have expertise.
“Algorithms today can be considered fundamental components of society,” write Quayola about their work. But these algorithms are becoming so complex, ever-changing and mutating through machine learning that we are losing control of how they are making their decisions. By asking the algorithm to create art, both musical and visual, the output gives us a chance to experience the hidden life of the code. As the psychologist Carl Rogers articulated, human creativity was probably always meant to be our best tool for exploring the mysteries of our own inner worlds, our consciousness and those of others.
There is increasing evidence that allowing code to express itself through painting or music might similarly give us a chance to better understand the inner nature of how these complex algorithms are working. Through this collaboration between human and machine, Transient allows an audience to access the abstract world of the algorithm via the tangible world of sound and vision.