This project disrupts the conventional use of TouchDesigner, a node-based development platform widely used for video and signal manipulation, by transforming it into an interactive performance tool that generates both sound and video. Users engage with it organically, without predefined contexts, breaking down barriers in the exploration of sound and visuals.
TouchDesigner is node-based software in which each node has a role. Nodes have parameters that can be manipulated mathematically, and connected with other nodes they execute logic. TouchDesigner networks are a little society of folders and boxes, living algorithms supporting each other's existence. Because they are presented as visual elements rather than as a programming language, we also perceive the system's structure visually.
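For readers unfamiliar with the software: every node (operator) exposes parameters that can be set by hand, by expressions or from scripts. As a minimal, hedged sketch of that idea in TouchDesigner's built-in Python (the operator name 'constant1' and the value are placeholders, not part of this project):

    # Inside TouchDesigner, nodes are addressed with op() and expose
    # their parameters through .par; the name 'constant1' is a placeholder.
    const = op('constant1')              # e.g. a Constant CHOP in the network
    const.par.value0 = 0.5               # set a parameter from a script
    print(const.par.value0.eval())       # read the current value back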
While we care about the final execution of the ‘image’ or the mechanical ‘logic’, we treat the raw nodes as a ‘backend’, the part that is not supposed to be shown. However, without the nodes, the work wouldn’t exist.
What if these nodes themselves are as important as the outcome you present in the end? How would that transform the way you compose with the tool?
APPROACH:
After a brief introduction to TouchDesigner and the announcement of the assignment, the objective was to choose a node in TouchDesigner that sparked our interest so it could be investigated further. The idea was to deepen the research and extend it beyond the boundaries of the program itself. Is there anything that provokes curiosity to explore it further than the program's network? Are there connections to other topics such as history, biology, philosophy, science, architecture or engineering?
I’ve collected my thought stream and inspiration process digitally on the platform Arena.
Inspirations were: the cycle of the sun rising and setting every day, water drops that fall as soon as the droplet becomes too heavy for its surface structure to contain it, flowers that bloom and close, or, in a deeper scientific context, sonic frequencies. Everyday examples are things like traffic lights, our metabolism or oil pumps.
The node I chose in TouchDesigner was the “LFO” node.
Since for me most areas of use in TouchDesigner are tied to visual outcomes and video manipulation, I thought about using it for something it is not usually bound to: sound, or sound creation.
It reminded me of circuit diagrams that are used when building synthesizers.
I investigated the “LFO” node, short for “Low Frequency Oscillator”: a process in which a signal alternates between its starting point and its destined value and thus creates motion. This process has been happening everywhere since the creation of mankind, which was an interesting starting point for me. It occurs naturally and without the influence of humans, but is also widely used in systems that we are bound to in our everyday life.
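To make the term concrete: a low frequency oscillator is simply a periodic function, usually below the audible range, that sweeps a value back and forth between two bounds. A minimal standalone Python sketch of that behaviour (outside TouchDesigner; the frequency and range here are arbitrary placeholders):

    import math

    def lfo(t, frequency=0.5, low=0.0, high=1.0):
        # sine-based low frequency oscillator: sweeps smoothly between
        # `low` and `high` at `frequency` Hz, like the LFO node does over time
        phase = math.sin(2 * math.pi * frequency * t)      # -1 .. 1
        return low + (phase + 1) / 2 * (high - low)        # low .. high

    # sample one full 0.5 Hz cycle (two seconds) in steps of 0.1 s
    print([round(lfo(i * 0.1), 3) for i in range(21)])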
My first approach was to take a moving graphic image (GIF) and alter it in such a way that it becomes a sound-controlled installation.
After investigating the node I created a sketch that starts with four oscillators, which are filtered and altered so that they create an alternating frequency with dynamics that change its visual and sonic state in an interesting way.
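The exact network is not reproduced here, but the rough signal chain can be sketched in plain Python with NumPy (outside TouchDesigner; the oscillator frequencies, the 0.5 Hz modulation rate and the moving-average "filter" are placeholder assumptions):

    import numpy as np

    SR = 44100                          # audio sample rate
    t = np.arange(SR * 2) / SR          # two seconds of sample times

    # four oscillators at placeholder frequencies, summed into one signal
    freqs = [110.0, 165.0, 220.0, 330.0]
    mix = sum(np.sin(2 * np.pi * f * t) for f in freqs) / len(freqs)

    # a slow LFO modulates the amplitude, creating the alternating dynamic
    # that is both heard and seen
    lfo = (np.sin(2 * np.pi * 0.5 * t) + 1) / 2
    modulated = mix * lfo

    # crude low-pass "filter": a moving average that rounds off the waveshape
    kernel = np.ones(64) / 64
    filtered = np.convolve(modulated, kernel, mode="same")

    print(filtered[:8])                 # first few samples of the result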
The idea behind it was that I wanted to motivate the user interacting with the installation to just try things out, to move knobs and faders without understanding the deeper sense of the sonic modifications, having only a basic idea of the haptic interaction. Sound and music production are often tied to deep knowledge of the subject and can be quite intimidating for people who have never been into the topic. The idea was to break free of this thought, so I mapped a MIDI controller to internal parameters that trigger different settings inside TouchDesigner and let you control the visual outcome while still altering the sound.
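In TouchDesigner such a mapping typically goes through a MIDI In CHOP whose channels drive other parameters. A small, hedged sketch of that idea in TouchDesigner Python (the operator names 'midiin1' and 'lfo1', the controller channel, the normalized 0-1 value range and the target parameter are assumptions, not the exact network):

    # Runs inside TouchDesigner, e.g. from a CHOP Execute DAT callback.
    # All names, channels and ranges below are placeholders.
    def map_knob_to_frequency():
        # a MIDI In CHOP exposes controllers as channels such as 'ch1ctrl1';
        # the CHOP is assumed here to output normalized 0-1 values
        knob = op('midiin1')['ch1ctrl1'].eval()
        # scale the knob into a 0.1-10 Hz range for the LFO's frequency
        op('lfo1').par.frequency = 0.1 + knob * 9.9

    map_knob_to_frequency()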
After a consultation with So-Yun we came to the conclusion that, in the end, I had used TouchDesigner against the purpose I originally intended it for, which was video manipulation. I dug deeper into the LFO node and found that its own visual interpretation of the frequencies is actually quite aesthetic in itself, even though it is merely a visual representation of the signal flow. After this realization I decided to compose a sketch that visualizes the sonic modifications by displaying merged waveshapes, and found that it created beautiful, aesthetically rich outcomes and related more closely to my original idea. It took a couple of iterations to get to my final outcome.
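As a standalone illustration of what "merged waveshapes" means here (plain Python with Matplotlib, not the TouchDesigner network itself; the frequencies and amplitudes are arbitrary):

    import numpy as np
    import matplotlib.pyplot as plt

    t = np.linspace(0, 1, 2000)

    # two oscillators plus a slow LFO, merged into a single waveshape
    wave_a = np.sin(2 * np.pi * 5 * t)
    wave_b = 0.5 * np.sin(2 * np.pi * 13 * t)
    lfo = 0.3 * np.sin(2 * np.pi * 1 * t)
    merged = (wave_a + wave_b) * (0.7 + lfo)

    plt.plot(t, merged, linewidth=1)
    plt.title("Merged waveshapes")
    plt.xlabel("time (s)")
    plt.ylabel("amplitude")
    plt.show()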