Exploring the perception of visual and sonic phasing through the dynamics of a waterwheel.
APPROACH: Phasing is an audio effect that occurs when two identical audio signals are played simultaneously, but one of them is slightly delayed or offset in time. As these signals play together, the phase difference creates a shifting, oscillating sound. The frequencies of the signals interfere constructively and destructively, causing the perceived sound to move or "phase" over time. Phasing is commonly used in music production and electronic music to create dynamic and evolving sound textures.
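As a minimal sketch of this interference (using a plain test tone, not the project's own sound material), summing two copies of the same signal where one runs slightly faster shows the slow swelling and cancelling that is heard as phasing:

import numpy as np

sr = 44100                                # sample rate in Hz
t = np.arange(4 * sr) / sr                # four seconds of time stamps
tone = np.sin(2 * np.pi * 440.0 * t)      # a 440 Hz tone as stand-in material
drift = np.sin(2 * np.pi * 440.5 * t)     # the same tone running 0.5 Hz faster
mix = tone + drift                        # the two copies interfere as they drift apart
# the amplitude of `mix` swells and cancels once every 1 / 0.5 = 2 seconds,
# which is the slow, sweeping movement perceived as phasing

The smaller the offset between the two copies, the slower the cycle of cancellation and reinforcement.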
This effect was first discovered by Terry Riley through his experiments with tape loops and delays, and was later explored further by Steve Reich, a famous pioneer of sound. In his works Piano Phase and Pendulum Music you can hear how he makes use of the effect.
In this project I tried to visualize the sonic effect and translate it into the visual realm. In sound design, this effect can create really interesting textures and is a useful tool for fragmenting sound samples.
Below you can find a two-dimensional visual representation of visual phasing. All cones circle around the square in the middle, each with a small offset in speed. The cones eventually cross each other and, at some point depending on the phase offsets, all return to their starting points. The time after which this phase cycle completes can be calculated.
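As a rough sketch of that calculation (with assumed speeds rather than the values used in the animation above): if the rotation speeds are rational multiples of each other, every cone is back at its starting point once each has completed a whole number of revolutions, which first happens at the least common multiple of their periods.

from fractions import Fraction
from functools import reduce
from math import gcd, lcm

def full_cycle_time(revs_per_second):
    # time until every cone is back at its starting angle at the same moment
    periods = [1 / Fraction(s) for s in revs_per_second]
    # lcm of fractions: lcm of the numerators over gcd of the denominators
    num = reduce(lcm, (p.numerator for p in periods))
    den = reduce(gcd, (p.denominator for p in periods))
    return Fraction(num, den)

# assumed speeds: a base cone plus two that run 5% and 10% faster
# (speeds are given as strings so Fraction keeps them exact)
print(full_cycle_time(["1", "1.05", "1.1"]))   # -> 20 seconds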
Combining the visual phasing effect with the audible effect creates a really interesting experience, which unfortunately does not come across as well in a video due to the limitations of stereophonic two-channel audio.
Inspired by the two-dimensional sketch shown above, I designed a three-dimensional scene in Unreal Engine with VR headset support, which lets you experience the phasing effect in real time. Each orb in the scene is tied to a sound source that is slightly pitched up. While the audio slowly drifts out of phase, the same happens visually with the orbs.
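A rough sketch of that coupling, with an assumed orb count and detune step rather than the project's actual values: each orb gets one speed ratio that drives both its orbit and the pitch of its sound source (in Unreal this ratio would feed the audio component's pitch multiplier), so the visual and audible phase drift stay locked together.

NUM_ORBS = 8              # assumed count, not taken from the scene
BASE_REVS_PER_SEC = 0.1   # assumed base orbit speed
DETUNE_STEP = 0.01        # each orb runs 1% faster than the previous one (assumed)

orbs = []
for i in range(NUM_ORBS):
    ratio = 1.0 + i * DETUNE_STEP
    orbs.append({
        "angular_speed": BASE_REVS_PER_SEC * ratio,  # drives the orbit in the scene
        "pitch_ratio": ratio,                        # applied to the orb's sound source
    })

def orb_phase(i, seconds):
    # phase of orb i at a given time, in revolutions modulo 1;
    # the relative audio phase accumulates at exactly the same rate
    return (orbs[i]["angular_speed"] * seconds) % 1.0

Because the same ratio feeds both the orbit and the pitch, the moment the sounds come back into phase is also the moment the orbs line up again.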