Project Name : ARTKOREALAB ≪LOOPING IN THE BOX≫ Music By Michan Lee ≪GLIDE≫
Task Scope : Interactive Design
Category : Non-Commercial
Completion : Jan. 2026
This project is an Audio Reactive piece that separates the music into its six constituent stem files (Main Riff, EGTR, Bass, Pad, Drums, Strings) and converts each audio signal into visual variables. Rather than simply substituting volume changes for size changes, the system was designed so that each instrument's signal acts as a physical computational value that moves or mixes pixel coordinates on screen. Through this, a system was established in which auditory data directly controls the movement of visual data.

Data processing is handled at the CHOP stage in TouchDesigner. The audio waveform of each stem file is analyzed by frequency band with an Audio Spectrum CHOP, then reduced to a single numerical value via Analyze (RMS Power) and Filter CHOPs. Percussive sounds such as Drums and EGTR reflect numerical fluctuations instantly, acting as triggers that adjust the intensity of screen distortion, while sustained sounds such as Pad and Bass are mapped as base data that continuously shift the coordinates of the entire screen.

The core of the visualization is a 'Feedback Loop' system at the TOP stage. By connecting a Feedback TOP and a Displace TOP in a recursive structure, the current frame's image is accumulated onto the next frame, so that pixels leave motion trails. The audio data extracted from the CHOPs is wired to the Displace Weight parameter, controlling in real time how far pixels are pushed according to volume and generating a fluid flow.

For texture, a Slope TOP was used to express shading. It derives shading information from the brightness gradients of the flowing image, which is then blended with the original image in Hard Light mode to add topographical curves and elevation to the flat graphics. To prevent color interference during this shading calculation, the image is first converted to black and white so that only shape and contrast are evaluated.
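Outside of TouchDesigner, the CHOP chain described above (Analyze RMS Power → Filter smoothing → a combined Displace Weight) can be approximated in plain Python. This is a minimal sketch, not the actual patch: the one-pole filter coefficient and the trigger/base gain values are illustrative assumptions.

```python
import numpy as np

def rms_power(block):
    """Analyze (RMS Power) analogue: collapse an audio block to one value."""
    return float(np.sqrt(np.mean(np.square(block))))

def one_pole_smooth(values, alpha=0.2):
    """Filter CHOP analogue: a one-pole low-pass that smooths the envelope.
    alpha is an assumed coefficient, not a value from the project."""
    out, state = [], 0.0
    for v in values:
        state += alpha * (v - state)
        out.append(state)
    return out

def displace_weight(trigger_env, base_env, trigger_gain=2.0, base_gain=0.5):
    """Combine a percussive trigger envelope (Drums/EGTR) with a sustained
    base envelope (Pad/Bass) into one Displace Weight value per frame.
    The gains are hypothetical mixing weights."""
    return [trigger_gain * t + base_gain * b
            for t, b in zip(trigger_env, base_env)]

# Usage: a constant tone gives RMS 1.0; silence stays at weight 0.
drums = one_pole_smooth([rms_power(np.ones(512)) for _ in range(4)])
pad = one_pole_smooth([rms_power(np.zeros(512)) for _ in range(4)])
weights = displace_weight(drums, pad)
```

In the real network this per-frame weight would simply be exported to the Displace TOP's Weight parameter; the smoothing step is what keeps sustained stems reading as a continuous drift rather than a flicker.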
Finally, colors were mapped via a Lookup TOP using the color data of the artwork ≪Glide≫, substituting a designated color ramp according to the brightness levels of the black-and-white fluid simulation. In this way, the color information of the original artwork survives undistorted amid the complex movement of the feedback loop.
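The Lookup TOP step amounts to indexing a color ramp by luminance. The sketch below approximates this with linear interpolation between ramp stops; the two-stop ramp is a hypothetical stand-in for the ≪Glide≫ palette, whose actual colors are not given here.

```python
import numpy as np

def lookup_ramp(luma, ramp):
    """Lookup TOP analogue: map a grayscale frame onto a color ramp.
    luma: 2-D array of brightness values in [0, 1].
    ramp: (N, 3) array of RGB stops, interpolated linearly."""
    pos = np.clip(luma, 0.0, 1.0) * (len(ramp) - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, len(ramp) - 1)
    frac = (pos - lo)[..., None]
    return (1 - frac) * ramp[lo] + frac * ramp[hi]

# Hypothetical two-stop ramp standing in for the artwork's palette.
ramp = np.array([[0.05, 0.10, 0.30],   # dark stop
                 [0.95, 0.80, 0.40]])  # bright stop
frame = np.array([[0.0, 1.0]])         # one dark pixel, one bright pixel
colored = lookup_ramp(frame, ramp)
```

Because the fluid simulation is evaluated in black and white, only this final lookup decides the palette, which is what keeps the source artwork's colors intact no matter how the feedback loop distorts the shapes.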