This project explores the intersection of robotics, sensory input, and visual expression through a custom setup built around a KUKA KR 6 R900 robotic arm. A microcontroller fitted with an LED and a sound sensor was mounted on the robot. As the arm followed a pre-programmed toolpath generated in KUKA|prc, the LED's color shifted dynamically in response to real-time ambient sound, processed by a CircuitPython program written in the Mu Editor. The performance was documented with long-exposure photography, producing visual traces that capture the interplay between human presence (sound), robotic motion, and sensor feedback. The result is a poetic visualization of machine responsiveness, highlighting the potential of interactive systems to translate environmental input into expressive form.
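The project's firmware is not published here, but the core idea of mapping a live sound reading to an LED color can be sketched in a few lines. The function below is a hypothetical illustration, not the actual code: it assumes a 16-bit analog reading (the range CircuitPython's `analogio.AnalogIn.value` reports) and sweeps the color from blue in silence toward red at peak loudness. On the microcontroller, the returned tuple would be written to the LED each loop iteration; here it is plain Python so the mapping itself can be tested.

```python
def sound_to_color(level, peak=65535):
    """Map a raw analog sound reading (0..peak) to an (R, G, B) tuple.

    Quiet readings give blue, loud readings give red, with a linear
    crossfade in between. `peak` of 65535 matches the 16-bit range
    that CircuitPython's analogio.AnalogIn.value returns (an assumption
    about the original setup, not a documented detail of the project).
    """
    t = max(0.0, min(1.0, level / peak))  # clamp to 0..1
    red = int(255 * t)
    blue = int(255 * (1 - t))
    return (red, 0, blue)


# On hardware this would run inside a loop, e.g. (hypothetical wiring):
#   import board, analogio, neopixel
#   mic = analogio.AnalogIn(board.A1)
#   pixel = neopixel.NeoPixel(board.NEOPIXEL, 1)
#   while True:
#       pixel[0] = sound_to_color(mic.value)
print(sound_to_color(0))      # silence -> pure blue
print(sound_to_color(65535))  # peak -> pure red
```

Driven by the robot's motion and captured with long exposures, a mapping like this is what turns the changing sound level into the colored light trails described above.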