Shimon + ZOOZbeat: An Improvising Robot Musician You Can Jam With
ZOOZbeat is a gestural mobile musical controller that allows novices and musicians to improvise with Shimon, an autonomous robotic marimba player designed to create inspiring human-robot musical interactions that lead to novel musical experiences and outcomes. Shimon combines computational modeling of music perception, interaction, and improvisation with the capacity to produce melodic and harmonic acoustic responses through choreographic gestures. The robot, therefore, “listens like a human and improvises like a machine”.
Real-time collaboration between human and machine musicians capitalizes on the combination of their unique strengths to produce new and compelling art. This project aims to combine human creativity, emotion, and aesthetic judgment with the algorithmic computational capabilities of computers, allowing human and artificial players to build on each other’s ideas. A robotic musician brings computer music into the physical world acoustically, gesturally, and visually. Through the visual connection between sound and motion, an anticipatory embodied action approach, and a gesture-based actuation system, the robot can jam with humans in real-time synchrony without delay.
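As a rough illustration of the anticipatory embodied approach (a minimal sketch under assumed timing values, not Shimon's actual control code), the arm can begin its striking gesture ahead of the predicted beat so that the mallet's impact, rather than the start of motion, lands in time with the human player. The predict_next_beat and start_motion names below are hypothetical.

import time

ARM_TRAVEL_TIME = 0.25  # assumed seconds for the mallet to travel to the bar

def predict_next_beat(onsets):
    # Estimate the next beat from recent onset times via the mean inter-onset interval.
    intervals = [b - a for a, b in zip(onsets, onsets[1:])]
    return onsets[-1] + sum(intervals) / len(intervals)

def schedule_strike(onsets, start_motion):
    # Launch the gesture early so the acoustic strike coincides with the predicted beat.
    strike_at = predict_next_beat(onsets)
    time.sleep(max(0.0, (strike_at - ARM_TRAVEL_TIME) - time.time()))
    start_motion()  # hypothetical call that starts the physical striking gesture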
With ZOOZbeat, even non-musicians can interact with Shimon to enjoy expressive and creative access to music making and improvisation. Through a set of easily learned, intuitive gestures, ZOOZbeat players can generate musical material that is processed to fit the current musical context and entered into a looping sequencer. Users can then perform additional gestures to manipulate and share their creation. A “musical wizard” analyzes the user’s gestures and maps them to the creation of meaningful melodic, rhythmic, and harmonic lines.
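A minimal sketch of how such a gesture-to-music mapping could work (illustrative only, assuming a C-major scale, a 16-step loop, and normalized tilt/shake values from the phone's sensors; the names here are not ZOOZbeat's actual API):

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]   # assumed scale context (MIDI pitches)
STEPS_PER_LOOP = 16
loop = [None] * STEPS_PER_LOOP                # one bar of (pitch, velocity) events

def gesture_to_note(tilt, shake):
    # Map tilt (0..1) to a scale degree and shake intensity (0..1) to loudness,
    # so any gesture yields a note that fits the current key.
    degree = min(int(tilt * len(C_MAJOR)), len(C_MAJOR) - 1)
    velocity = 40 + int(shake * 87)           # keep within MIDI velocity range
    return C_MAJOR[degree], velocity

def add_gesture(step, tilt, shake):
    # Quantize the gesture to a sequencer step and write it into the loop.
    loop[step % STEPS_PER_LOOP] = gesture_to_note(tilt, shake)

Calling add_gesture(4, 0.6, 0.8), for instance, would place a mid-register, fairly loud note on the second beat of the loop, which a later gesture could overwrite or reshape.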
At SIGGRAPH Asia 2009, musicians and non-musicians can use this system to collaborate remotely with an autonomous, improvisational robot.
Gil Weinberg
Guy Hoffman
Ryan Nikolaidis
Roberto Aimi
Georgia Institute of Technology