Visual performances such as those seen in exhibitions, museums, and events should take advantage of advanced rendering technology to create the most immersive experience possible. This project achieves immersion by merging three principles: reactivity, visual quality, and interactivity.
Reactivity adds dynamism and links the visuals to the music, in contrast to the classic approach of pre-recorded content. Much of the event can be improvised (through input devices such as MIDI controllers), letting the performance flow seamlessly with the music.
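As a minimal sketch of how a MIDI controller can drive visuals, the snippet below maps incoming 7-bit control-change values (0-127) onto named visual parameters. The routing table, parameter names, and ranges are hypothetical examples, not taken from the project itself:

```python
def cc_to_param(cc_value, lo=0.0, hi=1.0):
    """Map a 7-bit MIDI CC value (0-127) linearly onto a parameter range."""
    return lo + (cc_value / 127.0) * (hi - lo)

# Hypothetical routing table: CC number -> (parameter name, min, max).
ROUTING = {
    1: ("bloom_intensity", 0.0, 2.0),   # mod wheel drives bloom strength
    74: ("hue_shift", 0.0, 360.0),      # a knob rotates the color palette
}

def handle_cc(params, cc_number, cc_value):
    """Update the live parameter dict from one incoming CC message."""
    if cc_number in ROUTING:
        name, lo, hi = ROUTING[cc_number]
        params[name] = cc_to_param(cc_value, lo, hi)
    return params
```

Turning a knob on the controller then updates the running parameter set immediately, which is what lets the operator improvise rather than trigger pre-recorded clips.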
The visuals (landscapes, colors, textures, and shapes) are mostly generated procedurally in real time. There are no photographs or recorded footage, so every aspect of the imagery can be changed on the fly during the show.
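To illustrate the idea of fully procedural imagery, here is a toy sketch (not the project's actual renderer) in which every pixel's color is computed from its coordinates and the current time, with no stored images involved; the constants are arbitrary and could be retuned live:

```python
import math

def procedural_color(x, y, t):
    """Return an RGB triple in [0, 1] computed purely from
    position (x, y) and time t -- no textures or footage."""
    r = 0.5 + 0.5 * math.sin(x * 3.0 + t)
    g = 0.5 + 0.5 * math.sin(y * 5.0 + t * 1.3)
    b = 0.5 + 0.5 * math.sin((x + y) * 2.0 - t * 0.7)
    return (r, g, b)
```

Because the image is a function rather than an asset, changing a single coefficient reshapes the whole picture, which is what makes live adjustment during the show possible.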
Finally, the audience can interact with the show, making them feel part of it. The interactivity is achieved with mobile devices, 3D cameras, and standard cameras.
Each show is unique; the artist can even write code live during the event.