Participative Real-Time Graphics
Cross-media interactive experiment for performance shows: a blend of different techniques and technologies featuring generative animations.
With this project I wanted to show that computer graphics are not limited to visual effects and games. I explained how input data can be fed into real-time graphics in several ways:
- Live MIDI from the artist: values fed into shader uniforms through a VJ production package.
- Raspberry Pi orientation from a Sense HAT, streamed over TCP sockets.
- Tweets matching some criteria (a #hashtag): profile-image extraction and body parsing.
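As a rough illustration of the tweet input path, the sketch below parses a tweet body once it matches a trigger hashtag; the hashtag, parameter names, and mapping are assumptions for illustration, not the project's exact scheme:

```python
import re

# Hypothetical trigger hashtag, chosen for illustration only.
TRIGGER = "#livefractal"

def parse_tweet(text):
    """If the tweet starts with the trigger hashtag, return parsed
    values that could drive an animation; otherwise return None."""
    if not text.lower().startswith(TRIGGER):
        return None
    body = text[len(TRIGGER):].strip()
    words = re.findall(r"\w+", body)
    # Example mapping: word count could drive e.g. fractal iteration depth.
    return {"body": body, "word_count": len(words)}

print(parse_tweet("#livefractal spin faster please"))
```

In the real pipeline the parsed values, together with the user's extracted profile image, would be forwarded to the renderer.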
The idea was also to break the convention that audiences at a show are passive recipients of the product; instead, I proposed ways to engage the audience as participants in the performance.
People would send a tweet starting with a specific hashtag; its body would be parsed to affect the animation, and the profile image of the tweeting user would be extracted and thrown into an OpenGL fractal rendered in GLSL fragment code using sphere tracing (also called raymarching).
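Sphere tracing marches a ray forward by the distance to the nearest surface at each step, which is safe because that distance can never overshoot the surface. In the project this runs in a GLSL fragment shader; the Python sketch below shows the same idea against a single unit sphere:

```python
import math

def sdf_sphere(p, radius=1.0):
    """Signed distance from point p to a sphere centred at the origin."""
    return math.sqrt(p[0]**2 + p[1]**2 + p[2]**2) - radius

def sphere_trace(origin, direction, max_steps=64, eps=1e-4, max_dist=100.0):
    """March along the ray by the nearest-surface distance each step.
    Returns the hit distance along the ray, or None on a miss."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf_sphere(p)
        if d < eps:
            return t      # close enough to the surface: hit
        t += d            # safe step size given by the distance field
        if t > max_dist:
            break
    return None           # ray escaped the scene

# A ray from z = -3 towards the origin hits the unit sphere at distance 2.
hit = sphere_trace((0.0, 0.0, -3.0), (0.0, 0.0, 1.0))
```

Replacing `sdf_sphere` with a fractal distance estimator (as the project does on the GPU) leaves the marching loop unchanged, which is what makes the technique attractive for fractal rendering.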
The project also shows how to make an FFGL plugin, which wraps OpenGL shaders and exposes their uniforms to the most widely used VJ (Visual Jockeying) applications used in production for live music events. The code compiles into a binary that the user can drag and drop into their production package and use at their raves, mapping MIDI, beats per minute, frequency thresholds, or anything else the package provides to the uniforms.
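FFGL hosts typically hand the plugin parameter values normalised to the 0..1 range, so the plugin remaps them onto whatever range each shader uniform expects. A minimal sketch of that remapping, with illustrative uniform names and ranges that are not taken from the project:

```python
def remap(value, lo, hi):
    """Map a normalised 0..1 host parameter onto the range [lo, hi]."""
    return lo + value * (hi - lo)

# Hypothetical uniforms; names and ranges are assumptions for illustration.
UNIFORM_RANGES = {
    "u_iterations": (1.0, 64.0),   # fractal detail level
    "u_zoom":       (0.1, 10.0),   # camera zoom factor
}

def update_uniforms(host_params):
    """host_params: {name: normalised value the host derived from
    MIDI, BPM, or audio-frequency analysis}."""
    return {name: remap(host_params[name], lo, hi)
            for name, (lo, hi) in UNIFORM_RANGES.items()}

uniforms = update_uniforms({"u_iterations": 0.5, "u_zoom": 0.0})
```

In the actual plugin this remapping happens in C++ before the values are uploaded with `glUniform*` calls each frame.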
— Bernat Espigulé (@bernatree) January 26, 2017
Want to know more about the project?