Participative Real-Time Graphics
A cross-media interactive experiment for live performance shows: a blend of different techniques and technologies driving generative animations.
With this project I wanted to show that computer graphics are not limited to visual effects and games. I explored several ways of feeding input data into the visuals:
- Live MIDI from the artist: values fed into a shader through a VJ production package.
- Raspberry Pi orientation from a Sense HAT: streamed over TCP sockets.
- Tweets matching a given criterion (a #hashtag): profile-image extraction and body parsing.
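The Sense HAT route can be sketched in Python. This is a minimal illustration, not the project's actual code: the host, port, and message layout (three little-endian 32-bit floats) are assumptions, while `SenseHat().get_orientation()` is the real `sense-hat` library call, returning a dict of pitch/roll/yaw in degrees.

```python
import socket
import struct

def encode_orientation(pitch, roll, yaw):
    """Pack the three orientation angles (degrees) into a fixed-size
    binary message: three little-endian 32-bit floats."""
    return struct.pack("<3f", pitch, roll, yaw)

def decode_orientation(message):
    """Unpack a message produced by encode_orientation into a dict."""
    pitch, roll, yaw = struct.unpack("<3f", message)
    return {"pitch": pitch, "roll": roll, "yaw": yaw}

def stream_orientation(host="127.0.0.1", port=5005):
    """Pi side: read orientation from the Sense HAT and push it over TCP.
    Only runs on a Raspberry Pi with the sense-hat package installed."""
    from sense_hat import SenseHat
    sense = SenseHat()
    with socket.create_connection((host, port)) as sock:
        while True:
            o = sense.get_orientation()  # {"pitch": ..., "roll": ..., "yaw": ...}
            sock.sendall(encode_orientation(o["pitch"], o["roll"], o["yaw"]))
```

On the receiving end, the graphics application reads 12 bytes per message, decodes the three angles, and maps them to shader uniforms.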
The idea was also to break the conventional rule that audiences at a show are passive receivers of the product; instead, I proposed ways to engage the audience so they participate in the performance.

The project was featured at CinemaJS, sharing the stage with talks on multi-player games built with web technologies. You can find the slides I used in the presentation here.
People would send a tweet starting with a specific hashtag; its body would be parsed and affect the animation. The profile image of the tweeting user would also be extracted and thrown into an OpenGL fractal rendered in GLSL fragment code using sphere tracing (also called raymarching).
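The core rendering technique, sphere tracing, can be sketched in Python. The real effect runs in a GLSL fragment shader against a fractal distance estimator; the simple sphere SDF here stands in for it, and all names are illustrative.

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 3.0), radius=1.0):
    """Signed distance from point p to a sphere: negative inside,
    zero on the surface, positive outside."""
    dx, dy, dz = p[0] - center[0], p[1] - center[1], p[2] - center[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

def sphere_trace(origin, direction, sdf,
                 max_steps=128, epsilon=1e-4, max_dist=100.0):
    """March along the ray, each step advancing by the SDF value
    (the largest provably safe step). Returns the distance to the
    hit point, or None on a miss."""
    t = 0.0
    for _ in range(max_steps):
        p = (origin[0] + direction[0] * t,
             origin[1] + direction[1] * t,
             origin[2] + direction[2] * t)
        d = sdf(p)
        if d < epsilon:
            return t
        t += d
        if t > max_dist:
            break
    return None
```

A ray from the origin straight down +z hits the unit sphere centered at z = 3 at distance 2; swapping in a fractal distance estimator (and texturing the hit with the extracted profile image) is what produces the distorted results shown below.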

Distorted profile image of the user who tweeted

Original image extracted from the user’s Twitter profile.

Distorted version of my profile image.

Image extracted from my Twitter account.

Diagram showing how data flows between the frameworks / applications.
The project also shows how to make an FFGL plugin, which wraps OpenGL shaders and exposes their uniforms to the most widely used VJ (visual jockeying) applications, used in production for live music events. The code compiles into a binary the user can drag and drop into their production package and use at their raves, mapping MIDI, beats per minute, frequency thresholds, or anything else the package provides to the uniforms.
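The mapping the host performs can be illustrated with a small Python sketch (the function names are hypothetical; FFGL exposes standard parameters as normalized floats in [0, 1], which the plugin then remaps to whatever range each shader uniform expects):

```python
def midi_to_normalized(cc_value):
    """MIDI control-change values span 0-127; clamp and scale to the
    normalized [0, 1] range an FFGL-style parameter uses."""
    return max(0, min(127, cc_value)) / 127.0

def normalized_to_uniform(value, lo, hi):
    """Remap a normalized parameter to the numeric range a specific
    shader uniform expects (e.g. a fractal power or fold count)."""
    return lo + (hi - lo) * value
```

The same remapping applies whether the host feeds the parameter from a MIDI fader, a BPM counter, or an audio frequency threshold.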
@ramonblanquer very cool #cinemajs fractals! pic.twitter.com/Qc3BFjmyf1
— Bernat Espigulé (@bernatree) January 26, 2017
Want to know more about the project?