Participative Real-Time Graphics

Cross-media interactive experiment for performance shows. A blend of different techniques and technologies featuring generative animations.


With this project I wanted to show that computer graphics are not limited to visual effects and games. I explained how we could feed input data in several different ways:

  • Live MIDI from the artist: feeding values into a shader through a VJ production package.
  • Raspberry Pi orientation from a Sense HAT: TCP sockets.
  • Tweets matching some criteria (#hashtag): image extraction and body parsing.
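As a sketch of the tweet path, the body can be scanned for simple parameters that drive the animation. The `parse_tweet` helper and the key=value scheme below are hypothetical, not the project's actual format:

```python
import re

def parse_tweet(text, trigger="#cinemajs"):
    """Accept only tweets starting with the trigger hashtag and extract
    key=value parameters (e.g. "speed=2.5") from the body."""
    if not text.lower().startswith(trigger):
        return None  # ignore tweets that don't match the criteria
    body = text[len(trigger):].strip()
    # Pull out numeric key=value pairs to map onto shader parameters
    params = {k: float(v) for k, v in re.findall(r"(\w+)=([\d.]+)", body)}
    return {"body": body, "params": params}
```

A listener polling the Twitter API could feed each matching tweet through a parser like this and forward the resulting values to the renderer.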

The idea was also to break the conventional rule that when people go to see a show they are passive and simply receive the product; instead, I proposed some ways you could engage an audience to participate in the performance.

The project was featured at CinemaJS, sharing the stage with talks on multiplayer games built with web technologies. You can find the slides I used in the presentation here.

People would send a tweet starting with a specific hashtag, and the body of the tweet would be parsed to affect the animation. The profile image of the tweeting user would also be extracted and thrown into an OpenGL fractal, rendered in GLSL fragment code using sphere tracing (also called raymarching).
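The core of sphere tracing is stepping along a ray by the scene's signed distance: each step moves exactly as far as the nearest surface allows, so the march converges on the surface without overshooting. A minimal single-ray sketch, with a plain sphere SDF standing in for the project's fractal distance estimator:

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 3.0), radius=1.0):
    # Signed distance from point p to a sphere: negative inside, zero on surface
    return math.dist(p, center) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, eps=1e-4, max_dist=100.0):
    """March along the ray, stepping by the distance to the nearest surface.
    Returns the hit distance along the ray, or None if the ray escapes."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:          # close enough: treat as a surface hit
            return t
        t += d               # safe step: nothing is closer than d
        if t > max_dist:
            break
    return None
```

In the real shader the same loop runs per pixel in GLSL, with the sphere SDF replaced by a fractal distance estimator.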


Distorted profile image of the user who tweeted

Original image extracted from the user’s Twitter profile.

Distorted version of my profile image.

Image extracted from my Twitter account.

Diagram showing how data flows between the frameworks / applications.

The project also shows how to make an FFGL plugin, which wraps OpenGL shaders and exposes their uniforms to the most widely used VJ (visual jockeying) applications, used in production for live music events. The build compiles all the code into a binary that the user can drag and drop into their production package and use at their raves, mapping MIDI, beats per minute, frequency thresholds, or anything else the package provides to the uniforms.
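Mapping a controller value onto a uniform is just a linear remap of MIDI's 7-bit range. A sketch of the idea (the function name is hypothetical; the host VJ application normally does this mapping for you):

```python
def midi_to_uniform(cc_value, lo=0.0, hi=1.0):
    """Remap a 7-bit MIDI CC value (0-127) onto a uniform's range [lo, hi]."""
    cc_value = max(0, min(127, cc_value))  # clamp malformed input
    return lo + (hi - lo) * (cc_value / 127.0)
```

The same remap applies to any scalar the package exposes, such as BPM or a frequency-band level, as long as you know its input range.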

Want to know more about the project?

  • Check out the slides and play with the demos they contain.
  • The report that carefully explains all the work can be found here.
  • All the source code I used for the project is accessible on this GitHub repository.
Video explaining the project and workflow.

Sending data from a Raspberry Pi to an OpenGL shader through TCP sockets.
Fractal patterns shown by an FFGL plugin.
People would populate the fractal with their profile image when they tweeted starting with #cinemajs.