Flow
Interactive visual effects by Karl Sims, 2018
Flow is on display in the CSAIL lobby of the Frank Gehry-designed MIT Stata Center in Cambridge, Mass., and at the National Museum of Mathematics in NYC. A Kinect depth sensor detects the shapes of people in front of the display and allows them to manipulate the simulations with their gestures and motions. The exhibit cycles through 10 effect modes with different simulations and visual effects, each lasting about one minute. The cycle of effects varies from day to day, so your experience may differ depending on when you visit. Photos from a subset of these effects are shown here. This Fluid Flow Tutorial provides some technical details on the real-time fluid simulations used for this exhibit.
In this effect, participants see a mirrored image of themselves augmented with virtual colored ink that squirts from their hands and other detected protrusions.
This demo video shows a short clip of each effect. For best results, play at full screen with the quality setting at 1080p60 HD.
A spectrum of fluid layers with different densities is mixed up by your movements. Gravity affects each density differently and eventually pulls them back to their stable layered state.
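The layering behavior described above can be sketched as a per-cell buoyancy force: cells denser than the ambient fluid are accelerated downward and lighter cells upward, so gravity gradually restores the stable layers. This is only an illustrative sketch, not the exhibit's actual code; the struct, field names, and constants here are assumptions.

```cpp
#include <vector>
#include <cstddef>

// Illustrative sketch of density-driven buoyancy on a fluid grid.
// Sign convention assumed here: positive velY points downward.
struct FluidGrid {
    std::size_t w, h;
    std::vector<float> density;  // per-cell fluid density
    std::vector<float> velY;     // per-cell vertical velocity
};

void applyBuoyancy(FluidGrid& g, float ambientDensity, float gravity, float dt) {
    for (std::size_t i = 0; i < g.density.size(); ++i) {
        // Denser-than-ambient cells sink; lighter cells rise.
        g.velY[i] += gravity * (g.density[i] - ambientDensity) * dt;
    }
}
```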
This "lava lamp" effect puts you inside a simulation of upward moving orange blobs colliding with downward moving blue blobs. Your motions can mix up the colors, but viscous forces slowly merge similar colors back into larger shapes.
Inspired by van Gogh, this swirling paint effect is generated from your image and motions. Particle systems are used to render brush strokes, and their dynamics are controlled by a fluid flow simulation.
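The core of particle dynamics driven by a flow field can be sketched as passive advection: each brush-stroke particle samples the local fluid velocity and moves along it each frame. The types, nearest-cell sampling, and forward-Euler step below are illustrative assumptions, not the exhibit's actual implementation.

```cpp
#include <vector>
#include <cstddef>
#include <cmath>
#include <algorithm>

// Illustrative velocity field with nearest-cell sampling (clamped to edges).
struct VelocityField {
    std::size_t w, h;
    std::vector<float> vx, vy;  // per-cell velocity components
    float sampleX(float x, float y) const { return vx[index(x, y)]; }
    float sampleY(float x, float y) const { return vy[index(x, y)]; }
private:
    std::size_t index(float x, float y) const {
        std::size_t cx = std::min<std::size_t>((std::size_t)std::max(0.0f, x), w - 1);
        std::size_t cy = std::min<std::size_t>((std::size_t)std::max(0.0f, y), h - 1);
        return cy * w + cx;
    }
};

struct Particle { float x, y; };

// Forward-Euler advection: move each particle along the local flow velocity.
void advect(std::vector<Particle>& ps, const VelocityField& f, float dt) {
    for (Particle& p : ps) {
        float ux = f.sampleX(p.x, p.y);
        float uy = f.sampleY(p.x, p.y);
        p.x += ux * dt;
        p.y += uy * dt;
    }
}
```

Rendering each particle's recent positions as a short textured streak would then give the brush-stroke look described above.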
Moving hands paint the screen with rays of colored light. |
In this effect, your motions generate waves in a simulated liquid that propagate across the screen and reflect against the sides. |
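Propagating waves that reflect against the sides can be sketched with a discrete wave equation on a height field: each sample accelerates toward the average of its neighbors, and fixed boundary values outside the domain reflect waves back in. This 1D version is a minimal sketch under assumed parameter names, not the exhibit's actual simulation (which operates on a 2D screen).

```cpp
#include <vector>
#include <cstddef>

// One step of a discrete 1D wave equation.
// c2 is the squared wave speed term; damping in (0, 1] slowly removes energy.
void waveStep(std::vector<float>& height, std::vector<float>& vel,
              float c2, float damping) {
    std::size_t n = height.size();
    std::vector<float> next(n);
    for (std::size_t i = 0; i < n; ++i) {
        // Fixed (height = 0) values beyond the edges make waves reflect.
        float left  = (i > 0)     ? height[i - 1] : 0.0f;
        float right = (i + 1 < n) ? height[i + 1] : 0.0f;
        vel[i] += c2 * (left + right - 2.0f * height[i]);
        vel[i] *= damping;
        next[i] = height[i] + vel[i];
    }
    height = next;
}
```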
In this effect, the shapes of participants are injected into a fluid flow simulation, causing their colors to swirl away based on their motions. The colors slowly change as they flow, creating a range of hues.
Karl Sims developed the simulation and visual effects software for this exhibit using C++, OpenCL, and OpenGL; it runs on a Linux computer with an NVIDIA graphics card. Jesse Gray of IF Robots developed the Kinect camera and depth sensor interface. The original version of this exhibit was commissioned by CSAIL, the MIT Computer Science and AI Laboratory. Thanks to the CSAIL team including Una-May O'Reilly, Charles Leiserson, and Daniela Rus.
Rays of blue light are generated from liquid shapes that swirl from the motions of users.
Back to other work by Karl Sims
© 2018, Karl Sims, All rights reserved.