

TEAMLAB INTERNSHIP PROJECT: FLOCKING SIMULATION

June-September 2019

After completing my senior year of college, I had the opportunity to do a summer internship in Tokyo at a company called teamLab, which is best known for its large-scale digital art installations. As a software engineering intern on the team that creates the visuals for teamLab's exhibitions, my project was to prototype an interactive projection mapping installation simulating flocks of over 15,000 objects that responded when audience members moved or touched a wall within the installation space. I ultimately used the software I wrote for controlling the flocks to create three different installation prototypes and successfully tested them in the teamLab Borderless space.

The foundation of the flock simulation is the Boids algorithm, but I made modifications so the objects could be constrained to meshes, respond to user interaction, and have more lifelike behavior. The flock simulation was built in Unity using GPU compute shaders so it could run at interactive frame rates, and I used Houdini and Substance Painter to create the models. Additionally, I used OSC and KlakSpout to communicate with the installation space's sensors and projectors, and created a simple UI for controlling the simulation's behavior in real time.


INITIAL CONCEPT

I was given the freedom to select my internship project; however, since I already had some experience with Houdini, my team requested that I experiment with using it in combination with Unity. As a result, I started by brainstorming ideas that could make use of procedurally generated models and terrains. I ultimately decided on an interactive flocking simulation because I thought it would scope well to the timeframe of my internship while still having the potential to be technically challenging and visually interesting.


FLOCKING ALGORITHM

I started by implementing a basic version of the Boids algorithm, which is used to simulate the flocking behavior of birds. It uses three simple rules to create remarkably complex behavior when applied to a large number of objects:

  • Cohesion: Apply a force to each flock member that moves it towards the average position of nearby flockmates

  • Alignment: Apply a rotational force to each flock member to steer it towards the average direction of nearby flockmates

  • Separation: Apply forces to each flock member to move it away from any flockmate it gets too close to

I also added some extra logic to steer the flock members away from obstacles and towards targets when they are within a certain distance of them.
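As a rough illustration, here is a minimal CPU-side sketch of the three rules in Unity C#. The FlockMember struct, parameter names, and weights are hypothetical; in the actual project these forces are computed in a compute shader (see below).

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical CPU-side illustration of the three Boids rules.
public struct FlockMember
{
    public Vector3 position;
    public Vector3 velocity;
}

public static class BoidsForces
{
    public static Vector3 Steering(FlockMember self, List<FlockMember> neighbors,
                                   float separationRadius, float cohesionWeight,
                                   float alignmentWeight, float separationWeight)
    {
        if (neighbors.Count == 0) return Vector3.zero;

        Vector3 center = Vector3.zero;     // average neighbor position (cohesion)
        Vector3 heading = Vector3.zero;    // average neighbor velocity (alignment)
        Vector3 avoidance = Vector3.zero;  // push away from close neighbors (separation)

        foreach (var other in neighbors)
        {
            center += other.position;
            heading += other.velocity;

            Vector3 offset = self.position - other.position;
            float dist = offset.magnitude;
            if (dist < separationRadius && dist > 0f)
                avoidance += offset / (dist * dist); // stronger push when closer
        }

        center /= neighbors.Count;
        heading /= neighbors.Count;

        Vector3 cohesion = (center - self.position) * cohesionWeight;
        Vector3 alignment = (heading - self.velocity) * alignmentWeight;
        Vector3 separation = avoidance * separationWeight;

        return cohesion + alignment + separation;
    }
}
```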


CONSTRAINING FLOCKS TO SURFACES

I wanted the flock members to be capable of both flight and running along the surfaces of larger objects, so I modified the Boids algorithm to allow this behavior. All flock members randomly choose to start flying or, if they are near a surface, land and move along the surface. When on a surface, flock members use the nearest vertex normal to determine what plane they should move along. Flying flock members treat surfaces as obstacles they must avoid, and I added extra movement logic so they could slow down and land in a realistic fashion.
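A minimal sketch of the surface constraint, assuming the nearest vertex and its normal have already been found (the names here are illustrative): the velocity is projected onto the tangent plane defined by the vertex normal, and the member is pinned just above the surface.

```csharp
using UnityEngine;

// Hypothetical sketch of keeping a grounded flock member on a surface.
public static class SurfaceConstraint
{
    public static void ConstrainToSurface(ref Vector3 position, ref Vector3 velocity,
                                          Vector3 nearestVertex, Vector3 vertexNormal,
                                          float hoverHeight)
    {
        // Remove the velocity component along the normal so movement
        // stays in the plane of the surface.
        velocity = Vector3.ProjectOnPlane(velocity, vertexNormal);

        // Snap the member to a fixed offset above the nearest vertex's plane.
        Vector3 toMember = position - nearestVertex;
        float heightAboveSurface = Vector3.Dot(toMember, vertexNormal);
        position += vertexNormal * (hoverHeight - heightAboveSurface);
    }
}
```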


15K+ OBJECTS USING THE GPU

Simulating large flocks requires many calculations each frame, and because each flock member can be updated independently, the GPU can run them in parallel far more efficiently than the CPU. I therefore learned to use Unity compute shaders to calculate the new positions of the flock members and render them.

When a scene starts, surface buffers are first populated with vertex data extracted from the scene's surface meshes. Then flock member positions and velocities are randomly generated, and the flock member data buffers are populated with that information.
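A sketch of that setup step in Unity C#, with a hypothetical struct layout, buffer names, and kernel name:

```csharp
using UnityEngine;

// Sketch of the start-of-scene buffer setup; the real project extracts
// vertex data from every surface mesh in the scene.
public class FlockBuffers : MonoBehaviour
{
    struct MemberData            // must match the struct in the compute shader
    {
        public Vector3 position;
        public Vector3 velocity;
    }

    public ComputeShader flockShader;
    public Mesh surfaceMesh;
    public int memberCount = 15000;

    ComputeBuffer memberBuffer, surfaceBuffer;

    void Start()
    {
        // Surface buffer: one float3 per vertex of the surface mesh.
        Vector3[] verts = surfaceMesh.vertices;
        surfaceBuffer = new ComputeBuffer(verts.Length, sizeof(float) * 3);
        surfaceBuffer.SetData(verts);

        // Member buffer: random initial positions and velocities.
        var members = new MemberData[memberCount];
        for (int i = 0; i < memberCount; i++)
        {
            members[i].position = Random.insideUnitSphere * 50f;
            members[i].velocity = Random.onUnitSphere;
        }
        memberBuffer = new ComputeBuffer(memberCount, sizeof(float) * 6);
        memberBuffer.SetData(members);

        int kernel = flockShader.FindKernel("UpdateFlock"); // hypothetical kernel name
        flockShader.SetBuffer(kernel, "members", memberBuffer);
        flockShader.SetBuffer(kernel, "surfaceVertices", surfaceBuffer);
    }

    void OnDestroy()
    {
        memberBuffer?.Release();
        surfaceBuffer?.Release();
    }
}
```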

 

During each frame, flock member positions are first updated based on the forces applied during the previous frame, and the surface vertex buffers are then updated to match any meshes that have moved.

 

Next, forces are applied to each flock member based on nearby goals and obstacles. Since each flock member only calculates forces based on objects within its interaction radius, I divided the scene into zones to reduce the number of calculations that have to be done for each flock member. Each zone is a box whose size is determined by the flock members' interaction radius, and linked lists of the flock members and vertices in each box are created. Since the boxes are guaranteed to be larger than every flock member's interaction radius, each flock member only has to calculate forces for objects within its own box and the surrounding boxes.
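A sketch of the zone lookup, with a hypothetical grid origin and dimensions; in the project the per-zone linked lists of members and vertices are rebuilt on the GPU each frame:

```csharp
using System.Collections.Generic;
using UnityEngine;

// The scene is divided into boxes at least as large as the interaction
// radius, so each member only needs to test objects in its own box and
// the 26 surrounding boxes.
public static class ZoneGrid
{
    public static Vector3Int ZoneOf(Vector3 position, Vector3 gridOrigin, float zoneSize)
    {
        Vector3 local = (position - gridOrigin) / zoneSize;
        return new Vector3Int(Mathf.FloorToInt(local.x),
                              Mathf.FloorToInt(local.y),
                              Mathf.FloorToInt(local.z));
    }

    // Flatten a 3D zone coordinate into a linear index into the
    // per-zone linked-list head array.
    public static int ZoneIndex(Vector3Int zone, Vector3Int gridDims)
    {
        return zone.x + gridDims.x * (zone.y + gridDims.y * zone.z);
    }

    // Enumerate this zone and its neighbors; with zoneSize >= the
    // interaction radius, no interacting pair can be farther apart.
    public static IEnumerable<int> SearchZones(Vector3Int zone, Vector3Int gridDims)
    {
        for (int dz = -1; dz <= 1; dz++)
        for (int dy = -1; dy <= 1; dy++)
        for (int dx = -1; dx <= 1; dx++)
        {
            var n = zone + new Vector3Int(dx, dy, dz);
            if (n.x < 0 || n.y < 0 || n.z < 0 ||
                n.x >= gridDims.x || n.y >= gridDims.y || n.z >= gridDims.z)
                continue;
            yield return ZoneIndex(n, gridDims);
        }
    }
}
```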

Finally, the updated flock member positions, sizes, and orientations are passed to the instance shaders so that the flock members can be rendered.
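The render step can be driven with Unity's indirect instancing API; here is a sketch assuming the member buffer from the setup step and a material whose shader reads per-instance data from that buffer:

```csharp
using UnityEngine;

// Sketch of drawing the whole flock in one instanced call.
public class FlockRenderer : MonoBehaviour
{
    public Mesh memberMesh;
    public Material memberMaterial;   // instanced shader sampling the member buffer
    public ComputeBuffer memberBuffer;
    public int memberCount = 15000;

    ComputeBuffer argsBuffer;

    void Start()
    {
        // Indirect args: index count, instance count, and offsets.
        uint[] args = { memberMesh.GetIndexCount(0), (uint)memberCount, 0, 0, 0 };
        argsBuffer = new ComputeBuffer(1, args.Length * sizeof(uint),
                                       ComputeBufferType.IndirectArguments);
        argsBuffer.SetData(args);
        memberMaterial.SetBuffer("members", memberBuffer);
    }

    void Update()
    {
        // Bounds must enclose the whole flock for culling purposes.
        var bounds = new Bounds(Vector3.zero, Vector3.one * 200f);
        Graphics.DrawMeshInstancedIndirect(memberMesh, 0, memberMaterial,
                                           bounds, argsBuffer);
    }

    void OnDestroy() => argsBuffer?.Release();
}
```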


AUDIENCE INTERACTION

To receive data from the installation space's sensors, I used an Open Sound Control (OSC) package for Unity. This data could then be used to make the flock members follow viewers, change shape, or multiply in response to a viewer moving or touching a wall.
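However the OSC package delivers its messages, the handler ultimately maps a wall ID and normalized touch coordinates into a world-space goal for the flock. A hypothetical sketch, where the room dimensions, wall layout, and event hook are all assumptions:

```csharp
using UnityEngine;

// Hypothetical handler for incoming sensor data.
public class SensorInput : MonoBehaviour
{
    public Vector3 roomSize = new Vector3(10f, 3f, 10f); // assumed dimensions

    // The simulation subscribes to this to receive new goal positions.
    public static event System.Action<Vector3> GoalChanged;

    // Called by the OSC receiver with, e.g., a wall ID and (u, v) in [0, 1].
    public void OnOscTouch(int wallId, float u, float v)
    {
        GoalChanged?.Invoke(WallPointToWorld(wallId, u, v));
    }

    Vector3 WallPointToWorld(int wallId, float u, float v)
    {
        // Map normalized (u, v) on a wall to a world-space point on that wall.
        switch (wallId)
        {
            case 0:  return new Vector3((u - 0.5f) * roomSize.x, v * roomSize.y, -roomSize.z / 2f);
            case 1:  return new Vector3((u - 0.5f) * roomSize.x, v * roomSize.y,  roomSize.z / 2f);
            case 2:  return new Vector3(-roomSize.x / 2f, v * roomSize.y, (u - 0.5f) * roomSize.z);
            default: return new Vector3( roomSize.x / 2f, v * roomSize.y, (u - 0.5f) * roomSize.z);
        }
    }
}
```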

 

I also used the dimensions of the installation space to position cameras and create render textures for each wall, which I sent to the space's projectors using KlakSpout.
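A sketch of that per-wall output setup; the resolution and camera placement here are illustrative assumptions, and in the project each render texture was then published to the projectors with a KlakSpout sender:

```csharp
using UnityEngine;

// Sketch of one camera and render texture per wall, sized to the projector.
public class WallOutputs : MonoBehaviour
{
    public Camera[] wallCameras;   // one per wall, positioned at the room center
    public Vector2Int projectorResolution = new Vector2Int(1920, 1080);

    void Start()
    {
        for (int i = 0; i < wallCameras.Length; i++)
        {
            var rt = new RenderTexture(projectorResolution.x,
                                       projectorResolution.y, 24);
            wallCameras[i].targetTexture = rt;

            // Face each camera toward its wall (here: 90-degree steps).
            wallCameras[i].transform.rotation = Quaternion.Euler(0f, 90f * i, 0f);
        }
    }
}
```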


VISUAL DESIGN

Once I was happy with the generic flocking behavior, I created three different scenes to prototype the experience. The first, inspired by an asteroid belt orbiting a planet, had flock members follow audience members whenever they came near a wall. The second, inspired by schools of fish, had flock members swim toward audience members and then merge into larger forms when someone touched a wall. The third, and most complex, had a flock of large objects with flocks of smaller objects running along their surfaces.

The models were all procedurally generated in Houdini, and the textures were created in Substance Painter. I also experimented with the Houdini plugin for Unity so that models could be created through the Unity editor. I attempted to make the simulation capable of procedurally generating models in real time, but due to time constraints I primarily used pre-generated models.


CONTROLLING THE SIMULATION

teamLab's installations often need to be adjusted to the particulars of the space and the number of audience members, so I also created a simple UI for adjusting things like the number of flock members, the strength of the forces they experience, and the simulation's visual appearance in real time.
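For example, a slider can be bound to a compute shader parameter at runtime; the property name here is illustrative:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of binding one UI slider to a simulation parameter;
// the real UI exposed several such parameters.
public class SimulationControls : MonoBehaviour
{
    public Slider forceStrengthSlider;
    public ComputeShader flockShader;

    void Start()
    {
        // Push the slider value into the compute shader whenever it changes.
        forceStrengthSlider.onValueChanged.AddListener(value =>
            flockShader.SetFloat("forceStrength", value));
    }
}
```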

TESTING THE INSTALLATION

During the final week of my internship, I was given the opportunity to test the installation in the teamLab Borderless space. I am very proud that the simulation was able to run successfully and respond to audience interactions while being projected on multiple walls. Given more time, I would have loved to fine-tune the simulation to work as effectively as possible in the space, but I am satisfied with what I produced as a proof of concept.
