Flightphase teamed up with The Light Surgeons to implement bespoke software for ‘Voyagers’, a new permanent audio-visual installation for the National Maritime Museum in Greenwich.

‘Voyagers’ features a wave-like projection surface stretching the full width of the room. Made up of 26 triangular facets, with a Puffersphere spherical projector placed at one end, the architecture acts as a canvas for our installation.

The challenge was to introduce the museum’s themes by showcasing the archived collection and to inspire further exploration of the museum. It was also important that the design evoke the sensation of the sea itself. As a permanent exhibition, the design had to be simple and timeless.

The installation is composed of three conceptual layers. The Visual Journeys layer consists of images that cascade across the surface, each set relating to one of the museum’s themes. Beneath it is the constant Ebb & Flow layer: a typographic ocean containing archival metadata and transcripts taken from interviews about the sea. Finally, the Navigation layer is encased within the Puffersphere projector, which continually collects relevant words from the sea and prints them on its surface.

Our role was to translate these conceptual layers into a code-driven visual system. We developed bespoke software for projection mapping, procedural animation and spatial sound, built with the open source coding platform openFrameworks.

The following is an in-depth case study of how we executed the Voyagers project, written from the perspective of Flightphase’s Technical Director, James George.

In the autumn of 2010 I was renting a spare desk at The Light Surgeons’ studio in Hackney while stopping over in London. At the time they were researching how to move forward with a commission they had been awarded by the National Maritime Museum: to create content for a massive structure in the new Voyagers gallery. The 65-foot (20-metre) polygonal object resembled a cascading wave and was met at one end by a Puffersphere spherical projector. The content needed to exhibit the museum’s themes through archival images from the collection as well as depict a representation of the ocean.


CAD model from the architect

Initially the museum intended to use Watchout, a commercially licensed video playback solution. The Light Surgeons and I were convinced that this wasn’t sufficient for precise architectural mapping. After a few rounds of back and forth, the museum agreed that a bespoke software solution was needed and Flightphase was brought in as an official collaborator.

Architecture & Visualization

Our approach was to pre-visualize the entire installation in an openFrameworks-based 3D environment. The preview system would be translated into the actual installation application once onsite, making the final experience faithful to our visualizations. We began with the architect’s CAD drawings, extracting the screen surface geometry and importing it into openFrameworks.
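
A minimal sketch of that previsualization setup, assuming the screen surface is exported from the CAD file as a single mesh (the file name and the single-mesh export are assumptions for illustration, not the production pipeline):

    // Load the architect's exported geometry and orbit around it with a camera.
    #include "ofMain.h"
    #include "ofxAssimpModelLoader.h"

    class PreviewApp : public ofBaseApp {
    public:
        ofxAssimpModelLoader model; // imports the CAD export (dae/obj/3ds)
        ofMesh screenMesh;          // the extracted screen surface geometry
        ofEasyCam cam;              // orbit camera for inspecting the structure

        void setup(){
            model.loadModel("voyagers_screen.dae"); // hypothetical export name
            screenMesh = model.getMesh(0);          // assumes a single-mesh export
        }

        void draw(){
            ofBackground(0);
            cam.begin();
            screenMesh.drawWireframe(); // facets drawn as wireframe for preview
            cam.end();
        }
    };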


openframeworks visualizer

Design & Layout

Considering how to incorporate the museum’s collection within the design, we initially proposed an image analysis system to generate layouts and animations with highly fragmented imagery. We went through several design iterations, as the museum was hesitant about haphazard juxtapositions of their collection. The generative approach was ultimately dropped in favor of a series of specifically designed layouts, each relating to a single theme or visual pattern.


illustrator layout

To support The Light Surgeons’ designers in creating these layouts, we designed a system for processing Adobe Illustrator layouts saved as SVG files. SVG is really just an XML format, which made for easy integration into openFrameworks. The layout template was derived from the architect’s diagrams, and our openFrameworks system parsed the files to copy the linked images, resizing them and placing them within the appropriate facet based on the cropping determined by the template.
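
A sketch of how such an SVG layout might be parsed, using openFrameworks’ ofXml for brevity and assuming Illustrator writes each placed image as an <image> element with x, y, width and height attributes plus an xlink:href pointing to the linked file (the production template also encoded per-facet cropping, which is omitted here):

    #include "ofMain.h"

    struct PlacedImage {
        std::string file;        // path to the linked collection image
        ofRectangle placement;   // position and size inside the template
    };

    // Walk every <image> element in the SVG and record where it was placed.
    std::vector<PlacedImage> parseLayout(const std::string & svgPath){
        std::vector<PlacedImage> placed;
        ofXml svg;
        if(!svg.load(svgPath)) return placed;

        for(auto & node : svg.find("//image")){
            PlacedImage p;
            p.file = node.getAttribute("xlink:href").getValue();
            p.placement.set(ofToFloat(node.getAttribute("x").getValue()),
                            ofToFloat(node.getAttribute("y").getValue()),
                            ofToFloat(node.getAttribute("width").getValue()),
                            ofToFloat(node.getAttribute("height").getValue()));
            placed.push_back(p);
        }
        return placed;
    }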


layout previewed in the visualizer

Triangulation & Animation

In order to animate the layouts we needed to model and subdivide the structure. The design directive from The Light Surgeons was to create a set of very simple animations by subdividing the facets into smaller triangles which fade on and off in cascades based on their position in physical space. We wanted a system that would allow for flexible timing, directions, and triangular patterns so that we could tweak the animations once we were onsite.


sub-triangulated structure

To animate the structure we defined the temporal relationships between the facets by assigning each a flow direction, a sequence position, and a duration.
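
Roughly, the timing model looks something like the sketch below; the names and the half-second fade are illustrative assumptions, not the production code:

    #include "ofMain.h"

    struct Facet {
        float sequencePosition;  // seconds after the cue at which this facet begins
        float duration;          // seconds for the cascade to cross the facet
        // the flow direction is used upstream to compute each sub-triangle's
        // normalized position "along" the facet (0 = leading edge, 1 = trailing)
    };

    // Fade-in amount for one sub-triangle at a given elapsed time.
    float cascadeAlpha(const Facet & f, float along, float elapsed){
        float start    = f.sequencePosition + along * f.duration;
        float fadeTime = 0.5f; // assumed per-triangle fade-in time
        return ofClamp((elapsed - start) / fadeTime, 0.0f, 1.0f);
    }

Each sub-triangle’s “along” value comes from projecting its centroid onto the facet’s flow direction, so the fade travels across the facet in whichever direction that facet is assigned.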


sequencing keyframe interface

To design different animation sets, we defined a single set of parameters and configured it independently for each set. All the varying cascade animations in the installation are derived from different mixtures of the same parameters.


three different animation sets

Ebb & Flow

The Ebb & Flow layer continually churns beneath the layouts as they animate across the surface. The ocean consists of a wave simulation based on Jerry Tessendorf’s ‘Simulating Ocean Water’ paper from 2001. Our particular implementation was derived from the open source Ocean Wave Shader built by the Unity3d community. While developing our approach to Voyagers we were also building our installation Forth, which used the same community-built ocean.


Forth project

Initially we explored how to convert the visual style of the ocean into something conceptually fitting for Voyagers. We iterated through more graphical, high-contrast visualizations of the ocean before finally settling on the concept of using typography to describe the ocean surface.


graphical specular ocean (in Unity3d)


initial typographic ocean rendering (in Unity3d)

Once we were convinced the real time typographic route was the way to go, we converted the Unity3d ocean to C++ in openFrameworks and integrated it into the installation architecture.


view for texture mapping the ocean

The words in the ocean are taken from keywords assigned to each image in the layouts, as well as from transcripts of interviews conducted with the public about their relationship to the sea. The interviews are on display elsewhere in the gallery, so the text provides a conceptual bridge between the two parts of the installation. The final rendering system uses an openFrameworks implementation of the ocean simulation and a custom typography tool.
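
As a rough illustration of the idea, a simple sum-of-sines height field can stand in for the Tessendorf FFT simulation the installation actually uses; the grid spacing, wave constants, and word placement below are arbitrary assumptions:

    #include "ofMain.h"

    // Crude stand-in wave height; the real system evaluates the FFT ocean.
    float oceanHeight(float x, float z, float t){
        return 20 * sin(0.05 * x + t)
             + 12 * sin(0.08 * z + 1.3 * t)
             +  6 * sin(0.04 * (x + z) + 2.1 * t);
    }

    // Lay words onto a grid and lift each one onto the wave surface.
    // Assumes 'words' is non-empty and the camera/lighting are set up elsewhere.
    void drawTypographicOcean(ofTrueTypeFont & font,
                              const std::vector<std::string> & words){
        float t = ofGetElapsedTimef();
        int i = 0;
        for(float z = -500; z <= 500; z += 60){
            for(float x = -800; x <= 800; x += 120){
                const std::string & word = words[i++ % words.size()];
                ofPushMatrix();
                ofTranslate(x, oceanHeight(x, z, t), z); // displace by wave height
                font.drawString(word, 0, 0);
                ofPopMatrix();
            }
        }
    }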

Both the typography visualization tools and the openFrameworks ocean simulation are available as open source on our GitHub page.

Puffersphere Navigational Layer

As the images cascade across the surface of the structure, the Puffersphere projector seated at the end of the wave collects keywords that relate to each image and prints them on the surface of the sphere. By the end of each cascade, a set of words describing the theme connecting the images has accumulated. All the while, the sphere appears to float atop the waves cascading beneath it, an internal gyroscopic force keeping it upright. During periodic phases the Puffersphere assumes the image of the moon and constellations as the sea flows beneath.

The Puffersphere presented a massive design and technical challenge. Our goal was to use typography, video, and animations to depict the Puffersphere as an abstract machine inspired by the gyrocompasses and armillary spheres contained within the museum’s archive. But the first task was to come up with a way to depict anything at all!

suffering with the sphere

The sphere is internally projected from below with a stock projector and a custom wide-angle lens. This means two-dimensional images must be distorted such that, when projected into the globe, they appear correct on its spherical surface. We used a lot of information about spherical projection from Paul Bourke, a bit of sample code provided by Pufferfish, and a lot of trial and error to come up with a system.
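
The core of that system is a mapping from flat artwork to the fisheye image the projector needs. Below is a rough CPU sketch, assuming equirectangular source artwork and an azimuthal output disc; the actual ofxPuffersphere addon does this on the GPU and accounts for the lens’ real field of view:

    #include "ofMain.h"

    // For each output pixel, convert its position on the disc to a polar angle
    // and azimuth on the sphere, then sample the equirectangular source there.
    void equirectangularToFisheye(const ofPixels & src, ofPixels & dst, int size){
        dst.allocate(size, size, OF_PIXELS_RGB);
        float cx = size * 0.5f, cy = size * 0.5f;
        for(int y = 0; y < size; y++){
            for(int x = 0; x < size; x++){
                float dx = (x - cx) / cx;   // -1..1 across the output disc
                float dy = (y - cy) / cy;
                float r  = sqrt(dx * dx + dy * dy);
                if(r > 1){ dst.setColor(x, y, ofColor::black); continue; }
                float theta = r * PI;        // radius -> polar angle (0..180 deg)
                float phi   = atan2(dy, dx); // azimuth around the sphere
                int sx = ofWrap(phi / TWO_PI, 0, 1) * (src.getWidth()  - 1);
                int sy = (theta / PI)             * (src.getHeight() - 1);
                dst.setColor(x, y, src.getColor(sx, sy));
            }
        }
    }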


debug view of our Puffersphere framework

The process we designed allowed us to animate in a format native to the Puffersphere. It also let us preview the animations in several different formats depending on the content. The format is also compatible with many freely available assets designed for computer rendering, such as the moon and earth images we used.


debug view of the navigational layer designs

We’ve published an openFrameworks addon that encapsulates our way of distorting assets for the sphere display in hopes of saving other designers from going through what we did.

ofxPuffersphere source code on GitHub.

Mapping

We split the visualizer application into pieces, running the Puffersphere and facet renderers across the machines that drive the installation. Even with optimizations, the visualizer won’t run at full framerate, as it all runs on one computer. By splitting the application across several machines we get much higher resolution and faster framerates.


full installation in the visualizer

The individual renderer applications use a custom mapping interface that allows for near-perfect alignment and distortion correction per facet.
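
The underlying idea is per-facet corner-pinning: each facet’s content is rendered to a texture and drawn as a triangle whose screen-space corners are nudged until they line up with the physical facet. A minimal sketch, with the onsite interaction for moving corners omitted:

    #include "ofMain.h"

    struct MappedFacet {
        ofTexture content;       // pre-rendered content for this facet
        glm::vec2 corners[3];    // screen positions, adjusted until aligned onsite

        void draw(){
            // texture coordinates stay fixed; only the screen corners move
            glm::vec2 tex[3] = { glm::vec2(0, 0),
                                 glm::vec2(content.getWidth(), 0),
                                 glm::vec2(content.getWidth(), content.getHeight()) };
            ofMesh m;
            m.setMode(OF_PRIMITIVE_TRIANGLES);
            for(int i = 0; i < 3; i++){
                m.addVertex(glm::vec3(corners[i], 0));
                m.addTexCoord(tex[i]);
            }
            content.bind();
            m.draw();
            content.unbind();
        }
    };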


mapping interface for the far left projector


mapping mode onsite

Sound & Synchronization

The soundscape is created by programmatically triggering sound design elements as the installation plays out. We use a SuperCollider application and OSC communication for multichannel playback.
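
A sketch of how the installation might cue SuperCollider from openFrameworks using the ofxOsc addon; the OSC address and arguments are illustrative assumptions, not the actual protocol (SuperCollider listens on port 57120 by default):

    #include "ofMain.h"
    #include "ofxOsc.h"

    class SoundCues {
    public:
        ofxOscSender sender;

        void setup(){
            sender.setup("127.0.0.1", 57120); // SuperCollider's default language port
        }

        // called when a cascade begins on a given facet
        void triggerCascade(int facetIndex, const std::string & theme){
            ofxOscMessage m;
            m.setAddress("/voyagers/cascade"); // hypothetical address
            m.addIntArg(facetIndex);
            m.addStringArg(theme);
            sender.sendMessage(m);
        }
    };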

The surface is covered by seven projectors run across six equivalent commodity-level Windows PCs. We use a modified version of Dan Shiffman’s Most Pixels Ever library to synchronize application playback across the computers. We’ve published our modifications to the system and have also built a parameter sharing component.

Most Pixels Ever additions
ofxNetworkedParameters

Installation Credits

Direction & Production: The Light Surgeons
Bespoke Software Design & Generative Animation: Flightphase

The Light Surgeons
Creative Direction: Christopher Thomas Allen
Producer: Alice Ceresole
Design & Animation: Tim Cowie & Dave Baum
Sound design: Jude Greenaway

Flightphase
Technical Direction & Lead Software Development: James George
Software Development: Timothy Gfrerer