Conversations@the Studio

Technology

Projection system

Conversations@the Studio utilises a 3 m dome as the surface for a 180 degree projection, made possible by a projector equipped with a fisheye lens. The lens sits close to the dome's centre of curvature to ensure full coverage of the screen with minimal distortion. The size and shape of the projection set-up fill the peripheral vision of a user standing directly in front of it and thus ensure a truly immersive experience.
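
As a rough illustration of the geometry (not the installation's actual calibration), an equidistant fisheye model maps the angle from the optical axis linearly to image radius, so a point 90 degrees off-axis lands on the rim of the image circle. The parameters in the sketch below are assumed values.

import math

def fisheye_project(theta, phi, image_radius_px=1024, fov_deg=180.0):
    """Equidistant fisheye model (an assumption, not the installation's calibration):
    theta = angle from the optical axis, phi = azimuth around it.
    Returns pixel offsets (x, y) from the image centre."""
    r = image_radius_px * theta / math.radians(fov_deg / 2.0)
    return r * math.cos(phi), r * math.sin(phi)

# A point 90 degrees off-axis (the dome rim) lands on the edge of the image circle.
print(fisheye_project(math.radians(90), 0.0))           # (1024.0, 0.0)
print(fisheye_project(math.radians(45), math.pi / 2))   # (~0.0, 512.0)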

[Figure: Installation 3D model]

360 degree spherical recording

The video for Conversations@the Studio was shot with a Ladybug camera system from Point Grey Research, which captures digital spherical recordings with a 360 degree horizontal and 240 degree vertical field of view. The camera has a tightly packed cluster of 6 CCD sensors with wide angle lenses and a slight overlap between the images. In post production the individual frames were colour and geometrically corrected and stitched into a high resolution equirectangular image (3600×1800 pixels). Custom algorithms were used to double the frame rate from 15 (the camera’s maximum frame rate) to 30 frames per second. The final delivery format of this 30 min movie is MPEG-2 at a very high bit rate to allow for high quality playback.
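
The frame-rate doubling can be pictured with the minimal sketch below, which simply blends neighbouring frames. The actual custom algorithms were likely more sophisticated (e.g. motion compensated), so this is only an illustration of the idea.

import numpy as np

def double_frame_rate(frames):
    """Insert one interpolated frame between each pair of source frames,
    turning 15 fps into 30 fps. Plain 50/50 blending is a simplification;
    the production pipeline presumably used a more refined interpolation."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        out.append(((a.astype(np.float32) + b.astype(np.float32)) / 2).astype(a.dtype))
    out.append(frames[-1])
    return out

# Example with tiny stand-in equirectangular frames (height x width x RGB).
frames = [np.full((18, 36, 3), v, dtype=np.uint8) for v in (0, 60, 120)]
print(len(frames), "->", len(double_frame_rate(frames)))  # 3 -> 5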

[Figure: Recording]

Directional audio recording

The sound was recorded using a 4 channel SoundField microphone accompanied by a 5.1 Holophone. In post production the individual directional sound channels were mapped to a point of view within the video and synchronised for playback.
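
A SoundField microphone delivers a first-order B-format signal (W, X, Y, Z), so relating the directional channels to a point of view amounts to a rotation of that sound field. The sketch below illustrates a yaw rotation under an assumed sign convention; it is not the installation's actual mapping code.

import math

def rotate_bformat_yaw(w, x, y, z, yaw_rad):
    """Rotate a first-order B-format frame (W, X, Y, Z) about the vertical axis.
    The sign convention is an assumption; W (omni) and Z (height) are
    unaffected by a pure yaw rotation."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return w, c * x - s * y, s * x + c * y, z

print(rotate_bformat_yaw(1.0, 1.0, 0.0, 0.0, math.radians(90)))
# -> (1.0, ~0.0, 1.0, 0.0): a source straight ahead moves to the side.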

Playback engine

A custom-built 3D software engine streams the high resolution video from a disk array and uses it as a texture map projected onto the inside of a sphere. The geometric correction for the fisheye lens is carried out through a high resolution distortion mesh in the render pipeline.
The video texture is translated inside the sphere according to the user's point of view, which simulates “looking around” within the scene.
A look-up table supplies the playback engine with the location and time/duration of video close-ups or hot-spots. If the point of view falls within a hot-spot's area during its time window, a close-up is superimposed as a second layer on top of the main movie.
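
A minimal sketch of how such a look-up table might drive the close-up layer is shown below; the data-structure fields, function names and clip name are illustrative, not taken from the actual engine.

from dataclasses import dataclass

@dataclass
class HotSpot:
    """One entry of the look-up table: where the close-up lives on the sphere
    (yaw/pitch extent, degrees) and when it is active (seconds).
    Field names are illustrative."""
    yaw_min: float
    yaw_max: float
    pitch_min: float
    pitch_max: float
    t_start: float
    t_end: float
    clip: str

def active_closeup(table, view_yaw, view_pitch, t):
    """Return the close-up clip to superimpose, if the current point of view
    falls inside a hot-spot both spatially and in time."""
    for h in table:
        if (h.t_start <= t <= h.t_end and
                h.yaw_min <= view_yaw <= h.yaw_max and
                h.pitch_min <= view_pitch <= h.pitch_max):
            return h.clip
    return None

# Hypothetical table entry and queries.
table = [HotSpot(30, 60, -10, 20, 120.0, 150.0, "closeup_01.mpg")]
print(active_closeup(table, 45, 5, 130.0))   # "closeup_01.mpg"
print(active_closeup(table, 45, 5, 200.0))   # None (outside the time window)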

[Figure: Distorted video texture]

Interaction

The spherical movie is projected inside the dome with a coverage of 180 degrees. A trackball integrated into the projector stand allows the user to rotate the projection freely while the movie keeps playing. The multi-channel sound field is rotated together with the image: a simple vector-based panning algorithm distributes the multi-channel sound directionally, dependent on the user's principal point of view.
A control button activates a voice-over commentary channel whose content relates to the point of view and the time within the main spherical movie. This voice channel is mixed into the centre channel of the directional ambient audio. A second control button allows fast-forward playback of the movie.
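
The panning step could look roughly like the sketch below, which weights each speaker of an assumed horizontal ring by the angle between the speaker and the current point of view; the speaker layout and gain law are assumptions, not the installation's exact algorithm.

import math

def pan_gains(view_yaw_deg, speaker_yaws_deg):
    """Simple vector-style panning: weight each speaker by the cosine of the
    angle between its direction and the point of view (negative weights are
    clipped), then normalise so the gains sum to one."""
    weights = []
    for s in speaker_yaws_deg:
        d = math.radians(view_yaw_deg - s)
        weights.append(max(0.0, math.cos(d)))
    total = sum(weights) or 1.0
    return [w / total for w in weights]

# Four speakers around the dome; the view is turned 45 degrees to the left.
print(pan_gains(45.0, [0.0, 90.0, 180.0, 270.0]))
# -> front and left speakers share the signal, rear speakers get none.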

[Figure: Integrated projector stand]