30 May 2013 marked the final demonstration of the full FascinatE system. The project has developed a complete end-to-end future broadcast system which combines ultra-high-definition panoramic video, 3D Ambisonic and object-based audio, new methods for delivering interactive AV content, and new user interfaces for interacting with the AV media at the user end. All of these aspects were on show in a live demonstration of the complete integrated system.
The demonstration event was hosted by the University of Salford at its MediaCityUK building, one of the few venues with the complete set of facilities needed to support the demonstration. The building's infrastructure was designed for exactly this kind of work, and it was tested to the limit by the FascinatE team for this event. FascinatE partners worked through the nights, and the fantastic tech support team at MediaCityUK pulled out all the stops to make everything run smoothly.
The event featured a live performance in the Digital Performance Lab (DPL), which was recorded and streamed live to a conference suite (The Egg) on the other side of the building. The live demonstration was built around the premiere performance of 'Deeper than all roses', the latest large-scale music composition from Stephen Davismoon, featuring 'rock' band Bears?Bears! and live performance artists Joseph Lau and Shona Roberts, celebrating the works of the American poet E.E. Cummings.
Visitors were invited to view the performance being relayed live to the Egg presentation room. The Egg was equipped with a large data projector and an Ambisonics spatial audio rendering system consisting of 17 loudspeakers. The live demonstration was divided into three phases, allowing specific aspects of the technology to be highlighted: the full-bandwidth interactive experience, content streaming on a lower-bandwidth network, and the virtual director. Visitors were also invited to look around the many offline demos on display in the foyer of the MediaCityUK building.
The performance was captured with the OmniCam panoramic camera, using the latest version comprising 10 HD cameras capable of producing a 10k x 2k 360-degree panoramic image; 5 of these cameras were used for the demo to produce a 180-degree live video panorama. A manned HD broadcast camera placed close to the OmniCam captured close-ups of particular areas of interest. The video from this camera was ingested into an IP-based production framework, running a plug-in that generated camera metadata describing pan, tilt and zoom, derived by tracking background features in the image.
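The quoted figures hang together with some simple arithmetic. As a back-of-envelope sketch (the exact sensor layout is not spelled out here; the numbers below assume 1080p cameras mounted in portrait orientation, which is one common way to reach a ~10k x 2k stitch):

```python
# Hypothetical geometry check for an OmniCam-style rig.
# Assumption: each 1920x1080 HD camera is rotated to portrait,
# contributing ~1080 px of panorama width and ~1920 px of height.
CAMERAS_TOTAL = 10
CAMERAS_USED = 5
PORTRAIT_W, PORTRAIT_H = 1080, 1920

full_width = CAMERAS_TOTAL * PORTRAIT_W        # 10800 px, i.e. "10k"
full_height = PORTRAIT_H                       # 1920 px, i.e. "2k"
degrees_per_camera = 360 / CAMERAS_TOTAL       # 36 degrees per camera
demo_fov = CAMERAS_USED * degrees_per_camera   # 180 degrees with 5 cameras

print(full_width, full_height, degrees_per_camera, demo_fov)
```

With these assumptions, 10 cameras give a full 360-degree, roughly 10k x 2k panorama, and half of them cover exactly the 180 degrees used in the demo.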
Audio was captured using an Eigenmike to generate a 4th order Ambisonic representation of the ambient sound in the performance room. Six additional audio objects were also captured using mics on the guitar amplifiers, drum kit and the PA speakers used by the singers. The DPL also housed a WiFi access point, providing network access to a set of tablet computers that visitors could use to explore the live panoramic image.
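The channel count implied by a 4th-order capture follows from the standard full-sphere Ambisonics formula, (N + 1)^2 channels for order N:

```python
def ambisonic_channels(order: int) -> int:
    """Number of channels in a full-sphere (periphonic) Ambisonic
    representation of the given order: (N + 1)^2."""
    return (order + 1) ** 2

# A 4th-order capture such as the Eigenmike's therefore carries
# 25 Ambisonic channels, alongside the six separate audio objects.
print(ambisonic_channels(4))  # 25
```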
All of this AV data was streamed in real time to a range of displays, from TV sets and large projectors to a Christie tiled wall in the building foyer and iOS and Android mobile devices. Visitors at the demo could freely pan and zoom around the panorama, controlling their own virtual camera by swiping on tablets or, for the larger displays, by intuitive hand gestures using a Kinect.
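The "virtual camera" idea behind that interaction can be sketched simply: a swipe or gesture moves a crop window over the panorama, and the window is clamped so the view never leaves the image. This is an illustrative sketch, not FascinatE code; the dimensions and fixed 16:9 output aspect are assumptions:

```python
def pan_zoom(cx: float, cy: float, zoom: float,
             pano_w: int = 10800, pano_h: int = 1920,
             aspect: float = 16 / 9):
    """Return the (x, y, w, h) crop for a virtual camera centred on
    (cx, cy) with the given zoom factor (1.0 = widest view).

    The crop keeps a fixed output aspect ratio and is clamped so it
    always stays inside the panorama.
    """
    w = pano_w / zoom
    h = min(w / aspect, pano_h)   # never taller than the panorama
    w = h * aspect                # re-derive width from clamped height
    x = min(max(cx - w / 2, 0), pano_w - w)
    y = min(max(cy - h / 2, 0), pano_h - h)
    return x, y, w, h
```

A swipe then just shifts (cx, cy), and a pinch changes the zoom factor; each device renders its own crop independently of every other viewer.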