Focal Point VR recently provided the live-streaming technology for a product launch: a major brand showcasing a fast-rising band at an exclusive event in North London, before an audience of VIPs and competition winners.
Nothing very new so far, despite the quality of the product and the band. The innovation was that the event – the UK launch of the Samsung S8 smartphone, with Royal Blood performing – was streamed live in high-quality 360-degree video. The results were excellent in terms of both views (over 8.5M so far) and engagement.
360-degree video lets viewers immerse themselves completely in the stream. Even this simple level of interaction deepens engagement: viewers have something to do, somewhere else to look and an instant reward for focusing on the stream. But there's much more to it than that.
Audiences are cynical. Years of carefully crafted video direction, Photoshop and post-processing have inoculated viewers against believing what they're seeing. However, the viewer's ability in 360° to look anywhere, under their own control, gives the format dramatically more authenticity.
There is a price to pay from the content creator's point of view, though: that same authenticity makes it harder to hide what's going on. Camera angles and zooms are hard to implement, cuts can be disorienting and moving the camera can make viewers uncomfortable. Lighting and staging for the Samsung event were set very carefully, specifically for 360-degree capture; there are definitely new techniques and approaches to learn for any live event where 360-degree video is part of the output.
To date, live streaming 360-degree video at high quality has faced some significant challenges. Creating a single all-round video stream from the multiple 4K cameras needed for a broadcast-quality 360° stream is a considerable processing task.
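To get a feel for the scale of that stitching task, here is a back-of-envelope sketch in Python. The camera count, resolution and frame rate are assumptions for illustration, not Focal Point VR's actual rig.

```python
# Illustrative figures only (assumed, not the actual production setup):
# four 4K UHD cameras at 30 fps feeding a real-time stitcher.
CAMERAS = 4
WIDTH, HEIGHT = 3840, 2160   # 4K UHD per camera
FPS = 30

pixels_per_second = CAMERAS * WIDTH * HEIGHT * FPS
print(f"{pixels_per_second / 1e9:.1f} Gpixels/s into the stitcher")  # ~1.0 Gpixels/s
```

Every one of those pixels has to be warped, blended and re-encoded in real time, which is why broadcast-quality 360° stitching demands serious hardware at the venue.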
The decision about where to look has to stay with the viewer. Network latency is too high for that decision to be sent back to the server so that it can narrow the field of view it streams; attempts to do so usually produce a very juddery experience.
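Keeping that decision client-side means the player receives the full panoramic frame and samples it locally as the viewer's head moves. A minimal sketch of the core mapping, assuming the common equirectangular projection (the function name and frame size are hypothetical, for illustration):

```python
import math

def equirect_uv(yaw, pitch, width, height):
    """Map a view direction (radians) to pixel coordinates in an
    equirectangular 360 frame. Because the whole frame is already on
    the device, this lookup happens locally: head movement never has
    to round-trip to the server."""
    u = (yaw / (2 * math.pi) + 0.5) * width    # longitude -> x
    v = (0.5 - pitch / math.pi) * height       # latitude  -> y
    return u % width, min(max(v, 0.0), height - 1)

# Looking straight ahead lands in the centre of the frame.
print(equirect_uv(0.0, 0.0, 3840, 1920))  # (1920.0, 960.0)
```

The cost of this responsiveness is that the stream must carry the entire sphere at all times, even though the viewer only ever sees a fraction of it.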
So a high-quality stream requires a lot of processing power and clever software at the source, plenty of bandwidth up to the cloud, and then a highly optimised stream down to the viewer.
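The gap between the raw stitched output and what actually goes up to the cloud shows why encoding efficiency matters so much here. A rough comparison, using assumed figures (the frame size and the encoded bitrate are illustrative, not measured from this event):

```python
# Rough, assumed numbers: raw stitched output vs a plausible encoded
# contribution bitrate for a 4K 360 live stream.
WIDTH, HEIGHT, FPS = 3840, 1920, 30     # stitched equirectangular frame (assumed)
BITS_PER_PIXEL = 24                     # uncompressed 8-bit RGB

raw_mbps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL / 1e6
encoded_mbps = 25                       # plausible H.264/HEVC uplink rate (assumption)
print(f"raw ~ {raw_mbps:,.0f} Mbit/s vs encoded ~ {encoded_mbps} Mbit/s")
```

Compressing a multi-gigabit raw feed down to a few tens of megabits, live and without visible artefacts, is exactly the "clever software at the source" the pipeline depends on.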
STUDIO: Focal Point VR