A cornerstone of the exhibition, the IBC Future Zone is home to exhibitors with cutting-edge projects and prototypes liberated by IBC from the world’s leading R&D labs and universities.
Featuring a range of exhibitors from across the globe, this is a must-see during your time at IBC. Discover more of what will be available with our exhibitor highlights.
BBC R&D – 8.F20
At the heart of IP Studio is the idea that every piece of video, audio and data is given a unique identifier and a timestamp as soon as it is captured or created. This allows users to find and synchronise the most appropriate content wherever it is needed.
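The identify-and-timestamp idea behind IP Studio can be sketched in a few lines of Python. This is a hypothetical illustration, not BBC R&D’s actual API: `MediaGrain` and `synchronise` are invented names, and the point is simply that once every captured unit carries a unique ID and a timestamp, aligning content from different streams reduces to matching timestamps.

```python
import uuid
from dataclasses import dataclass, field

@dataclass(frozen=True)
class MediaGrain:
    """A unit of video, audio or data, tagged at the moment of capture.

    Hypothetical sketch: the identifier and timestamp travel with the
    content, so any downstream consumer can find and align it later.
    """
    payload: bytes
    timestamp_ns: int                      # capture time in nanoseconds
    grain_id: str = field(default_factory=lambda: str(uuid.uuid4()))

def synchronise(streams, target_ns):
    """From each stream, pick the grain closest to a target timestamp."""
    return [min(s, key=lambda g: abs(g.timestamp_ns - target_ns))
            for s in streams]

# 25 fps video frames and 48 kHz audio chunks captured independently:
video = [MediaGrain(b"v", t) for t in (0, 40_000_000, 80_000_000)]
audio = [MediaGrain(b"a", t) for t in (0, 21_333_333, 42_666_666)]
aligned = synchronise([video, audio], target_ns=40_000_000)
```

Because the alignment depends only on the timestamps, the same lookup works wherever the content ends up, which is the property the IP Studio approach is after.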
BBC R&D is demonstrating various instances of this approach. CAKE (Cook-Along Kitchen Experience) is a real-time, interactive cookery show that adapts as you cook with it, using a mix of AV and text. This is the broadcaster’s first wholly object-based experiment from production to experience.
IBC visitors can view projects created as interactive VR worlds, including Aardman Animations’ We Wait, which puts viewers on the front line of a migrant’s perilous sea crossing from Turkey to Greece; the fantasy The Turning Forest, which incorporates interactive spatial sound produced with filmmaker Oscar Raby; and The Vic, which tests TV on-set environment modelling by letting users explore perhaps Britain’s most famous pub, The Queen Vic from EastEnders.
Minglvision – 8.F29
Pressing a ‘red button’ to view alternative camera angles can mean waiting five to ten seconds for the new video to start.
“Our number one goal is to let users switch between camera angles really, really fast, so there is no buffering,” says Artem Kiselev, CEO and Founder, Minglvision.
The London-based start-up claims to have developed the first truly personalised multi-angle/multi-game broadcasting player and infrastructure.
It has been trialled during several Formula 1 races, with results that lead Kiselev to claim Minglvision will increase average ratings and advertisement open rates by 40%.
“Essentially, for the first time, people can actually interact with what they’re watching,” he says.
Dreamspace – 8.F05
“The ability to assemble, play with and even improvise virtual productions will have profound creative consequences for the media industry, performance groups and creative artists,” says Sara Coppola, Project Manager at The Foundry for Dreamspace.
The EC-funded project is showcasing innovation in depth capture, live compositing, a collaborative virtual production environment and results from experimental productions and performances conducted so far.
Project member Ncam’s technique allows live-action content to be positioned in depth, making it possible for performers to be placed inside a virtual environment rather than just composited as a foreground overlay. iMinds is developing new depth-aware 360° video stitching, and The University of Saarland is developing a renderer to solve high-performance computing problems.
TrueDR – 8.F27
“High Dynamic Range (HDR) clearly provides an enhanced viewing experience, but you may have to watch in a dark room to fully appreciate it. The reason is that TVs currently marketed as HDR do not, in fact, provide an overall brighter image,” says Alan Chalmers, a professor of visualisation at Warwick University.
TrueDR’s solution in the Future Zone combines a novel codec with a 10,000-nit display developed by Italy’s SIM2, the first time this has been shown.
“Current HDR pipelines are ‘display referred’, constrained by the peak luminance that consumer HDR displays are capable of showing, typically 1,000 nits,” explains Chalmers. “Our technique is ‘scene referred’, in which the full range of lighting captured in a scene is transmitted along the pipeline and the best image possible delivered for a given display, the ambient lighting conditions, and any creative intent.”
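The difference Chalmers describes can be illustrated with a toy mapping; this is not TrueDR’s actual codec, just a hard clip standing in for a real perceptual transfer function. In a scene-referred pipeline the full captured luminance travels unchanged, and adaptation to the display’s peak happens only at the end:

```python
def display_map(scene_nits: float, display_peak_nits: float) -> float:
    """Clip scene-referred luminance (in nits, cd/m^2) to what a given
    display can show. Illustrative only: real pipelines use perceptual
    transfer functions and metadata, not a hard clip."""
    return min(scene_nits, display_peak_nits)

# The same captured highlight is carried through the pipeline once,
# then mapped per display at the very end:
sun_glint = 20_000.0                              # scene luminance
on_consumer_tv = display_map(sun_glint, 1_000.0)  # typical consumer HDR TV
on_sim2 = display_map(sun_glint, 10_000.0)        # 10,000-nit display
```

A display-referred pipeline, by contrast, would apply the 1,000-nit limit before transmission, so the brighter display could never recover the clipped detail.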
SAM – 8.F26
Socialising Around Media (SAM) is about content finding the user’s second screen through syndication. The EU-funded multi-partner project, coordinated by TIE Kinetix, aims to change the ‘inactive viewer’ into a ‘proactive prosumer’ with social media interaction and decision making on content. Demonstrated in the Future Zone are a smart TV native app with multiscreen related content on various mobile devices, as well as a second-screen production platform for producing a second-screen experience.
The latter can import and suggest metadata from internal data repositories and external content such as Wikipedia, and synchronises the related content with the primary video. While the EU’s FP7 project ends shortly after IBC, next steps focus on exploring opportunities for actual deployment. “Adding audio mining technologies and perhaps even video recognition would greatly enhance the effectiveness of automating synchronisation,” SAM says.
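One simple way such a platform could synchronise related content with the primary video is a sorted list of timed cue points, with the second screen looking up the latest cue at or before the current playback position. The names below are hypothetical, not SAM’s actual interface:

```python
import bisect

def related_for(position_s, cue_points):
    """Return the related-content item active at a playback position.

    Hypothetical sketch: cue_points is a list of (start_seconds, payload)
    pairs sorted by start time, e.g. metadata suggested from Wikipedia.
    We pick the latest cue starting at or before the position.
    """
    starts = [start for start, _ in cue_points]
    i = bisect.bisect_right(starts, position_s) - 1
    return cue_points[i][1] if i >= 0 else None

cues = [
    (0, "Show intro"),
    (95, "Actor biography"),
    (310, "Filming location"),
]
current = related_for(120, cues)   # playhead at 2:00
```

The second screen only needs the playhead position from the primary device to stay in step, which is why automated metadata suggestion and accurate timing are the parts SAM highlights for improvement.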