Broadcasting used to rely solely on proprietary technologies. That isn't surprising considering the massive amounts of real-time information that broadcasters deal with on a daily basis. But now, broadcasting is merging, in a sense, with IT technologies.
Early on in the days of IT, computer people were amazed when they could process and display video, even if the earliest demonstrations showed video the size of a postage stamp. Today, the offspring of those earliest personal computers have given rise to nonlinear editing systems with the power to produce a Hollywood blockbuster on a laptop.
Today, the penetration of IT into the broadcast infrastructure may seem commonplace. It's happening on a global scale with the increase of open IT (software that isn't closed to use or manipulation by a third party) within broadcast.
But what is IT and what does it hold for broadcasters in 2013?
While the IT world is vast, broadcasters have concentrated on specific areas: networking, processing, and storage, with the latest trend being cloud storage for non-real-time/near-line solutions. Obviously, this changes the paradigm of what many of us thought of as the heart of broadcasting's infrastructure just a few years ago. Networking opens up new options for broadcasters, but equipment manufacturers need a good understanding of the operational constraints within the broadcast workflow in order to understand what an IT-based infrastructure can deliver at the application level.
Throughout its history, the broadcasting industry has gone through a variety of technology transitions: black and white to colour, mono to stereo, stereo to multi-channel, and most important, analogue to digital and standard definition to high definition. Today, the transition is from hardware-centric to software-centric solutions.
For example, while many broadcast equipment manufacturers have been leveraging IT for years in individual products, the industry is now moving towards integration at the application layer, which is responsible for communications protocols and process-to-process communications across a network. As infrastructures become more computerised, manufacturers will be able to offer users an integrated system across operational, technical, and business levels.
File-based media has become the killer app for many consumer-centric companies, with storage and archive solutions allowing tens of thousands of users to share millions of clips or full programmes worldwide. Just consider, for a moment, the IT power behind names such as YouTube, Netflix, Facebook, and other media-centric and social networking channels.
Broadcasters can reap tremendous capabilities by leveraging IT and off-the-shelf hardware, but only when it is properly optimised for broadcasters' I/O, rendering, and infrastructure requirements with dedicated software applications. This mix of IT and broadcast technologies is needed because there are special considerations for an industry that requires real-time, high-data-rate files to be exactly where they need to be, at exactly the right time and with no errors. In addition, the requirement to handle access and management of the media reinforces the need for a strong hardware and software infrastructure.
One area where IT has already had a major impact on broadcasting is in servers. Initially called disk recorders, the first devices made it possible to record and play at the same time and to allow multiple users to access the same content seconds after recording.
Continued development further leveraged IT, with disk recorders that were open, scalable, and flexible, providing a foundation for a dynamic environment and today's video and media servers.
Today, leveraging IT has lowered the cost of high-speed server networking through common technologies such as Gigabit Ethernet. This not only reduced prices compared with earlier servers and disk recorders, which used more expensive Fibre Channel technology, but also opened up servers for use with nonlinear editing systems. The iSCSI (Internet SCSI) networking protocol, running over Gigabit Ethernet, allows servers equipped with these technologies to exchange and deliver the high-bandwidth streams associated with real-time video.
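As a rough illustration of why commodity Gigabit Ethernet is attractive for this job, the short sketch below works through the bandwidth arithmetic in Python. The figures are assumptions for the sake of the example, not measurements or vendor specifications: a 70 per cent usable-capacity allowance for Ethernet/IP/iSCSI overhead and a 120 Mbit/s high-bit-rate HD stream.

```python
# Illustrative back-of-the-envelope check (assumed figures, not vendor specs):
# can a single Gigabit Ethernet link carry several real-time HD streams over iSCSI?

GIGE_LINK_MBPS = 1000     # raw Gigabit Ethernet line rate, Mbit/s
USABLE_FRACTION = 0.7     # assumed headroom after Ethernet/IP/iSCSI overhead
STREAM_MBPS = 120         # assumed bit rate of one high-bit-rate HD stream, Mbit/s

usable_mbps = GIGE_LINK_MBPS * USABLE_FRACTION
streams = int(usable_mbps // STREAM_MBPS)

print(f"Usable capacity: {usable_mbps:.0f} Mbit/s")
print(f"Simultaneous real-time HD streams supported: {streams}")
```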
Most recently, media servers have come to market with high resolution and high performance, along with the ability to create and deliver low-resolution proxy content for browsing and media management. In the proper circumstances, these low-resolution proxies can be used for the complete workflow. This gives broadcasters a great amount of flexibility, as low-resolution proxies can easily be used on laptops with nonlinear editing software and uploaded or downloaded almost anywhere an internet connection is available. The high-resolution images are only needed for the final output, based on the edit decision list created with the low-resolution content.
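To make the proxy idea concrete, here is a minimal Python sketch of a proxy-based edit that is only conformed against the high-resolution originals at final output. The clip IDs, file paths, and EDL structure are hypothetical; the point is simply that the same edit decisions resolve to either proxy or high-resolution media.

```python
# Minimal sketch: edit against low-resolution proxies, conform against high-res
# originals only at final output. Names and paths are hypothetical.

from dataclasses import dataclass

@dataclass
class Event:
    clip_id: str   # identifies the source clip (same ID for proxy and hi-res)
    src_in: int    # source in-point, in frames
    src_out: int   # source out-point, in frames

# Edit decision list built while cutting against proxies, e.g. on a laptop.
edl = [
    Event("CLIP_0001", src_in=120, src_out=480),
    Event("CLIP_0002", src_in=0,   src_out=250),
]

def conform(edl, resolve):
    """Replay the same cuts against whatever media the resolver returns."""
    return [(resolve(e.clip_id), e.src_in, e.src_out) for e in edl]

# During the edit: resolve clip IDs to low-resolution proxy files.
proxy_timeline = conform(edl, lambda cid: f"/proxy/{cid}_lowres.mp4")
# At final output: resolve the identical EDL to the high-resolution originals.
master_timeline = conform(edl, lambda cid: f"/hires/{cid}.mxf")

print(master_timeline)
```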
The industry has also seen the integration of different operating systems for mission-critical 24/7 playout applications, namely Linux and the Unix-like QNX, to avoid the issues associated with Windows operating systems.
Another example is one where broadcasters never thought that IT could deliver: cameras.
CMOS imagers, which are more like integrated circuits, are becoming more common compared to traditional CCD sensors. While CMOS is used in everything from cellphones to DSLR cameras, its major obstacle to mass adoption in broadcast is what's called rolling shutter.
The rolling shutter of the regular CMOS imager produces images similar to scanning from a Plumbicon tube, because of line-by-line scanning. Shorter exposure times will increase the visibility of the skewing effect.
To meet the unique needs of broadcasters, global shutter behaviour (similar to that of IT CCDs) is just being introduced in CMOS-based cameras designed specifically for broadcast. The global shutter exposes each frame for 1/50 or 1/60 of a second and shows some blurring on fast-moving objects. Shorter exposure times can be used to get sharper pictures; however, this can create some shuttering effects. But unlike IT CCDs, this new generation of CMOS imagers does not produce any highlight smear under any conditions and can also integrate multiple A/D converters, timing, and read-out circuits on the CMOS chip itself.
Leveraging IT in cameras also means that various models of the same camera family can be physically the same right out of the box, with software controlling the feature set of individual camera models. By simply upgrading the software, production format flexibility and features can be increased.
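A small, purely hypothetical sketch of what software-defined camera models can look like in practice: identical hardware, with the feature set selected per model and a software upgrade unlocking additional formats. All model and feature names here are invented for illustration.

```python
# Hypothetical sketch of software-defined camera models: identical hardware,
# with the shipped feature set selected (and later extended) purely in software.

BASE_FEATURES = {"1080i50", "1080i59.94", "paint_control"}

MODEL_PROFILES = {
    "studio":  {"720p", "1080p"},              # assumed extra formats for a studio model
    "premium": {"720p", "1080p", "slow_motion"},
}

def features_for(model, software_version=1):
    enabled = BASE_FEATURES | MODEL_PROFILES.get(model, set())
    if software_version >= 2:                  # a software upgrade unlocks new capability
        enabled |= {"1080p50_3G"}
    return sorted(enabled)

print(features_for("studio", software_version=1))
print(features_for("studio", software_version=2))  # same hardware, more formats
```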
The key is the ability to apply the modern IT infrastructure to the broadcast environment. It can't be a straight integration; we must leverage the best of IT to meet the industry's real-time and non-real-time needs.
I see one of the biggest struggles for broadcasters as the delivery of content to multiple distribution channels, each with its own format. This is a real business challenge, as the income generated from these channels is nominal. A change in technology, but not operational workflows, will allow broadcasters to continue doing what they do, but do it better.
Traditionally, broadcasters have used a linear production model: images are acquired, manipulated, and then distributed. The struggle is how to repurpose linear content to multiple platforms. To do this, the industry must change to a nonlinear production model that accommodates nonlinear distribution and broadcast playout in a create once, publish everywhere (COPE) scenario.
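The COPE idea can be summarised in a few lines of Python: one master piece of content, a table of per-platform delivery profiles, and a single publishing step that derives every distribution version from the same source. Platform names, codecs, and resolutions below are illustrative assumptions, not any particular vendor's formats.

```python
# A minimal "create once, publish everywhere" sketch. All names are illustrative.

MASTER = {"title": "Evening Bulletin", "source": "/hires/bulletin_master.mxf"}

PROFILES = {
    "broadcast_playout": {"codec": "XDCAM HD", "resolution": "1920x1080"},
    "web":               {"codec": "H.264",    "resolution": "1280x720"},
    "mobile":            {"codec": "H.264",    "resolution": "640x360"},
}

def publish(master, profiles):
    """Create once, then derive every distribution version from the same master."""
    for platform, p in profiles.items():
        print(f"{master['title']}: render {master['source']} "
              f"as {p['codec']} {p['resolution']} for {platform}")

publish(MASTER, PROFILES)
```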
Consider the following live production scenario, with a camera system built on an IT-based platform:
A camera operator at a stadium could push a button that sends metadata from their camera that initiates a set of actions.
While the camera's output still makes its way through the traditional production process via the switcher for the programme output, the metadata has automatically switched the online feed to that camera, perhaps with automated graphics, skipping the traditional linear production process of content manipulation via a manually operated switcher. In this respect, the workflow is nonlinear and automated, compared to the on-air workflow.
Which images are pushed to the web, and how, is based on a set of predetermined rules and automated workflows. But this power can easily be used to create an online highlight or home-team feed for the web, giving content owners a possible secondary revenue source from this nonlinear production content.
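Here is a minimal sketch of the rule-driven part of this scenario, assuming a simple event/rule model of my own invention: the camera operator's button press arrives as a metadata event, and predetermined rules decide what happens to the online feed and the highlight log. The event fields and rule actions are hypothetical.

```python
# Sketch of rule-driven, nonlinear handling of a camera metadata event.
# Event fields and action names are hypothetical.

def handle_event(event, rules):
    """Apply each matching rule's actions to an incoming camera metadata event."""
    actions = []
    for rule in rules:
        if rule["when"](event):
            actions.extend(rule["do"])
    return actions

RULES = [
    {   # home-team goal: switch the web feed to this camera and add graphics
        "when": lambda e: e.get("tag") == "goal" and e.get("team") == "home",
        "do": ["switch_web_feed_to_camera", "overlay_score_graphic", "clip_highlight"],
    },
    {   # any tagged moment is logged for the highlight reel
        "when": lambda e: "tag" in e,
        "do": ["log_for_highlights"],
    },
]

event = {"camera": "CAM_3", "tag": "goal", "team": "home", "timecode": "00:23:41:10"}
print(handle_event(event, RULES))
```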
Technology, both broadcast and IT, has finally advanced to the point that production, post-production, and distribution can merge into a single collaborative platform, providing multiple format outputs and increasing efficiency throughout the entire production chain. Nonlinear production is the culmination of integrating the best of broadcast with the best of open IT, and it will be the future of broadcasting for 2013 and beyond.
Karl Schubert is CTO of Grass Valley and is responsible for identifying future technologies and developing a vision and technology roadmap for the future of the company's product lines.