With a bottom-up approach, beginning with the end user and working up to the headend, the media industry can reduce abandonment, increase engagement, improve repeat viewership and protect brands, writes Ted Korte of Qligent.
The phrase "content is king" can no longer stand on its own. With the expansion of digital media and IP-based channels, the world is becoming a smaller place as consumers now have access to content from across the globe. The growing availability of these services and faster broadband access have combined with advances in smart devices to empower consumers, shifting control of the media landscape from content providers to end users.
If great content is all it takes to be successful, why are so many end users still cutting the cord? Having great content is only part of the competitive battle for providers these days, as price, accessibility, relevance, personalisation, quality and choice have all become critical factors in keeping viewers engaged.
Quality is one of the most crucial elements in retaining viewers; even the best content can be beaten in audience views by an amateur's cat video if service availability is limited or if buffer times, dropouts or blocky video become excessive. A study by Akamai found that viewers will start abandoning a video if it takes longer than two seconds to begin playing, with roughly another six percent leaving for each additional second of delay. Meanwhile, an nScreenMedia poll found that delivering better video quality has the biggest impact on the business of an online video service provider (OVSP).
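To put those numbers in perspective, a back-of-the-envelope model helps. The sketch below (Python, with the assumption that the six-percent figure compounds per additional second, which is one reading of the study rather than its exact methodology) estimates the share of an audience still waiting after a given startup delay:

# Rough abandonment model: viewers tolerate the first two seconds,
# then roughly 6% of the remaining audience leaves with each extra second.
def audience_remaining(startup_delay_s: float, loss_per_second: float = 0.06) -> float:
    """Fraction of the audience still waiting after a given startup delay."""
    extra = max(0.0, startup_delay_s - 2.0)  # delay beyond the 2-second grace period
    return (1.0 - loss_per_second) ** extra

for delay in (2, 5, 10):
    print(f"{delay:>2}s startup delay -> {audience_remaining(delay):.0%} of viewers remain")

By this rough model, a ten-second startup delay already costs a provider nearly four viewers in ten before the first frame is shown.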
"Once multi-layer, last-mile and end-user monitoring and analysis are in place, instead of seeing demographics, genders and households, providers will start seeing individuals emerge" – Ted Korte, COO, Qligent
As the path from content creator to viewer has grown in complexity, consumer satisfaction has become increasingly difficult to capture and maintain, thus forcing a new approach to assure high-quality content delivery. The use of real-time big data analytics on a combination of last-mile and end-user data can deliver valuable insights into consumer quality of experience and user behaviour.
Why is quality suffering?
The push for more content and online delivery has resulted in a huge degradation in the quality of media service delivery across all platforms, new and old. New technical problems that didn't exist before were introduced as early as the analogue-to-digital transition. Broadband network distribution brought its own set of new problems, with buffering, jitter and latency among the issues related to delivery over an unmanaged, packet-switched network. Additional technical problems will continue to arise as more insertion and embedding functions are pushed closer to the end user. Even measuring consumer satisfaction has changed since the analogue TV days: instead of calling in complaints, consumers go straight to social media or simply walk away silently.
Meanwhile, content providers' resources are more strained than ever before. More investment goes into front-end content creation and new technologies such as augmented reality (AR) and social media integration than into delivery considerations. Even non-content projects such as IP studio buildouts and workflow virtualisation are competing for scarce resources.
To combat resource shortages, some media companies outsource major workflow functions such as master control operations to third parties, which raises new problems to contend with in managing the end-to-end content chain.
Analysing content delivery
The traditional approach to monitoring content delivery has been a top-down approach. This process evolved from supporting a linear broadcast model, was deployed when purpose-built hardware was popular, and was mainly focused on infrastructure health.
This legacy model for monitoring has a number of deficiencies when applied to today's media delivery challenges, including limited visibility (blind spots), an inability to reach out to the last mile or the end user, and limited depth (quality of service, or QoS, only, not quality of experience, or QoE). With respect to customer impact, it can also only report mean estimates, not data about individuals.
This top-down hierarchical tree approach follows the distribution and delivery network itself, but falls apart if the network isn't fully managed end-to-end. The new IP-based, non-linear, one-to-one delivery model demands a completely new, interactive monitoring paradigm: one that starts with a bottom-up approach, beginning with the end user and working up to the headend or origination point.
Broadband delivery offers an affordable loop-back channel to aggregate quality and behaviour statistics from every subscriber. Cost-effective software agents can sit on set-top boxes or in apps that present the media to the end user, and capture stream quality and user interaction data. This data set is large, fast-moving and uncontrolled as users come and go for various reasons, but can be analysed together with controlled last-mile data, consumer device info, link health, stream quality, problem reports, customer experience management (CEM), user feedback and a variety of other data sets to extract a holistic view.
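To make this concrete, the following is a minimal sketch of such a player-side agent. Everything here is illustrative rather than any particular vendor's API: the collector URL, event names and fields are assumptions about what a real deployment would define.

import json
import time
import urllib.request

# Hypothetical player-side telemetry agent: batches stream-quality and
# user-interaction events and posts them over the broadband loop-back channel.
COLLECTOR_URL = "https://monitor.example.com/ingest"  # illustrative endpoint

class TelemetryAgent:
    def __init__(self, device_id: str, batch_size: int = 20):
        self.device_id = device_id
        self.batch_size = batch_size
        self.events = []

    def record(self, event_type: str, **fields):
        """Queue one event, e.g. 'startup', 'rebuffer', 'bitrate_switch', 'pause'."""
        self.events.append({"device": self.device_id, "ts": time.time(),
                            "type": event_type, **fields})
        if len(self.events) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.events:
            return
        body = json.dumps(self.events).encode("utf-8")
        req = urllib.request.Request(COLLECTOR_URL, data=body,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)  # upload the batch to the monitoring centre
        self.events = []

# The player calls into the agent as playback proceeds:
agent = TelemetryAgent(device_id="stb-0042")
agent.record("startup", delay_ms=1850)
agent.record("rebuffer", duration_ms=420)
agent.record("bitrate_switch", from_kbps=4500, to_kbps=2200)

Keeping the agent this thin matters on constrained set-top hardware; the heavy lifting of correlation and enrichment belongs upstream.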
Once multi-layer, last-mile and end-user monitoring and analysis are in place, instead of seeing demographics, genders and households, providers will start seeing individuals emerge.
Where does big data fit?
This new approach to delivery and consumption analysis allows precise measuring of quality characteristics on a per-user basis in real time, instead of traditional mean user quality metrics based on large network segments representing a huge population. This is very important today, as individuals can make enough noise on social media to affect companies. The goal now is to make everyone happy, not just the majority.
Interactivity introduces a user-centric view in which data flows from subscriber endpoints towards the monitoring centre, where it is enriched, validated or otherwise explained by other data sets. Finally, we see the consumption preferences, quality metrics and behaviour habits as users react, interact and engage with the content. By performing these analyses in real time with the ability to focus on specific customers, immediate action can be taken whenever problems occur.
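As a hedged sketch of what focusing on specific customers can mean in practice, the fragment below keeps a rolling per-user tally and raises an alert the moment one subscriber degrades, rather than waiting for a segment average to move (the threshold and event shape are assumptions):

from collections import defaultdict

# Per-user real-time view: act on a single subscriber's degradation
# instead of a mean across a large network segment.
REBUFFER_ALERT_THRESHOLD = 3  # rebuffers per window before acting (assumed)

rebuffers_by_user = defaultdict(int)

def on_event(event: dict):
    """Handle one incoming telemetry event from the loop-back channel."""
    if event["type"] == "rebuffer":
        rebuffers_by_user[event["device"]] += 1
        if rebuffers_by_user[event["device"]] == REBUFFER_ALERT_THRESHOLD:
            # Immediate, user-specific action, e.g. open a ticket or reroute.
            print(f"alert: {event['device']} degraded; take action now")

for e in [{"device": "stb-0042", "type": "rebuffer"}] * 3:
    on_event(e)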
Big data technology empowers analysis like never before, making it possible to augment, blend, enrich, exclude, filter and associate data across endless data pools, lakes and oceans. This enables us to search, trend, visualise, automate, recommend, alert, predict and prevent at enormous scale.
Four common goals of big data in the media and entertainment industry are to reduce abandonment, increase engagement, improve repeat viewership and protect brands. Knowing the quality of service (QoS) and quality of experience (QoE) in a controlled manner is foundational for measuring any of the other attributes of the end-user experience. Before audience measurement makes any sense at all, we need to know whether the audience is actually receiving a high-quality content experience as it was intended to be consumed.
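As a deliberately simplified illustration of a controlled QoE measure, the function below scores a session from its startup delay and rebuffer ratio; the 0-100 scale and the penalty weights are illustrative choices, not a standardised metric:

# Toy per-session QoE score; weights and scale are assumptions.
def qoe_score(startup_delay_s: float, rebuffer_ratio: float) -> float:
    """rebuffer_ratio = time spent rebuffering / total watch time."""
    score = 100.0
    score -= 5.0 * max(0.0, startup_delay_s - 2.0)  # penalty past the 2s grace period
    score -= 200.0 * rebuffer_ratio                 # stalls hurt far more than slow starts
    return max(0.0, score)

print(qoe_score(startup_delay_s=1.5, rebuffer_ratio=0.0))   # 100.0
print(qoe_score(startup_delay_s=6.0, rebuffer_ratio=0.05))  # 70.0

Only once a score like this is trusted per session does it make sense to aggregate it into engagement or abandonment analyses.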
Big data is the solution most are looking to, but like any new technology it has its pros and cons, and it is very important during planning to mitigate the risks.
The good news is that such solutions are already used successfully in many other industries and business sectors such as retail, cellular networks, finance and banking.
The big data technology stack includes methods and algorithms for aggregating, processing and storing petabytes of structured or unstructured data, along with parallel processing, machine learning and artificial intelligence techniques, and it can be leveraged by the media and entertainment industry.
To minimise uncertainty in collected data, it has become best practice to combine multiple data points with known reference or control points that help to augment, enrich and validate information. This is why monitoring the last mile, or consumer environment, comes into the mix with end-user analysis: to remove the human element from the equation. Without last-mile data, it is not known whether end-user feedback can be completely trusted.
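A minimal sketch of that cross-check, assuming each complaint can be matched to a last-mile probe reading for the same network node (the field names and thresholds here are hypothetical):

# Cross-validate uncontrolled end-user complaints against controlled
# last-mile probe data before trusting or dismissing them.
def classify_complaint(complaint: dict, probe: dict) -> str:
    """Decide whether user-reported trouble is corroborated by the network."""
    if not probe.get("link_up", True):
        return "confirmed: last-mile outage"
    if probe.get("packet_loss_pct", 0.0) > 2.0:
        return "confirmed: last-mile congestion"
    # The probe says the link is healthy, so the report likely reflects
    # the home network, the device, or the human element.
    return "unconfirmed: investigate device or home environment"

complaint = {"device": "stb-0042", "node": "node-23", "text": "constant buffering"}
probe = {"node": "node-23", "link_up": True, "packet_loss_pct": 7.5}
print(classify_complaint(complaint, probe))  # -> confirmed: last-mile congestion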
As power in the media landscape has shifted to end users and content quality has taken a back seat to quantity, "content is king" will only apply again when big data says it does. Until then, the emperor has no clothes.
With the advent of big data, media companies can now understand why customers subscribe and unsubscribe, what kind of programmes they like and dislike, when they like to watch certain content, and their tolerance for poor audio and video quality. Supplementing that understanding with data augmentation mitigates the risk of poor data: a bottom-up approach to monitoring the end-to-end media path provides controlled data collection sources to correlate against the uncontrolled data coming from end-user probes.