Understanding The Client-Side OTT Customer Experience
Service assurance is rapidly becoming more critical to OTT services as audiences grow and large broadcasters double down on their streaming strategies.
Gone are the days when good enough is good enough. Audiences are quick to criticise when a service has basic issues, especially when they are watching exclusive live content. Client-side monitoring and analytics have been a staple of the D2C streamer technology stack for many years. They continue to evolve to measure and report the holistic customer experience in real time, which now includes combining forces with network-side monitoring to detect the actual source of quality issues.
What Is The OTT Customer Experience?
Let’s start with the premise that
OTT Customer Experience = Content + User Experience (UX) + Quality of Experience (QoE)
We can (and must) debate the relative importance of each, but let’s just say that all three elements matter enough to make a real difference to the success of the OTT operator’s business.
In this article we’re looking at the technical aspects of OTT, so while we may easily accept in some instances that content is king, we will focus here on UX and QoE. These domains are becoming the primary battleground for customer satisfaction as OTT services become the exclusive home of high-profile content like live sports, and as competition heats up between OTT service providers.
Quality Of Experience And User Experience
QoS (Quality of Service) and QoE (Quality of Experience) have been standard terms for many years. QoS focuses on the technical performance of the video stream from the engineering perspective: are all the video bitrates available to the consumer, how much do the video bitrates move up and down the ABR ladder during a session, what is the video latency, and so on. QoE looks at the same things from the customer’s perspective: how fast does the video start up, did the video fail to start, how often does the stream rebuffer, how many customers are actually receiving the top bitrate, and so on.
For a D2C service, the customer’s perspective (i.e. QoE) is of paramount importance. OTT QoE should be benchmarked against the QoE of pre-existing satellite, IPTV and cable TV services, and OTT service providers should then fine-tune the delivery infrastructure accordingly. QoS remains important at the engineering level and should be used to establish the overall technical health of the platform and its ability to deliver what the customer expects. But QoS alone can hide the realities of the customer experience, for example by focusing on the average bitrate or the availability of a stream when an individual customer might experience lower-than-requested bitrates, or even video outages like black screens on a valid stream, during the most important part of the program. Moving from aggregate QoS metrics to individual QoE metrics is an essential part of succeeding in OTT.
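To make the per-session idea concrete, here is a minimal TypeScript sketch that derives two common QoE metrics, video start time and rebuffer ratio, from a simplified player event log. The event names and fields are illustrative assumptions for this example, not any particular player SDK’s API.

```typescript
// Hypothetical player event log; timestamps are ms since session start.
type PlayerEvent =
  | { type: "play-requested"; t: number }
  | { type: "first-frame"; t: number }
  | { type: "rebuffer-start"; t: number }
  | { type: "rebuffer-end"; t: number }
  | { type: "session-end"; t: number };

interface SessionQoE {
  videoStartTimeMs: number | null; // time from play request to first frame
  rebufferRatio: number;           // stalled time / total session time
  rebufferCount: number;
}

function computeSessionQoE(events: PlayerEvent[]): SessionQoE {
  let requestAt: number | null = null;
  let firstFrameAt: number | null = null;
  let stallStart: number | null = null;
  let stalledMs = 0;
  let rebufferCount = 0;
  let endAt = 0;

  for (const e of events) {
    switch (e.type) {
      case "play-requested": requestAt = e.t; break;
      case "first-frame":    firstFrameAt = e.t; break;
      case "rebuffer-start": stallStart = e.t; rebufferCount++; break;
      case "rebuffer-end":
        if (stallStart !== null) { stalledMs += e.t - stallStart; stallStart = null; }
        break;
      case "session-end":    endAt = e.t; break;
    }
  }

  const videoStartTimeMs =
    requestAt !== null && firstFrameAt !== null ? firstFrameAt - requestAt : null;

  return {
    videoStartTimeMs,
    rebufferRatio: endAt > 0 ? stalledMs / endAt : 0,
    rebufferCount,
  };
}
```

Computed per session rather than averaged across the platform, numbers like these surface the individual experiences that aggregate QoS metrics can hide.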
Just as QoE trumps QoS, UX (User Experience) can trump QoE. UX reflects how usable the service is, and it is measured by application analytics. Measures such as time-on-site, number of clicks, app performance, content placement and consumption method provide a perspective on ease of use and ease of finding the desired content. Appropriate targets for these measures can then be set, based on focus group research and best-practice expectations, with fine-tuning implemented accordingly.
Figure 1 – Service Quality Management takes input from multiple areas to establish the best approach to improving the customer experience.
Leading OTT service providers focus on a range of actions to satisfy their customers. They adapt their apps to improve video performance and increase customer consumption. They identify the best-performing routes and navigation paths through their services to match their technical quality goals. And they evolve and migrate their apps to keep up with new technologies and innovations.
But whichever of QoE and UX is more important, the key point is to measure both and correlate them with each other to detect cause-and-effect patterns, which can then be used to proactively make system-level changes during streaming events to improve the customer experience. This machine-captured data, measured from both the technical and the customer perspective, can be combined with actual customer feedback and re-correlated to establish the most intelligent approach to service improvements. This Service Quality competency should sit at the core of the D2C streamer.
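As one illustration of that correlation step, the sketch below joins hypothetical per-session QoE and UX records and computes a simple Pearson correlation between rebuffer ratio and minutes watched. The record shapes are assumptions for the example, and correlation alone suggests, rather than proves, cause and effect.

```typescript
// Hypothetical per-session records from the QoE and UX pipelines.
interface QoERecord { sessionId: string; rebufferRatio: number; }
interface UXRecord  { sessionId: string; minutesWatched: number; }

// Pearson correlation between two equal-length series.
function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  if (n < 2) return NaN;
  const mean = (a: number[]) => a.reduce((s, v) => s + v, 0) / n;
  const mx = mean(xs), my = mean(ys);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    dx += (xs[i] - mx) ** 2;
    dy += (ys[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

// Join the two views on sessionId, then test whether rebuffering moves
// with engagement: a strongly negative result suggests (but does not
// prove) that stalls are shortening viewing sessions.
function qoeUxCorrelation(qoe: QoERecord[], ux: UXRecord[]): number {
  const byId = new Map<string, number>();
  for (const u of ux) byId.set(u.sessionId, u.minutesWatched);
  const pairs = qoe.filter(q => byId.has(q.sessionId));
  return pearson(
    pairs.map(q => q.rebufferRatio),
    pairs.map(q => byId.get(q.sessionId)!),
  );
}
```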
Applying The Necessary Tools
D2C streaming produces a lot of data. This is inherent in the fact that individual streams are delivered to individual people on specific device types, so every customer experience is unique. D2C streamers therefore need to understand what is happening for each of them. It is very different from the traditional broadcasting world.
Data-driven decision-making is therefore both possible and necessary. As any D2C business knows, you need to be able to trust your data and use it to decide what to change in your service delivery. Once a change will impact thousands or millions of consumers, the cost of getting it wrong is too high, and decision-by-opinion is too risky. For OTT video, such change decisions could include a permanent change of CDN, real-time load-balancing between CDNs, offering a higher-bitrate stream as a standard feature, or restricting bitrates during peak-time viewing.
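As a hedged illustration of one such decision, the sketch below scores candidate CDNs on rolling health metrics and only recommends a switch when the improvement clears a margin. The metric shapes, weights and margin are illustrative placeholders, not recommended production values.

```typescript
// Hypothetical rolling per-CDN health, aggregated from client beacons
// over the last few minutes.
interface CdnHealth {
  name: string;
  errorRate: number;      // failed sessions / total sessions
  rebufferRatio: number;  // mean share of session time spent stalled
  medianStartMs: number;  // median video start time
}

// Lower is better. The weights are illustrative placeholders only.
function scoreCdn(c: CdnHealth): number {
  return 100 * c.errorRate + 50 * c.rebufferRatio + c.medianStartMs / 1000;
}

// Pick the healthiest CDN, but only switch away from the current one
// when the improvement clears a hysteresis margin: flapping between
// CDNs mid-event can itself hurt QoE.
function selectCdn(current: string, candidates: CdnHealth[], margin = 0.2): string {
  const best = candidates.reduce((a, b) => (scoreCdn(a) <= scoreCdn(b) ? a : b));
  const cur = candidates.find(c => c.name === current);
  if (!cur) return best.name;
  return scoreCdn(cur) - scoreCdn(best) > margin * scoreCdn(cur)
    ? best.name
    : current;
}
```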
Data comes from multiple sources: the client, the player, the ISPs (often several), the CDNs (often several), and the head-end (on-prem or multi-cloud variants). To trust the data there first needs to be an independent and objective view of the performance of the content delivery ecosystem, a single point of truth, which client-side monitoring tools can provide for the whole customer base. Second, the data source itself must be trusted and understood. This is where client-side monitoring solutions earn their keep. Validating the accuracy of the data from all elements of the streaming ecosystem is a continuous task, and D2C streamers must work closely with all their technology partners to ensure the data is clean and correctly interpreted. For example, different CDNs will report different data sets or use different algorithms to calculate performance. As OTT services mature, these discrepancies cannot persist, or the concept of “data-driven decisions” becomes a false reality.
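One way to picture that reconciliation work is as a thin normalisation layer that maps each CDN’s reporting format onto a single schema before any analysis happens. The two report shapes below are invented for the example.

```typescript
// Two invented CDN log shapes: one reports bits per second and HTTP
// status codes, the other megabits per second and a success flag.
interface CdnAReport { bps: number; status: number; }
interface CdnBReport { throughputMbps: number; ok: boolean; }

// The single-point-of-truth schema the analytics layer consumes.
interface NormalizedReport { cdn: string; throughputMbps: number; success: boolean; }

const fromCdnA = (r: CdnAReport): NormalizedReport => ({
  cdn: "cdn-a",
  throughputMbps: r.bps / 1e6,
  success: r.status >= 200 && r.status < 400,
});

const fromCdnB = (r: CdnBReport): NormalizedReport => ({
  cdn: "cdn-b",
  throughputMbps: r.throughputMbps,
  success: r.ok,
});
```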
Searching For The Truth
Leading client-side monitoring and analytics solutions are pushing the boundaries of data source integration to reach the ultimate truth. By integrating network-side video monitoring solutions that understand video QoS and QoE, the client-side solutions can quickly correlate video delivery data with customer application data and present the combined information back to the OTT service provider.
Instead of just knowing that a group of customers is experiencing a drop in video bitrate, but not knowing why, these integrations can explain why. But a word of caution: get the data interpretation right before jumping to conclusions. Assuming one CDN will perform better than another might miss the fact that the ISP’s network is the issue, not the CDN’s servers. It is certainly worth trying a CDN switch if that is the best available information, but ideally the decision-making data should include ISP performance. CDNs focused on broadcast-grade streaming performance and network-side monitoring solution providers are working hard to gather ISP-level data to complete the picture and present it as a single point of truth.
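A simple way to apply that caution is to slice degraded sessions by both delivery dimensions at once, as in the hypothetical sketch below. If one ISP dominates the counts across every CDN, the network is the likelier culprit; if one CDN dominates across every ISP, a CDN switch is better justified.

```typescript
// A degraded-session beacon enriched with delivery-path dimensions.
interface DegradedSession { cdn: string; isp: string; }

// Count affected sessions per (CDN, ISP) pair before acting on either.
function localizeIssue(sessions: DegradedSession[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const s of sessions) {
    const key = `${s.cdn}|${s.isp}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return counts;
}
```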
On the client itself, the data gathered in real time must not impact QoE or UX. The software that collects the data must not impact player or device performance, so minimal load and lean code are a must, as is external processing of the data. The latest implementations focus only on listening to events on the device, with the data and decision-making transferred to highly scalable back-end systems (often cloud-hosted) for analysis, reporting and CDN selection decisions.
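The sketch below illustrates that listen-only pattern for a browser player: events are buffered on the device and shipped in batches to a back-end collector, with no processing on the client. The endpoint URL and batch size are placeholders; navigator.sendBeacon and the HTML5 media events are standard browser APIs.

```typescript
// A minimal client-side collector: listen, buffer, ship. All analysis
// happens in the back end; the endpoint URL is a placeholder.
const ENDPOINT = "https://collector.example.com/v1/events";
const buffer: object[] = [];

function record(type: string, detail: Record<string, unknown> = {}): void {
  buffer.push({ type, t: Date.now(), ...detail });
  if (buffer.length >= 20) flush(); // bounded memory, infrequent I/O
}

function flush(): void {
  if (buffer.length === 0) return;
  const payload = JSON.stringify(buffer.splice(0, buffer.length));
  // navigator.sendBeacon queues the POST without blocking the UI
  // thread; fall back to a keepalive fetch where it is unavailable.
  const sent = typeof navigator.sendBeacon === "function" &&
               navigator.sendBeacon(ENDPOINT, payload);
  if (!sent) {
    void fetch(ENDPOINT, { method: "POST", body: payload, keepalive: true });
  }
}

// Wire up to an HTML5 video element: observe events only, never touch
// playback state, so the collector cannot degrade QoE itself.
function attach(video: HTMLVideoElement): void {
  video.addEventListener("waiting", () => record("rebuffer-start"));
  video.addEventListener("playing", () => record("playing"));
  video.addEventListener("error",   () => record("error"));
  document.addEventListener("visibilitychange", flush);
}
```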
Changing Behaviours
D2C streaming involves a set of dynamics that are different from traditional broadcasting, although some may fairly point out that the broadcasting industry has been following these approaches for years, just with different technologies along the way.
First, D2C means being customer-centric and taking the customer’s perspective on the service experience. This is easy to say but can be complex to do in practice. Paying customers in particular need to be understood from a UX and QoE perspective, because a poor experience for them is more costly to the OTT service provider.
Second, D2C requires end-to-end views, which means having insight into all the elements of the distribution technology. In practice that involves a combined view of the network infrastructure and the end customer’s experience on the device. Neither the client side nor the network side alone can have a complete picture of the customer experience, nor can either suggest appropriate corrective action in isolation.
Third, D2C requires a data-driven approach. We are now equipped to know an individual customer’s experience in detail, so service outages can be identified and their impact on each individual customer understood. The challenge is to have a solution that scales and does not slow down under load, so we should look to elastic solutions rather than solutions limited by hardware capacity.
Fourth, D2C requires a proactive, real-time approach. Once customers are impacted the damage is done, and in a worst-case scenario the OTT operator’s business is disrupted by a flood of complaints, enquiries and cancellations. In the technology stack, the CDNs touch the video streams as they approach the consumer, and the clients process the video stream and the UI. It is important to build very close relationships between these technology domains so that intelligent proactivity is designed into the software algorithms, and so that automated, real-time actions can be taken in response to the performance measures.
Fifth, D2C requires agility. Circumstances evolve quickly in the consumer world, and slow approaches are not good enough. Industry technology leaders therefore recommend moving to DevOps instead of Waterfall development; using customer experience testing to validate multi-generational device versions, feeding results back to vendors and performing continuous integration testing; getting faster access to data in intuitive ways, including voice recognition tools within the analytics process; and avoiding slow ticketing systems between suppliers in favour of a shared single-point-of-truth perspective between parties in the chain.
The Customer Is King
While content is always the creative battleground, UX and QoE are becoming equally important, and the customer is the ultimate judge.
Content and UX are very much in the hands of the OTT service provider, but QoE is generally outside their direct control. They rely on a complex multi-tenant network and server architecture that is often used by many other parties for many different purposes. The trends towards DVB, multi-CDN and multi-cloud add more complexity. Meanwhile, it has never been more important for OTT service providers to proactively assure QoE for their increasingly demanding customers.
QoE and UX need to be seen together. They are intrinsically linked, and they are both highly technical subjects that need intensive data analysis. The monitoring and analytics tools should combine and correlate the views from both worlds.
In the end, customer care is the single most important feedback loop for the D2C streamer. Knowing what customers are really experiencing, and managing any issues well, should be a core competency of the D2C streamer, who will ultimately be distinguished in the market by service delivery, not just content.