Chyron Releases PRIME Platform 4.3 With Viewer-Controlled Interactive Graphics

Chyron has announced the 4.3 release of the PRIME Platform, which features a full suite of production capabilities including live graphics, production switching, video walls and scaling, touch screen control, branding, venue control, and augmented reality.

Version 4.3 of this flexible, scalable platform introduces the new PRIME Edge module for creation, deployment, and management of viewer-controlled interactive graphics. Working with PRIME Edge, designers can use the same familiar PRIME Designer interface to create graphics with interactivity that allows viewers to determine what they see, whether watching sports, elections, financial news, lifestyle shows, or other programs.

“Imagine the power for viewer retention if you give the viewer the ability to ‘surf’ content without ever leaving your stream,” said Chyron Vice President of Marketing Carol Bettencourt. “For example, while watching election coverage, the viewer could select local or national results, commentary by party or candidate, statistics, and polling data, all of their own choosing. This capability also opens up a tremendous opportunity to increase revenue. Consider a sports broadcast. If viewers can choose the team, player, stats, and replays they want to see, this offers multiple opportunities for sponsorship, instead of just one.”

In addition to the introduction of the PRIME Edge module, the PRIME Platform 4.3 release brings significant new features to some of its other key modules.

Along with enhanced text formatting, the PRIME CG now includes a QR code effect that generates a QR code for placement in a PRIME scene directly within the application. In this streamlined workflow, users can dynamically replace the URL to generate QR codes that direct viewers to online or sponsor content. Other new features extend the PRIME Platform’s utility across a wide range of use cases. Support is now available for the 23.98 and 24 fps frame rates commonly used by Hollywood studios and others to create long-form video. Added support for the NMOS protocol ensures device discoverability and connectivity, underscoring Chyron’s commitment to delivering IP-ready solutions.
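The dynamic-replacement workflow described above — one QR effect in the scene, with the encoded URL swapped at run time as segments or sponsors change — can be sketched in a few lines. This is an illustrative template pattern only; the function name and URL fields are hypothetical and are not part of the PRIME API.

```python
# Hypothetical sketch of the dynamic-replacement idea: the scene's QR payload
# is a URL template whose fields are swapped per segment or sponsor.
# None of these names come from the PRIME Platform itself.
from string import Template

def build_qr_payload(template: str, **fields) -> str:
    """Fill a QR payload template with the current segment's values."""
    return Template(template).substitute(**fields)

# One scene template, many payloads -- swap the URL as sponsors rotate.
template = "https://example.com/promo?sponsor=$sponsor&segment=$segment"

payload = build_qr_payload(template, sponsor="acme", segment="halftime")
print(payload)  # https://example.com/promo?sponsor=acme&segment=halftime
```

In practice the resulting string would be handed to the QR code effect for encoding, so the on-air graphic never changes while the destination it points to does.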

The PRIME Switcher module now features UI enhancements and multiviewer improvements. Most notably, the PRIME Switcher includes controls for creating standard DVEs within the switcher interface in addition to the previously available tools for creating DVEs upstream in the PRIME CG interface.

“Development of the PRIME Platform continues to be driven by customer feedback and by the principle of helping customers move toward the future, bridging traditional, single-purpose hardware solutions with modern integrated solutions that may be deployed on-prem, in the cloud, or in a hybrid architecture,” said Chyron Senior Vice President of Strategy Mathieu Yerle.
