Inside EVS Efforts in Expanding Beyond Replay, Embracing AI, and More
Increasingly, customers link EVS solutions with live production in evolving ways
By Ken Kerschbaumer, Editorial Director
Tuesday, September 24, 2024 - 10:02 am
At the recent IBC, EVS once again embraced its "This is not a replay" campaign, an effort to help the industry understand the evolution of the company over the past 30 years and, in particular, the past five. With new products and services driven by AI as well as an expanding media-infrastructure and asset-management product portfolio, there was plenty of technology to prove the point that EVS is about much more than replay.
At the EVS IBC stand, artist Lucien Gilson (left), with the original artwork he was painting as part of the EVS "This is not a replay" campaign, and EVS's Alex Redfern (center) and Sebastien Verlaine
More and more customers, according to EVS CTO Alex Redfern, are buying more than one solution from EVS, linking their live operations, which rely heavily on the EVS replay environment, with a media-production team that uses those assets in different ways. Cerebrum installations, for example, can start off small and expand, linking servers with existing routers and even supporting tally integration and talking with different production tools.
"It's the spider's web that connects everything together," he says. "They use us for their routing, glue, and orchestration environment. They have the XT-VIA servers, and we show them how the live section and media side can be integrated. All the controls are there at the operator's fingertips."
EVS Head of Marketing and Communications Sebastien Verlaine notes that the rollout of VIA MAP (Media Asset Platform) version 1.0 this summer also demonstrates the evolution of EVS into areas like AI-based video assistance. The platform comprises LiveCeption for live replay and highlight needs, MediaCeption for asset management, MediaHub for distribution, and PowerVision for reviewing and mission-critical decision-making. It is an outgrowth of the 2020 Axon Digital Design acquisition, deploying its MediaInfra IP-backbone-based solutions for broadcast control, monitoring, conversion, and processing within an SDI, IP, or hybrid infrastructure.
Adds Redfern, "At NAB 2024, we saw all of our key customers in the U.S. come with a primary focus to talk about MediaInfra, and we are more and more confident that people are coming to us knowing about the replay capabilities but wanting to see the new stuff."
AI-assisted everything was a major trend at nearly every IBC stand. EVS's work on tools like AI-assisted offside detection for VAR is just one example of how the company is using AI to help referees make calls more quickly, and its XtraMotion system uses AI and generative AI to enhance production capabilities.
"Things like a blur effect to give a cinematic look or auto cropping are now integrated within the XtraMotion system," says Verlaine of a system best known for taking 60-fps content and generating additional frames to turn the content into a super-slo-mo clip.
Redfern explains, "The same thing that we do with the super-slow-motion we can do with the cinematic effect, and we also have a sharpening tool as well. Obviously, there's a lot of movement in the cameras in live sports, and, when you're dealing with sponsors who want their logos to look as sharp as possible, the sharpening tool is a really valuable tool for people now."
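The idea behind this kind of super-slo-mo is straightforward even if the production-grade implementation is not: new frames are synthesized between the captured 60-fps frames so the clip plays back slower without stuttering. The sketch below is a deliberately simplified illustration using plain linear blending between neighboring frames; EVS's XtraMotion uses AI-based, motion-compensated interpolation, not cross-fading, so treat this only as a demonstration of the workflow.

```python
import numpy as np

def interpolate_frames(frames, factor=2):
    """Insert (factor - 1) blended frames between each pair of source
    frames, so 60-fps material becomes a slower-motion clip when played
    back at the original rate. Linear blending is used here purely for
    illustration; real AI interpolators synthesize motion-compensated
    in-between frames instead."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        for i in range(1, factor):
            t = i / factor
            # Cross-fade between the two neighboring frames
            out.append(((1 - t) * a + t * b).astype(a.dtype))
    out.append(frames[-1])
    return out

# Two tiny grayscale "frames" standing in for 60-fps video
f0 = np.zeros((4, 4), dtype=np.uint8)
f1 = np.full((4, 4), 100, dtype=np.uint8)
clip = interpolate_frames([f0, f1], factor=2)
print(len(clip))      # 3: the original pair plus one in-between frame
print(clip[1][0, 0])  # 50: the midpoint blend
```

Doubling the frame count while keeping the playback rate fixed halves the apparent speed, which is why interpolation quality, not camera frame rate alone, determines how smooth the resulting slo-mo looks.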
The use of AI to help creatives lifts that technology beyond automating playout of material and other mundane tasks. Instead, the goal is to help create content that is more engaging and allows creators to expand their output and creativity.
"Those on the creative side need to see AI as being assistive rather than as something to replace people in certain areas," says Redfern. "We may reduce the number of human eyes that need to be on content, but we will still need humans to validate the content."
Not everything is suited for AI, however. Take a zoom function, which can allow the user to extract a 1080p portion of a 4K image. EVS demonstrated it at IBC, and it is expected to be rolled out in 2025.
"We tried AI with that," says Redfern. "The feedback we got from the market was that it needs to be manual. For something like that, the user needs the key frames and zoom exactly where they want them."
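Mechanically, the zoom function described above amounts to cutting a 1920x1080 window out of a 3840x2160 (UHD) frame at an operator-chosen position. A minimal sketch, assuming the frame is a NumPy array and that the function name and clamping behavior are illustrative rather than EVS's actual API:

```python
import numpy as np

def crop_1080p(frame_4k, x, y):
    """Extract a 1920x1080 window from a 3840x2160 frame.
    (x, y) is the top-left corner chosen by the operator: per the
    market feedback in the article, window placement stays manual
    rather than AI-driven. Coordinates are clamped so the window
    never leaves the frame."""
    h, w = 1080, 1920
    x = max(0, min(x, frame_4k.shape[1] - w))
    y = max(0, min(y, frame_4k.shape[0] - h))
    return frame_4k[y:y + h, x:x + w]

uhd = np.zeros((2160, 3840, 3), dtype=np.uint8)
cut = crop_1080p(uhd, x=1000, y=600)
print(cut.shape)  # (1080, 1920, 3)
```

Because the crop is a pixel-for-pixel extraction rather than a scale-up, the output stays at native 1080p resolution, which is what makes a 4K acquisition format attractive for this kind of reframing.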
The next frontier around AI and generative AI is to continue the efforts around machine learning. "If people want better content, you need to have machine learning of that content," Redfern explains. "With more content, you can get a more detailed output. A question for all of us is, do the leagues want to share the content for machine learning? If they share it with EVS, do they also have to share it with other vendors? How do we make machine learning better?"
The use of AI within media-asset-management logging is an example of its rapid development. Both facial recognition and natural-language searching are available to users as customers transition to XT-VIA. EVS's acquisition of MOG Technologies this past summer will also improve EVS ingest workflows, and solutions integrating technology from both MOG and EVS are expected to roll out around NAB 2025.
"[MOG has] the mCODER transcoding engine and the MAM4PRO SaaS-based asset-management system as well as the Vizzi OTT content-distribution platform," says Redfern. "All of that is built in the same software stack, so we're looking at components to see how we can boost the media infrastructure. They also support a much wider range of codecs and formats, and we are very much Tier 1 SDI and ST 2110 while they are in SRT and those sorts of tiers."
In the coming months, EVS and other equipment manufacturers are expected to continue to evolve and expand their capabilities.










