
Monday, September 19, 2022 - 11:26 am
Story Highlights
Vizrt was one of the companies that arrived at IBC transformed since IBC 2019. The acquisition of NDI, which was still fresh in 2019, is now fully integrated and baked into the Vizrt environment; the cloud and virtualized tools continue to gain interest from customers; and the company announced the acquisition of Flowics at the show. During the show, SVG caught up with four members of the Vizrt team: TJ Akhtar, VP, product management, Vizrt; Suso Carrillo, NDI marketing lead, Vizrt Group; Amisa Saari-Stout, communications specialist, Vizrt; and Ulrich Voigt, VP, product management, Vizrt Cloud Solutions, Vizrt. Below is an edited transcript of the conversation.
(l-to-r) Vizrt's Suso Carrillo, Ulrich Voigt, Amisa Saari-Stout, and TJ Akhtar at IBC last week.
What are the big storylines for Vizrt at IBC 2022?
Akhtar: From a group perspective, there are a lot of new things. At NAB, we talked a lot about the cloud piece and new developments on the Viz Engine side as well as NDI. At our Vizrt Days, we had a preview of two big things: the cloud piece and the release of Viz Engine 5. Fast forward a few more months, and we now have NDI version 5.5 and the acquisition of Flowics in the past few days. So it has been a very busy period for us.
Can you give us an update on Viz Engine?
Akhtar: There are a couple of big pieces in Engine 5. We rewrote the integration with Unreal Engine 5, which was released earlier this year. That comes with a lot of new features like Lumen for the lighting piece, and you can build your own virtual world at massive scale with Nanite [the virtualized-geometry system in version 5]. We wanted to utilize all those features, so we rebuilt our integration from the ground up so they can be used natively, including the ray tracing. From Viz Engine and our control application, we can control pretty much every single element of an Unreal scene.
On top of that, we also added asset management of Unreal Engine scenes, which can be multiple gigabytes. We've added support for Unreal within our Graphics Hub, which is a massive piece and gives redundancy when collaborating as well as security for your graphics files.
We also have adaptive graphics for adaptive storytelling, where content is delivered on multiple platforms and to multiple endpoints. We've actually passed the inflection point where consumption is digital rather than linear broadcast, so we want to make it easier for broadcasters to create graphics that can be played out to multiple destinations, whether that is 16:9, 9:16 for TikTok, or square for Instagram and Facebook. Adaptive graphics enables all of that, and our broadcast partners can design a graphic once and use it multiple times without having to duplicate the design.
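As a rough illustration of the design-once idea (this is not Vizrt's adaptive-graphics API, just the arithmetic behind delivering one master design to several aspect ratios), the sketch below maps a single canvas height to each hypothetical target outlet:

```c
#include <stdio.h>

/* Hypothetical illustration only: derive per-platform output resolutions
 * from one master design with a fixed canvas height. */
typedef struct {
    const char *platform;   /* target outlet                        */
    int aspect_num;         /* aspect-ratio numerator (width part)  */
    int aspect_den;         /* aspect-ratio denominator (height)    */
} target_t;

int main(void) {
    const int canvas_height = 1080;            /* master design height */
    const target_t targets[] = {
        { "linear broadcast (16:9)", 16, 9  },
        { "TikTok vertical (9:16)",   9, 16 },
        { "Instagram square (1:1)",   1, 1  },
    };

    for (size_t i = 0; i < sizeof targets / sizeof targets[0]; i++) {
        /* width = height * (num / den), rounded and kept even */
        int width = (canvas_height * targets[i].aspect_num
                     + targets[i].aspect_den / 2) / targets[i].aspect_den;
        width -= width % 2;
        printf("%-26s -> %dx%d\n", targets[i].platform, width, canvas_height);
    }
    return 0;
}
```

The point of the "design once" approach is that the layout logic, not the designer, absorbs these resolution differences.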
Let's talk NDI 5.5. What's new, and what does it mean to the marketplace?
Carrillo: The key point with NDI 5.5 is that the focus is on NDI Tools, which is a suite of free tools that have features like multi-audio management, an NDI switcher, and some other capabilities.
NDI now has a lot of flexibility, and that's why we decided to improve the features so that everybody can have a multichannel-audio workflow.
Akhtar: Just to add to that: one of the benefits of NDI 5.5 as part of Viz Engine is the insertion of metadata within the NDI stream. That enables the graphics piece to do multi-camera remote production purely on NDI. That is enabled by new features that will be coming in a later release of NDI. When it comes to live production, the improvements move us towards live production in the cloud.
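For readers curious what "metadata within the NDI stream" can look like at the SDK level, here is a minimal sketch using the publicly documented NDI C SDK: it attaches a free-form, UTF-8 XML string to a video frame through the frame's p_metadata field. The sender name and the camera_state element are invented for illustration; Viz Engine's actual metadata schema is not described in the interview.

```c
#include <stdlib.h>
#include <stdint.h>
#include <Processing.NDI.Lib.h>   /* official NDI SDK header */

int main(void) {
    if (!NDIlib_initialize()) return 1;

    /* Create an NDI sender (name is a placeholder). */
    NDIlib_send_create_t send_desc = { 0 };
    send_desc.p_ndi_name = "Camera 1 (metadata demo)";
    NDIlib_send_instance_t sender = NDIlib_send_create(&send_desc);
    if (!sender) return 1;

    /* One 1080p BGRA frame; a real application would fill this with video. */
    NDIlib_video_frame_v2_t frame = { 0 };
    frame.xres = 1920;
    frame.yres = 1080;
    frame.FourCC = NDIlib_FourCC_video_type_BGRA;
    frame.frame_rate_N = 50;
    frame.frame_rate_D = 1;
    frame.frame_format_type = NDIlib_frame_format_type_progressive;
    frame.line_stride_in_bytes = frame.xres * 4;
    frame.timecode = NDIlib_send_timecode_synthesize;
    frame.p_data = (uint8_t *)calloc((size_t)frame.yres * frame.line_stride_in_bytes, 1);

    /* Per-frame metadata travels with the video as an XML string.
     * The element and attributes below are made up for illustration. */
    frame.p_metadata = "<camera_state id=\"CAM1\" tally=\"program\" zoom=\"0.42\"/>";

    NDIlib_send_send_video_v2(sender, &frame);

    free(frame.p_data);
    NDIlib_send_destroy(sender);
    NDIlib_destroy();
    return 0;
}
```

A receiver sees the same metadata string alongside the decoded frame, which is the general mechanism that lets graphics or control data ride with each camera signal in a multi-camera remote production.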
Let's talk about the cloud. Vizrt has been doing virtualized hardware for a long time. How does that help with customers moving production to the cloud?
Voigt: It's an easy step for us to go into the cloud, with some optimizations. The first step is that all our products are now running in the cloud: you can do Engine in the cloud, you can do switching, you can do replay, you can do sports analysis, and even virtual ad overlays. And NDI is the fabric that makes it possible and ties in on-premises equipment. For sports analysis, we can now get live camera feeds from the stadium, do all the analysis and illustration creation live in the cloud, and then get the signal back into the production.
The trick is, how do you make that accessible to the general user without the need for cloud engineers who are focused on security and the virtual environment? That is one reason we developed a cloud deployment platform that can set up the infrastructure and Vizrt tools virtually within 15 minutes after a template is selected for a type of production. All the user needs to do is connect to it, and they can do their work. We can still do remote production with hardware set up in two locations, but, with the cloud, you can build the whole production gallery in a matter of minutes by just selecting a template. Then the team has full control and can tweak it.
But the key is that the cloud is just another infrastructure; it's not another product. And that makes it seamless: the user can have half of it in the cloud and half of it in the OB van. They can connect together and have the same workflow and create graphics and use them everywhere. Or they can create a show rundown in Viz Mosart to automate the show in the cloud or on hardware.
The remote accessibility of NDI gives you a lot of new workflow possibilities. A remote commentator can use the free NDI Tools and NDI Bridge to connect to the cloud, and everything is based on current workflows, so they don't need a cloud engineer.