SVG Sit-Down: NVIDIA's Jamie Allan on the Transition to ST 2110, What's Next for AI, AR, and the Cloud
The focus is the migration from client-based systems to software-defined, virtualized infrastructure
By Jason Dachman
Tuesday, September 27, 2022 - 1:57 pm
Story Highlights
At the IBC Show in Amsterdam this month, NVIDIA's tech infrastructure once again powered hundreds of booths. In addition, the company demonstrated SMPTE ST 2110 workflows at the Dell Technologies and RED Digital Cinema booths.
The next-generation IP broadcast workflow at the Dell booth focused on how to simplify the adoption of SMPTE ST 2110 standards for the broadcast industry. NVIDIA and Dell teamed up to showcase IP-based content-creation capabilities and deployment of AI in the broadcast pipeline from workstation to the edge.
At the RED booth, NVIDIA networking technologies (Rivermax, ConnectX, NVIDIA BlueField DPU, NVIDIA RTX GPU) enabled real-time 8K raw video over ST 2110. In this demo, NVIDIA and RED showcased a direct connection that allows cinema-quality RED V-RAPTOR 8K content to feed into an IP broadcast-production workflow.
During the show, SVG sat down with Jamie Allan, lead, media, entertainment & broadcast industry, EMEA, NVIDIA, to discuss the ST 2110 demos and how NVIDIA is helping power major next-gen technologies, such as artificial intelligence (AI) and machine learning (ML), augmented reality and immersive experiences, and cloud- and edge-based workflows.
NVIDIA's Jamie Allan: The broadcast industry should build in a way that allows [deployment] on any platform in the future.
What are the big themes NVIDIA is highlighting this year at IBC?
Firstly, we're excited to be back at IBC after being away the past few years. Getting together with our amazing ecosystem of partners in Amsterdam is always great. Having NVIDIA in hundreds of the booths, powering the solutions and technologies that make up the media and entertainment world, is a great honor for us.
This year at IBC, we are focused on talking about and demonstrating some of our groundbreaking solutions that simplify the adoption of ST 2110 workflows for broadcasters, postproduction companies, and large media organizations. These [solutions] enable these organizations to easily bring SMPTE ST 2110-compliant uncompressed streams into their infrastructure without a huge engineering uplift.
Can you provide some detail on the key demonstrations you're participating in here at the show?
At the Dell booth, we are showing how our existing Rivermax SDK, which is already used by many leading broadcast organizations, such as Grass Valley and Disguise, powers a new Windows application that provides a virtual display for a 2110 platform. You can take your normal Windows virtual or physical workstation and virtualize a SMPTE ST 2110-compliant desktop as a second display. You can simply drag an application to that second display and send it out to broadcast live on-air as a 2110-compliant stream.
At the RED Digital Cinema booth, we are enabling the world's first real-time camera-to-SRT stream via uncompressed ST 2110. We've worked with RED to develop the capability to go straight from one of their new camera models, through an IP module, into a processing unit handling uncompressed 8K 2110, and take that stream into either an uncompressed pipeline or an SRT compressed webstream at full 8K. We believe this is the first time that has ever been done.
How do you see AI and ML changing the way live sports are produced? And what role is NVIDIA playing in that evolution?
The broadcast industry as a whole has adopted AI on a much larger scale, and we've seen many broadcasters using these tools over the past few years.
Organizations like EVS, Sony Hawkeye, and Vizrt are advancing their tools with AI. And many broadcast organizations and media companies are investing internally in AI data-science teams and developer teams to take some off-the-shelf AI tools that you can get from places like NVIDIA's GPU container cloud and retrain and adapt them to create specific tools for their needs.
That is very important in the sports industry because of the complexity and the unique needs of each individual sport. We work very closely with organizations like Hawkeye to enable their tools to work specifically for certain sports.
I also think AI and machine learning in automated production is very interesting. We are seeing organizations like Pixellot and Mediapro's AutomaticTV growing at an astonishing rate. We will continue to work with these companies to create smaller and faster components for that part of the ecosystem so that technology can continue to grow.
There are also many startups focusing on creating groundbreaking applications for AI in sports broadcasting. One company in particular doing groundbreaking work in markerless motion tracking is move.AI. They are a UK-based company who have [drawn interest] from many major sporting bodies and broadcasters around the world. Using low-cost cameras, they can create full-body 3D visualizations, which is something every broadcaster wants in order to add value not only to their current 2D broadcast pipeline but also to their future immersive metaverse and Web3 broadcasting capabilities.
Speaking of the metaverse, how have you seen the use of augmented reality grow in recent years? How do you envision these virtual technologies impacting the industry?
We're incredibly proud that nearly every vendor who creates augmented-reality and virtual-graphics tools leverages NVIDIA technology to build their products. We continue to push our engineers and our internal product teams to give them more and more capability in that space. The next step that we are hoping to see is bridging the gap between AR in the studio and AR in the home.