Live From Tokyo Olympics: Darryl Jefferson and Jim Miles On NBC Olympics File-Based Workflows and Storytelling
By Ken Kerschbaumer, Editorial Director
Wednesday, August 4, 2021 - 5:36 am
Every NBC Olympic effort sees massive changes and advances with respect to file-based workflows, editing, and more. Toss in UHD, HDR, and immersive audio, and those advances are even more challenging and, ultimately, impressive. Darryl Jefferson, NBC Olympics VP, Broadcast Operations and Technology, and Jim Miles, NBC Olympics Director, Digital Workflow Systems, discussed the multi-continent, multi-timezone, multi-facility effort with SVG.
Darryl Jefferson (left) and Jim Miles and the team have worked hard to allow creatives to focus on storytelling.
Can you describe the ecosystem here?
Miles: It's Avid Media Composer for craft editing, and Avid Interplay MAM is the record apparatus for highlight shot selection as well as our archive. The entire Olympic archive is Interplay and MediaCentral, and our playback turnaround is EVS. We do a lot with Telestream for transcode, flipping, and orchestration, and we use Signiant as our file mover and for transfers from the venues and to Stamford.
Jefferson: We also have a new ingest device from Telestream called Live Capture, which can capture 1080p HDR content as well as the older formats. And our big monster storage is Dell/EMC Isilon.
Miles: The interesting story on the editors this time is that we are still using physical workstations for the primary craft edits, but all our auxiliary edits are virtual machines. We used to have to bring 30 Avids to the IBC, but this year we only had to bring a dozen, and we have the VMs for the producers and those lighter-weight tasks. That's been huge for us in terms of the complexity of what we must build.
You have teams around the world diving into your file-based workflows. Are the workflows the same everywhere?
Miles: More or less. We try to put the high-resolution recordings where they're needed. If somebody is doing a turnaround at a venue, we'll put the content right there next to them in their local storage at the venue. And if something is needed for prime time, we can move the content here to the IBC.
But our main record apparatus has moved back to Stamford, and when we think about the hundred feeds that are coming in from different venues, from the host and from our own, all of those go back to Stamford and are recorded there, where the bulk of our ancillary users are. Then there are other business units in Miami and in the news organization at 30 Rock who are all pulling from that recording wall back in Stamford.
Jefferson: The other thing is the added wrinkle of HDR for the prime-time show and for our venues. That adds a layer: knowing which version of a recording you have, and whether we're doing parallel recordings or it's an SDR sport contributing into an HDR prime-time show or vice versa. We try to normalize the content for the end user so that complexity is obscured, and people have to think about it less.
Miles: We have well over a hundred different paths, many of which have file conversion, or interlace-to-progressive or progressive-to-interlace conversion, in the middle. And it's been an interesting challenge to both build it all ahead of time and then get it running in a matter of days when we turned it on.
Jefferson: And it's been an education for our legion of freelance editors and freelance operators in general. They need to wrap their heads around where the recording that is closest to the output of their show lives. And if someone's delivering into a show that's in a different format than the one they normally cut in, it takes a little while to get up to speed.
But we do have islands of 1080p HDR, like at the venues, and once editors are at a venue they have to worry much less about their environment. They only need to worry about the outliers, like an ENG camera coming in, or an interview or reaction shot from another broadcaster that won't be in 1080p HDR.
Miles: Content is a great example, where we have that fire hose of content coming in from OBS and, depending on who's pulling it, some take it natively, some take it with a LUT (Look-Up Table) applied, and some take it with a transcode, depending on where it's going.
The other day you mentioned having folders that would automate some processes. Can you discuss that a little more?
Jefferson: The folders basically illustrate to us where the content is coming from, where it is going, and what process it undergoes. For instance, there's a set that takes things from progressive to interlace, or applies a LUT, or does up/down conversion, or normalizes audio, and so forth. So the folders are basically instructions for what is going to happen, and we can see at a glance what happened to a file, what direction it is flowing, and so on.
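The folder-as-instruction idea can be sketched in a few lines of code. This is only a minimal illustration of the general pattern, not NBC's actual system; the folder names, step labels, and function names below are all hypothetical:

```python
from pathlib import Path

# Hypothetical watch-folder table: each folder name encodes the ordered
# processing steps a file dropped there should undergo. (Illustrative only.)
WATCH_FOLDERS = {
    "to_1080i_apply_lut": ["apply_lut", "progressive_to_interlace"],
    "to_1080p_norm_audio": ["interlace_to_progressive", "normalize_audio"],
    "passthrough": [],
}

def route(file_path: str) -> list[str]:
    """Return the processing steps implied by the file's parent folder."""
    folder = Path(file_path).parent.name
    if folder not in WATCH_FOLDERS:
        raise ValueError(f"no workflow defined for folder '{folder}'")
    return WATCH_FOLDERS[folder]
```

With a scheme like this, an operator (or a monitoring dashboard) can tell at a glance what will happen to a file purely from where it sits, e.g. `route("/ingest/to_1080i_apply_lut/final.mxf")` yields the LUT-then-interlace chain.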
And is there an undo in case someone gets it going the wrong direction?
Miles: There's not exactly an undo function for a LUT, for example, but we are able to go back and put the content through the correct workflow. All the folders that Darryl mentioned are lashed into different production systems: our EVS system, our Telestream Vantage system, our Avid system, and all the things coming from external sources, be it ENG or Content or drives coming in. They all flow together, and we don't just throw things away after we process them; we hang on to them. If some inbound source comes in incorrectly from a partner, for instance, we can go back, get the original file, and put it through a different process to get it the way we need it.
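That "no undo, but keep the original" pattern can be sketched as well. A minimal illustration with hypothetical names, not NBC's implementation: the inbound file is archived untouched before any transform is applied, so a bad conversion can always be redone from the source:

```python
import shutil
from pathlib import Path

def process_and_archive(inbound: Path, archive_dir: Path, output_dir: Path,
                        transform) -> Path:
    """Apply `transform` to the inbound file, archiving the untouched original.

    Returns the path of the archived original, which can be fed back through
    a different workflow if the first conversion turns out to be wrong.
    (Names and the bytes-to-bytes `transform` signature are illustrative.)
    """
    archive_dir.mkdir(parents=True, exist_ok=True)
    output_dir.mkdir(parents=True, exist_ok=True)
    archived = archive_dir / inbound.name
    shutil.copy2(inbound, archived)            # retain the original as-is
    out = output_dir / inbound.name
    out.write_bytes(transform(inbound.read_bytes()))  # processed copy only
    return archived
```

The key design choice is that the transform never writes over its input: the archive copy is made first, so "undo" becomes "reprocess from the archived original."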
Over the past year there has been a lot of talk in the industry about how editors, for example, don't need to be on site. What do you see as the benefits?










