TVBEurope talks to the team behind a proof of concept looking at a standardised data pipeline allowing content makers to create 3D animated content quickly and collaboratively
By Jenny Priestley
Published: January 26, 2022. Updated: January 27, 2022
During its upcoming Production Technology Seminar, the European Broadcasting Union will be revealing more details of its proof of concept around using remote production for real-time animation.
As part of the IBC Accelerators programme, the proof of concept team developed a solution that explores a standardised pipeline that enables users to employ everyday technology such as smartphones and web cams as well as real-time render engines to create 3D animated content quickly and collaboratively.
The project has been led by the EBU's senior project manager Paola Sunna alongside RTÉ motion graphics designer Ultan Courtney. Early last year they came up with the idea of designing and building a low-cost pipeline for computer-generated characters, leveraging game engines, AI tools and off-the-shelf equipment.
"We got in touch with other broadcasters, including RAI, YLE, and VRT, and told them about the project and what we wanted to try and test," explains Sunna. "They came on board, and then we submitted a proposal to the IBC Accelerator programme. It was accepted, and so we started working with other champions and participants."
Just before the team submitted their PoC, Nvidia launched Omniverse, a scalable, multi-GPU, real-time reference development platform for 3D content creation. "So we agreed with the other broadcasters to develop two pipelines," continues Sunna, "one based on the beta version of Omniverse, and the other based on Unreal Engine. The team at RAI focused mostly on the Omniverse platform, which includes a solution called Machinima that is for cinematics: animation and manipulation of the CG characters using high-fidelity renders. They also used Audio2Face, which allows you to generate facial animation from just an audio source. You record the audio clip and it's used to animate the face of the CG characters."
The team wanted to decentralise the production pipeline by relying on what audiences already had in their homes or could easily access. "A lot of our project, believe it or not, involves saying no to potentially better and more expensive technology," states Courtney. "By building on the model of a connected series of home offices, we were able to connect all of the artists who were working internationally on the project. Often you're expected to have a kind of generalist skill set when you work with a broadcaster, and some of this required specialist knowledge. But at the same time, we also needed access to a few specialists, and when we came together we realised we had everything we needed to form a very well educated team."
The focus on collaboration and remote workflow meant the team was spread right across Europe, with 3D modellers in Belgium, an FX artist and vocal performer in Finland, innovators in Italy, and a sound designer in Ireland. "One of the team at the EBU set up a live working mechanism for cloud-based workflow through a system called Perforce," adds Courtney. "RTÉ were directing and doing performances as well, and then RAI were testing tools and refining and cleaning motion capture data.
"Even though we all had a generalist skill set, each one was able to push it slightly enough towards a specialisation without it turning into a full Hollywood pipeline. It was essentially trying to find a way to create a usable pipeline within the skill sets of broadcasters, and also making it as accessible as possible to talent who would be digitally captured and then perform using their voices."
Once recorded, the voices were entered into Audio2Face, which was employed in the Omniverse pipeline. "We were able to take in people's voice data, it would animate the face, and then we were also able to use a participating company called Respeecher