Martin Izzard, head of media and entertainment at Red Lorry Yellow Lorry, takes a look at the rise of virtual production
By Contributor
Published: May 14, 2021
The last few weeks have seen both the Real Time Conference and FMX take place, albeit virtually. The first is a technology-focused event covering a number of different industries (VFX included); the latter is a VFX-specific show whose panels and sessions are increasingly dominated by talk of real-time technology. That is, engine technology that can render digital assets, whether environments, characters or animation, in… well, real time.
Talk to any VFX studio or supervisor about the biggest cinematic or episodic projects of the last few years and you'll find real-time technology at work. It's also increasingly deployed on smaller productions, and it has long been the basis for mixed reality experiences, which arguably set the scene for some of the techniques now becoming standard in the real-time-enabled VFX toolkit.
I spoke to experts at technology vendors, VFX studios and service providers about the impact real time is having on today's VFX industry. Like any emerging technology, it's in a constant state of change and development, but one thing real time has quickly become synonymous with is virtual production, or VP.
The rise of virtual production
Ever since Jon Favreau and the ILM StageCraft team tackled the latest trip into a galaxy far, far away with a lot of LEDs and cameras that respond to digital environments in real time, filmmakers everywhere have wanted to use this so-called virtual production on their next project.
But Ben Lumsden from Epic Games, the company behind Unreal Engine, says it's important to distinguish between in-camera visual effects and other forms of virtual production. Thanks to its background, he says, Unreal Engine has been set up for all sorts of virtual production for about eight years, and it just keeps getting better at accommodating film, TV and animation workflows.
As everyone is now realising, VP can offer much more than just this particular technique. Pascal Achermann is CEO and technical director at VRFX Realtime Studio, a Swiss studio that uses the Unity engine to create mixed reality content and deliver virtual production services. By his definition of VP, "any time you are using real time to support your animation, film or commercial production or storytelling, you are producing virtually."
That means the impact real-time technology has, and the benefit it brings to your project, will differ depending on how you deploy it. Ed Thomas, head of real-time and virtual production at volumetric video capture studio Dimension, says VP allows filmmakers to create living, breathing worlds at much earlier stages in development than has previously been possible.
He adds that it means each filmmaking discipline "has access to key decision-making moments much earlier in the process and in a more collaborative environment, [which] democratises the creation of this content."
This prospect, that technology innovation benefits not just the filmmakers but everyone around them, shows how widely real-time technology can be deployed. That might be a production designer testing colours on a set wall without needing to pick up a paintbrush, or a lighting gaffer reviewing different lighting configurations at the touch of a button. It also means actors can better visualise a world or a digital character they otherwise wouldn't have seen until the premiere.
[Image: The Bourne Stuntacular]
In recent times, VFX studios' worlds have expanded far beyond film and TV screens to include attractions and immersive experiences where real time is widely used. For example, Cinesite recently won a VES Award for its work on The Bourne Stuntacular, a live show at Universal Studios Hollywood that takes place in front of a 130-foot-long LED screen. Unreal Engine was used to render 3D environments throughout production, providing real-time feedback and the flexibility to adjust timings and layouts on-site. Without a real-time engine in the mix, Cinesite's Salvador Zalvidea said, the project would have been "a painful and frustrating experience."
Benefits and challenges
Real-time technology allows creatives to experiment with ideas and make decisions early on that would otherwise take weeks to visualise with traditional VFX rendering. On The Bourne Stuntacular, for example, using real time for previs meant the team could review their work in a headset as soon as it was completed in Unreal.