Creating HERE - Utilising AI De-Aging Tech Caroline Shawley February 24, 2025
Forrest Gump director Robert Zemeckis has once again teamed up with Tom Hanks and Robin Wright for his latest movie, Here - based on the graphic novel by Richard McGuire.
The movie uses an AI-assisted aging and de-aging process to convey the lifetime of a married couple, played by Hanks and Wright. The film, shot from a single, fixed camera position and angle while the narrative jumps back and forth in time, brought unique challenges to the production and post team - Don Burgess, cinematographer; Kevin Baillie, VFX supervisor; and Harbor senior colourist Maxine Gervais.
A team reunited
Here not only reunited actor Tom Hanks with actress Robin Wright; it also brought back together the team working behind the scenes. Gervais' first collaboration with Burgess was in 2010, on The Book of Eli, and Gervais, Baillie and Zemeckis's team also worked together on Welcome to Marwen.
"Don is a very professional man who has been doing this for a long time," says Gervais. "I remember at the beginning of the movie there were rumours that it would be 'easy' for us all, as it was 'one point of view' with the camera not moving. That made us giggle, as we knew there would be no such thing as 'easy'."
"Don and Kevin are so efficient at executing Zemeckis's visions - they are a well-oiled machine," adds Gervais. "They are very creative in finding new ways to do things that haven't been done before. Kevin is one of a kind, always pushing boundaries and ahead of his time for sure - he and Don work in harmony."
Early discussions and testing
Cinematographer Don Burgess began concept discussions and camera and lens test shoots in June 2022, along with director Robert Zemeckis, VFX supervisor Kevin Baillie and colourist Maxine Gervais.
"Bob Zemeckis expressed the idea of locking off the camera and shooting from one position on the planet," says Burgess. "Most of the movie takes place in the hero house, with some scenes set before the house was built, or fading in as characters enter the house. The set was still in design, but we had a good idea of its size and where the window and front door would be."
"It took many hours of trial and error to find the perfect spot from where to tell the story," adds Burgess. "We had to talk through and test every scene before we started shooting. The lighting was designed for every hour of the day and every day of the year. The weather was also talked about with Bob - will there be cloud, sun, rain, sleet or snow? Once again, all worked out before we started shooting."
Following thorough testing, Burgess picked the Panavision 35mm P70 series lenses and the RED Raptor camera.
With VFX so heavily involved in the movie, Kevin Baillie was also part of these very early discussions.
"I was invited into discussions before Bob had even finished writing the script with Eric Roth," recalls Baillie. "Different directors deal with visual effects in different ways, and Bob happens to be one that considers visual effects a key component of the process. This is not only fortunate and fun, but it also allows us to plan how best to shoot the movie."
"Often, people think of visual effects as merely a process that happens in post-production, whereas really good visual effects are considered as a tool to aid in the course of production," adds Baillie. "Especially when you're doing something new or extensive like we were for this film. We had approximately 43 scripted minutes of de-aging that went as far as to require a full digital face replacement of our four lead actors. And this ended up expanding to 53 minutes in the final film."
"We knew we weren't going to be able to accomplish this through traditional face replacement techniques with CGI, because it was going to be too expensive and time consuming. And it would also be difficult to maintain consistent quality, so we knew we would have to rely heavily on machine learning and AI-based techniques. We spent time doing the necessary diligence to figure out what techniques would work and what vendors we could partner with - something that we absolutely had to do before the shoot."
LED wall
The team shot the interior of the house on two sound stages at Pinewood, and the window exterior was an LED wall portraying over 80 different eras, weather conditions and times of day in the neighbourhood.
"The LED wall required a lot of prep work," comments Baillie. "We used Unreal Engine to create the world outside and, because we had a real-time environment out the window, Don was able to adjust the lighting to match the time of day or weather to accommodate what he wanted to see in the story. He could then adjust his practical lights to match what was happening outside."
"Most of the shots had the LED wall showing exterior background out the window," comments Burgess. "The wall performed best at a cool colour temperature, so we set up the camera at 1600 ASA, 4300K and exposed at T5.6. The LUT was set up on set with Maxine and our DIT, Chris Bolton."
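The stop Burgess quotes can be put in context with the standard exposure-value formula, EV = log2(N²/t). A minimal sketch in Python, assuming a 1/48 s shutter (a 180-degree shutter at 24 fps, which the article does not specify):

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """Standard exposure value: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_s)

# T5.6 at an assumed 1/48 s shutter (180-degree shutter at 24 fps)
print(f"EV at T5.6, 1/48 s: {exposure_value(5.6, 1 / 48):.2f}")

# Opening up one full stop (to T4) lowers the EV by about 1,
# i.e. the scene needs roughly half as much light.
print(f"EV at T4.0, 1/48 s: {exposure_value(4.0, 1 / 48):.2f}")
```

Raising the ASA from 800 to the quoted 1600 buys the same one-stop gain in sensitivity, which is why a cooler, dimmer-performing wall can still be exposed at T5.6.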
Graphic panels
The story was adapted from Richard McGuire's graphic novel, which made heavy use of panels (a picture-in-picture effect, where parts of the image show different scenes from different times of day or different eras).
Twists in the fabric of space and time: 1964, from McGuire's Here. Source: The Guardian
Zemeckis used this same graphic novel/comic book iconography for the film, which brought specific challenges for the team.
"The transitions between scenes were primarily done through graphic panels, in the style of the original graphic novel," says Baillie. "So, we could have multiple scenes playing on screen at any one time and juxtapose two moments that were spiritually or thematically 'connected' - which is something that you can't really do in a traditional movie."
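The panel effect Baillie describes is, at its simplest, a rectangular region of one scene composited over another frame. A minimal sketch in Python, with frames reduced to toy 2-D grids of labels (all names here are illustrative, not from the production):

```python
# Minimal sketch of a graphic-novel "panel" composite: a rectangular
# region of one scene is pasted over another scene's frame.

def composite_panel(base, panel, top, left):
    """Return a copy of `base` with `panel` pasted at (top, left)."""
    out = [row[:] for row in base]  # deep-copy the base frame
    for r, row in enumerate(panel):
        for c, px in enumerate(row):
            out[top + r][left + c] = px
    return out

# A 4x6 frame from one era ("A") with a 2x3 panel from another ("B")
base = [["A"] * 6 for _ in range(4)]
panel = [["B"] * 3 for _ in range(2)]

framed = composite_panel(base, panel, 1, 2)
for row in framed:
    print("".join(row))
# AAAAAA
# AABBBA
# AABBBA
# AAAAAA
```

In production this would of course be done per-pixel on full frames in a compositing package, but the structure is the same: each panel is an independent scene cropped into its own region of the shared frame.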
The process of creating and