VFX supervisor Paul Lambert on his second collaboration with Denis Villeneuve, their philosophy of shooting for VFX, and how AI tech will transform everything
Artists behind the visual effects of Dune have been nominated for five VES awards, and Paul Lambert, Tristan Myles, Brian Connor and Gerd Nefzer have been nominated for both an Oscar and a BAFTA for their exceptional work on the feature.
In the following Q&A Paul talks about how his preference for understated visual effects combined with director Denis Villeneuve’s collaborative, organic style of filmmaking – and his dislike of green screens – to produce a shared philosophy of how to shoot for visual effects. On Dune, the ‘figure it out in post’ mentality was nowhere to be seen; instead they sought to take full advantage of all the amazing talent they had on set, coming up with creative practical filming techniques that would drive the success of the visual effects in post.
Paul also talks of his fascination with AI technologies, which he believes will transform every aspect of how visual effects are done, and his plans for AI passes in future films.
Tell us about your working relationship with Denis Villeneuve and why your collaboration has been so successful on two films now.
Denis is a visionary director; he basically had the movie in his mind. He knows exactly what he wants, so there’s never a time when we’ll decide to figure something out in post – he already has the visual in his head. That makes things far more efficient, and you get time to finesse rather than trying to work out what the base idea is going to be.
For example, Patrice Vermette spent the better part of seven months coming up with the beautiful concept art for Dune, and usually concept art is a springboard for additional ideas, but Denis was so happy with what had been produced that we built the sets and the flying machines exactly to those concepts. He always lamented that on Blade Runner he allowed things to slip away from him. Part of the process in visual effects is that when you are trying to match a piece of content, things always slip. The 3D model tends to deviate over time. But Denis was adamant that what he had seen in those pictures was exactly what we were doing.
Pre-production on Dune was one of the most collaborative experiences that I’ve had on a movie. We would all meet up every day to go through different ideas and come up with the best way of shooting things to be successful in post. He allowed us that time to figure things out. And he’s not a fan of blue screen or green screen, so we had to come up with different ways to actually do this work. He is an absolute joy to work with, and everyone has such high regard for him that it’s such a happy set – there are no big egos.
He and I have very much the same sensibility, which is that we don’t like flashy visual effects. I like to keep the visual effects work as hidden as possible, so you don’t know we’ve touched it. And Denis has a very grounded, real approach, so it works for both of us.
What were the challenges of shooting in the desert?
It’s hard because of the heat, and because you can’t take all of the equipment you would like. We tried to use nature as best we could. For example, in one particular scene Patrice had built one leg of the spice crawler and our director of photography Greig Fraser had positioned it in such a way that the sun would rise and set over the axis of the vessel leg, keeping the shadow consistent throughout the day – which meant that we could keep shooting for much longer.
For the night work, we would shoot towards the end of the day and then it was graded afterwards for a twilight look. You can’t shoot at night in the desert because it’s pitch black, and you can’t light it because it’s a massive expanse.
Denis has this saying that nature will supersede the storyboards which will supersede the previs. So when you turn up to a particular set, if it feels that things will go in a slightly different direction, you’d better be prepared because you’re going to be thinking on your feet trying to work out how to change things up so it actually works. It’s quite exhilarating working in that way because you are always thinking on your feet. I love it because it’s very organic, although it can be petrifying!
Is it fair to say that your way of working with Denis means you used fewer visual effects than you usually would?
Yes. It’s a philosophy of how to shoot for visual effects. Rather than putting somebody in a blue box, we’re saying, how can we take advantage of all this talent and experience on set right now? The amazing DOP, the amazing director, the amazing production designer, the amazing special effects, all the amazing departments. It’s about coming up with the best solutions on set which will then really drive the success of the visual effects.
So all these different procedures we came up with – shooting the interior of the ornithopter, shooting in actual sunlight, shooting with sand screens instead of blue screens – all these different techniques were created so that we would then be successful with the visual effects.
We do have a number of shots which are all CG, but they are usually surrounded by shots where we’ve had something substantial on set. That’s why we built those ornithopters and shipped them out to the Jordanian desert and Budapest, because having something real like that in camera makes such a difference. We would pick those things up with cranes and DNEG would add CG wings to them. Then when they flew too high, we would take over with a fully CG version. But having that on set and through the camera means that you’re just copying until it matches the plate. And when it does match the plate, you have something seamless. Everybody was on board with this idea. I have not experienced before how smoothly things ran.
Was there anything in particular that you would usually expect to do with visual effects that you ended up doing with practical techniques?
There was the question of how we would shoot the scene with Paul and the hologram. I was worried that to get the kind of interactive light on his skin and clothes would mean that we would have to create a CG version of Paul, and a really realistic one, to get that kind of subsurface that you see on skin when you have a light on it. I was quite apprehensive about that because I know it takes a long time, having done it a couple of times before.
What we came up with, with the help of James Bird (of DNEG London) and Mag Sarnowska (one of my on-set artists), was the idea to cut up the holographic bush into thin slices and project them onto Paul on set. We were interactively tracking where Paul was, so that when he moved, a different slice was projected onto him. When he moves forward and all the slices are changing, it actually feels as if he’s penetrated into the branches. And what you get is a light source actually being projected onto his face, his eyes and his clothes.

You do get a mess of light on the wall behind Paul, but the gold here is that you’ve actually got that interactive light on set, and that allows Greig to reposition his camera and Timothée to reposition his body to get the best lineup, rather than trying to figure that all out in post. Things can get a little bit more clinical in post, and having something tactile on set was the key to making it successful. So going down that path was really rewarding because it achieved the goal of getting true interactive light. Then in post, it was just a matter of cleaning up the mess of light on the wall and putting in the rest of the CG bush around him, which gave us the visual of him interacting with the hologram.

Even though DNEG London played a major role in the technology of the shoot for this sequence, it was Wylie Co who ended up taking on and finishing it. The shoot is peppered with different ideas like this – things we’ve done to try to give us the best basis for our post work. I find that gives the work such a different feeling; it’s far more organic.
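The core of the slice-projection trick is a simple mapping: the actor’s tracked depth along the projection axis selects which pre-cut hologram slice to project. A minimal sketch of that selection logic, purely illustrative – all names and numbers here are hypothetical, not the production setup:

```python
# Illustrative sketch of the slice-projection idea: the hologram is pre-cut
# into thin depth slices, and the actor's tracked depth along the projection
# axis picks which slice is projected onto him at any moment.
# All names and values here are hypothetical.

def select_slice(actor_depth_m, hologram_near_m, hologram_far_m, num_slices):
    """Map the actor's tracked depth to the index of the hologram slice
    that intersects his body, clamping outside the hologram volume."""
    if actor_depth_m <= hologram_near_m:
        return 0
    if actor_depth_m >= hologram_far_m:
        return num_slices - 1
    frac = (actor_depth_m - hologram_near_m) / (hologram_far_m - hologram_near_m)
    return min(int(frac * num_slices), num_slices - 1)

# As the actor walks forward through a (hypothetical) 2 m deep holographic
# bush cut into 40 slices, successive slices get projected, so the light on
# his face changes as if he were pushing through the branches.
positions = [0.0, 0.5, 1.0, 1.5, 2.0]
indices = [select_slice(p, 0.0, 2.0, 40) for p in positions]
print(indices)  # → [0, 10, 20, 30, 39]
```

The interesting design point is that the projector itself is the light source, so the interactive lighting comes for free in camera; only the spill on the wall needs cleanup in post.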
On Blade Runner the visual effects work was spread out over a number of vendors, but on Dune a larger share of the work was done by DNEG, the lead vendor. What are the advantages and disadvantages of the two setups – having a lead vendor doing most of the work, versus having it spread out more evenly over a number of vendors?
I think that having one main vendor is problematic. Fortunately, DNEG split the work for Dune across two separate locations. There were assets that were shared, but the two locations worked on different sequences and were very separate, so it was like having two vendors.
I’m not a fan of having just one vendor do it all on such a big show, because what tends to happen is that you might have 15 very big sequences and they are scheduled to be worked on sequentially – so work only starts on each one when the previous one is finished. That’s all well and good until you need material to take to a director’s review and they’re still held up on a previous segment. It’s difficult to explain to the director, and you’re getting pressure from the studio because they want to be able to see something to see if it works.
You usually get the facilities to do the temps, and that tends to stop progress on the main body of work. On Dune I did things a bit differently in that I had the in-house company, Wylie Co, work on the majority of them. They were able to get all these temps out super quickly and we were just doors away from Denis’ office, so we were all working very closely. Having that instant feedback from the director is so beneficial, rather than work going off to the facility and their version coming back a few days or weeks later. It also allowed DNEG to carry on working with the heavy lifting of the complex VFX shots rather than changing gears (and resources) to do the temp work. We really took advantage of that.
You can talk for a long time about scheduling and how the delivery of shots occurs very close to the deadline – it’s quite scary to receive hundreds of shots in that timeframe. So I prefer to split the work up, whether that’s with different sites at the same facility or different facilities.
What are you hoping AI technology will do for you in the next few years?
I think it’s going to change every single aspect of visual effects. Part of our process on set is to capture HDRIs, chrome balls and other references, and I’m already planning to be doing AI passes as well as that on future films. It’s fundamental to every single aspect of what we do, whether it be tracking, being able to build up geometry based off of multiple cameras, or any number of other processes. We already gather such an enormous amount of data on set from reference photography to footage and scans of actors, sets and props that with a bit more direction in how we capture this material we can train models for a multitude of purposes.
I ran some tests recently doing some face replacements, and the replacement actually kept the original color and lighting from what I was swapping. This is a game changer. We’re going to be working at a different level with our stunt face replacements because this way allows us to not have to match the lighting specifically.
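A classical analogy for what “keeping the original color and lighting” buys you is statistics matching: grade the inserted region so its per-channel mean and standard deviation match the plate region it replaces (Reinhard-style colour transfer). This sketch only illustrates that idea – it is not the AI method Paul describes, and all names are hypothetical:

```python
import numpy as np

def match_color_stats(src, ref):
    """Shift and scale src's per-channel statistics to match ref's,
    a crude stand-in for matching the plate's colour and lighting."""
    src = np.asarray(src, dtype=np.float64)
    ref = np.asarray(ref, dtype=np.float64)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):
        s_mu, s_sigma = src[..., c].mean(), src[..., c].std()
        r_mu, r_sigma = ref[..., c].mean(), ref[..., c].std()
        scale = r_sigma / s_sigma if s_sigma > 0 else 1.0
        out[..., c] = (src[..., c] - s_mu) * scale + r_mu
    return out

# A flat grey "replacement face" patch takes on the warm statistics of the
# plate region it is pasted over (both patches are synthetic test data).
rng = np.random.default_rng(0)
plate_region = rng.normal([180, 120, 90], 10, size=(8, 8, 3))
new_face = rng.normal([128, 128, 128], 5, size=(8, 8, 3))
graded = match_color_stats(new_face, plate_region)
print(np.round(graded.mean(axis=(0, 1))))
```

The point of the AI result Paul mentions is that this kind of grading happens implicitly and locally, so the replacement inherits the plate’s lighting without an explicit match step.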
I check the papers every single day to see what’s just come out in this area. I’d been so busy for the past five years that it had kind of passed me by, and now I’ve spent the last year obsessed with the whole process; it’s just absolutely fascinating. I think our current understanding of this is almost pre-Newtonian. It’s that early. We have these black boxes that are able to figure out the parameters we need, and we’re only just starting to realise what their full potential is. I can see a day when we will be able to take photography and change the overall key light. It’s something I’m keeping an eye on and running various tests with.
But at the same time as all of this, I work with people like Denis and I know that there’s always going to be a human, organic touch to his work. There’s never going to be a time when I say to Denis, let me just put somebody in a blue box and we’ll figure it out. So it’s about working with these advancements in machine learning, and then thinking about how to apply that to, in essence, traditional filmmaking – taking the best bits, but keeping those human, interactive components. It’s still a tool, and it doesn’t drive the actual story. It’s a tool in service to the story – that’s how I’m trying to approach it.
Would you say that some of these new technologies are actually going to enable you to use more hands-on, traditional filmmaking techniques?
Yes. We’re getting very close to being able to extract somebody from a piece of footage and keep it temporally coherent. Right now, things flicker and we haven’t quite figured out how to get it to be stable. But it is coming, and once you have that – and that goes to the part of me wanting to do AI passes today – I can see a world of shooting actors doing certain things, and using the footage to train a model to extract them from a background, without having to put up the screen. I think that’s getting very close. What that then allows is very much a traditional filmmaking approach. You don’t have to go to the expense of setting up and lighting the screen, and you set up the shot knowing what’s going to change in the background. It’s very much the philosophy of Dune that you try to set up your foreground and your work to give you the best basis for the background – it’s just that we used sand screens rather than AI technology.
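The flicker Paul describes comes from extracting each frame independently. A minimal (hypothetical) stabiliser is to exponentially smooth each frame’s alpha matte with the previous smoothed matte, trading a little lag for stability – real pipelines use far more sophisticated temporal models, but the sketch shows the trade-off:

```python
import numpy as np

def smooth_mattes(mattes, alpha=0.6):
    """Exponential moving average over a sequence of alpha mattes.
    mattes: iterable of 2-D arrays in [0, 1]; alpha weights the current
    frame, so lower alpha means more stability but more temporal lag."""
    smoothed, prev = [], None
    for m in mattes:
        m = np.asarray(m, dtype=np.float64)
        prev = m if prev is None else alpha * m + (1 - alpha) * prev
        smoothed.append(prev)
    return smoothed

# A synthetic matte that flickers between 0.2 and 0.8 every frame settles
# into a much smaller oscillation around its average.
flickery = [np.full((4, 4), 0.8 if i % 2 else 0.2) for i in range(20)]
stable = smooth_mattes(flickery)
frame_means = [s.mean() for s in stable]
residual_flicker = abs(frame_means[-1] - frame_means[-2])
print(round(residual_flicker, 3))  # → 0.257, down from the raw 0.6 swing
```

The residual oscillation of an EMA on a square wave settles at (2·alpha − alpha²)/(2 − alpha) of the input swing, which is why the 0.6 swing drops to about 0.257 here.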
We had started to play with AI towards the end of Dune, and after the movie was out all these new tools came into Nuke. So I had all this imagery of some of the characters with blue eyes and some without, and I trained a model inside Nuke that says ‘when you come across someone’s eyes, make them blue’. Then I fed it some new imagery, and it worked!
It’s fascinating, and it has kind of re-energised the whole visual effects business. The fact that Foundry, the makers of Nuke, have made these technologies available in an artist-friendly form is wonderful.