DNEG VFX Supervisor Huw J Evans on creating digital creatures, developing a brand new pipeline for working with Unreal Engine, and unearthing assets from the original films
After much ambivalence, and almost two decades after the conclusion of the original trilogy, Lana Wachowski has revived the world of the Matrix, bringing DNEG on board as lead VFX vendor.
DNEG created over 700 shots for the film, including enormous environment builds for the fetus fields and the IO Mega City, as well as a host of CG characters – synthients, sentinels and the shape-shifting Exo-Morpheus. The project also involved creating an entire sequence – the dojo fight scene – in Unreal Engine, something that hadn’t been done at DNEG before and required the development of a whole new pipeline.
One of the main technical challenges of the project was the sheer size and scale of the immensely detailed environments. Work on the IO Mega City was led by environment supervisor Ben Cowell-Thomas and lighting supervisor Simone Vassallo, who were joined by DFX supervisor Steve Newbold and environment supervisor Nigel Wagner for the work on the fetus fields. The IO city had to be built with enough detail to support a full fly-through sequence, with everything rendered at 4K. The fetus fields were similarly taxing: the field itself contained six million pod stalks, and each of the background towers comprised close to 20,000 parts.
The film takes place 60 years after the end of the trilogy, so the environments have been updated to reflect the developments of that time period while still retaining the overall aesthetic of the original films. The DNEG team received only concept art providing a broad overview of how the environments should look, so the detail work was theirs to figure out.
“Lana wanted the IO Mega City to feel like a mixture of being human-built, but with the influence of machines,” says Huw. “So we started looking at these little vignettes of action. We looked at things like farm sections, factory sections, residential block sections, all crammed into this sprawling cave environment. For the buildings we looked at brutalist-style base buildings, but then mixed that with some intricate 3D-printed-style architecture. We didn’t want it to feel too smoggy and dirty down there – it’s supposed to be clean and fresh – so we looked at clean energy factories and tiered farming plots. In this film they have a bio-sky, so they’re farming proper food rather than eating the slop that they used to eat in the old matrix. That helped with working out a logic for IO. So we’ve got tiered farming zones connected with water irrigation, walkways, marketplaces, and an endless array of tiny details to make a convincing city.”
An exciting moment of nostalgia came when the team managed to unearth 20-year-old Sentinel assets from the original films. “It was difficult to find them first of all, and then a challenge to find a version of Maya that was old enough to open them,” says Huw. “Obviously we couldn’t use them directly, because times have changed, topologies and detail levels have changed. But it was a great reference for us to get. We got the base shape sorted, and then we were careful about what we updated. Lana was adamant that the sentinels are the cockroaches of the Matrix world and they don’t really change. We did add some extra cables and wires because we had a lot of close-up shots and we wanted to make sure that the detail held up. In terms of texture and shading, we just bashed them up a little bit more – they’re a bit more used and worn.”
The Unreal dojo fight scene
For the dojo fight scene created in Unreal Engine, CG supervisor Roel Couke liaised with the Epic tech team, Ben Cowell-Thomas served as environment supervisor, and DFX supervisor Robin Beard oversaw the sequence.
Initially, Lana worked with Epic to create an environment based around Rakotzbrücke Devil’s Bridge in Germany, a semi-circular bridge whose reflection completes the circle when viewed at the right angle.
“Our plan was to push real-time rendering in Unreal to try and get final-quality renders that would hold up at 4K,” says Huw. Building the Unreal pipeline was a challenge. “We got a bleeding-edge release from Epic that hadn’t gone out to the public yet – we had to take the source code from them and build it at our end,” he explains. “We were having this back-and-forth dialogue with Epic to explain what tech we needed to be able to use the images in our pipeline – things like OCIO colour support and separating render passes out. All of that stuff you can do now, but at that time it was all new.”
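OpenColorIO is the open-source colour management standard that lets renders from different tools slot into a single colour-managed pipeline, and it is worth unpacking what “OCIO colour support” means in practice. DNEG’s actual integration isn’t public, but the documented PyOpenColorIO pattern below sketches the basic operation: converting pixel values from one named colour space to another under a show config. The config path and colour-space names here are placeholders, not DNEG’s.

```python
import PyOpenColorIO as ocio

# Load a show OCIO config (the path is illustrative).
config = ocio.Config.CreateFromFile("show_config.ocio")

# Build a processor from the renderer's working space to a review space.
# Colour-space names depend entirely on the config; these are placeholders.
processor = config.getProcessor("scene_linear", "sRGB")
cpu = processor.getDefaultCPUProcessor()

# Transform a single linear RGB sample. A real pipeline would push whole
# image buffers through the processor rather than one pixel at a time.
rgb = cpu.applyRGB([0.18, 0.18, 0.18])
print(rgb)
```

Once renders carry known colour spaces like this, Unreal’s output can be graded and composited alongside frames from any other renderer in the facility.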
The DNEG team received the scene from Epic and refined and extended it, adding details such as falling leaves and ripples on the water surface, and lighting the shots. Because it was all material from Unreal, the lighting work could be done by a single artist with some support, rather than a team of people – even though there were over 100 shots in the sequence. “We could block out the cut quite quickly and figure out things like lighting direction and how it was going to look through camera. So it was quite a big thing for us,” says Huw.
Overall, working with a new tool required a time investment, but produced some substantial benefits. “If we were to do it again, having already set up for the painful techie issues, it would be a lot smoother,” he says. “Having the ability to block out a whole sequence and view it at a decent quality, a proper rendered sequence, was hugely helpful. And being able to get a really good judgment of what the lighting should look like – really quickly – was super helpful as well.”
On the downside, there was some difficulty at first in achieving the quality level of the traditional techniques. “We have setups for doing water and ripples, we know how to make that look good. Having to recreate that in Unreal with a different toolset was something we struggled with initially and had to figure out, but if we were to do it again, a lot of those issues would be ironed out.”
Exo-Morpheus
One of the most challenging parts of the project, both technically and artistically, was creating the physical manifestation of Morpheus’ digital self, Exo-Morpheus – a character composed of what look like animated ball bearings. Creature supervisors Erica Vigilante and Mark Ardington, and FX supervisors Mike Nixon, Tamar Chatterjee and Tom Bolt led this part of the work.
“His complexity started in the concept stage,” says Huw. “Lana had originally envisaged him to be an abstract and elegant character, very fluid, loose and uncontained. But he eventually settled down into more of a humanoid form because he was a leading character and he had dialogue to do. It was really important for the audience to be able to relate to him and understand him visually.”
Multiple witness cameras were used on set to capture the performance, and the actor wore a faux tracking suit so that the body tracking team could get a reference for contrast and shapes, capturing as much detail as possible. A head-mounted camera was used to track his face.
Back at the office, they started out by treating him as if he were a standard digital character, doing body tracks, muscle simulation in Ziva, and skin simulation. These stages were important because the ball bearings would need to depict his muscles and skin, so that his body would feel real when solidified. “We wanted to ensure the balls aren’t just sliding around arbitrarily,” says Huw.
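The article doesn’t describe how the balls were attached to that simulated body, but the behaviour Huw describes – balls that ride the muscle and skin deformation rather than sliding over it – is commonly achieved by binding each particle to the rest-pose mesh once and re-evaluating that binding on the deformed mesh every frame. A minimal numpy sketch of the idea, with all names hypothetical and a crude nearest-centroid bind standing in for a proper closest-point query:

```python
import numpy as np

def bind_to_rest_mesh(ball_positions, rest_verts, tris):
    """Bind each ball to the nearest rest-pose triangle, storing the
    triangle index and least-squares weights over its three vertices."""
    centroids = rest_verts[tris].mean(axis=1)            # (T, 3)
    bindings = []
    for p in ball_positions:
        t = int(np.argmin(np.linalg.norm(centroids - p, axis=1)))
        a, b, c = rest_verts[tris[t]]
        # Solve p ~ w0*a + w1*b + w2*c for the weights w.
        w, *_ = np.linalg.lstsq(np.stack([a, b, c], axis=1), p, rcond=None)
        bindings.append((t, w))
    return bindings

def evaluate(bindings, deformed_verts, tris):
    """Re-project every ball onto the current frame's simulated mesh,
    so the balls follow the skin instead of sliding arbitrarily."""
    out = np.empty((len(bindings), 3))
    for i, (t, w) in enumerate(bindings):
        out[i] = w @ deformed_verts[tris[t]]
    return out
```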
The face data was handled by the animation team, who sometimes augmented the captured performance. “We knew that his face was not going to be as high fidelity as a straight up human would be because the balls do take away a certain amount of the resolution. So sometimes we had to push the expressions a little bit so they could be read through the balls.”
Sometimes a facial performance couldn’t be captured on set because the camera rig was blocking another actor’s face and had to be removed. In these instances the animation team created their own performance based on the data they had.
This material was passed to the effects team, who worked in Houdini. One of the first challenges was figuring out the right size for the particles. “We initially kept them at a consistent size, because that was a story point. But very quickly we lost all the nuances from his face. So we ended up doing multiple sized balls to get into the creases and around the eyes and the mouth in particular, so we would get that resolution without deviating too much from the original concept.”
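As a purely illustrative sketch of that sizing trick (not DNEG’s Houdini network), the mapping can be as simple as driving each ball’s radius from a per-point detail measure such as curvature, so the creases around the eyes and mouth receive smaller, denser balls while flatter regions keep the larger, story-driven size:

```python
import numpy as np

def ball_radii(detail, r_max=0.012, r_min=0.003):
    """Map a per-point detail measure (e.g. curvature, 0..1) to ball
    radius: more detail means smaller balls. All values hypothetical."""
    d = np.clip(detail, 0.0, 1.0)
    return r_max + d * (r_min - r_max)
```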
Multiple setups in Houdini were required to cover the wide range of different actions the character would perform. “We made some custom setups in Houdini so we could try to combine a mixture of procedural work and fully simulated elements. We wanted to make sure that we could art direct the performance, but still have the natural-feeling behavior that Lana was after. In some shots he stood fairly still so the simulations would be calmer and a little bit more flowing. There was this reference of seagrass that Lana liked, that kind of slow, controlled movement that still had a bit of freeness about it. That was our base. But obviously, we’ve got shots with him running and jumping and doing more actions, so things would have to be adapted accordingly. We couldn’t just have one setup.”
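One way to picture that procedural/simulated mix, again as a hedged sketch rather than DNEG’s actual setup, is a calm “seagrass” sway as the base layer with a per-shot weight dialling in the full simulation for the running and jumping beats:

```python
import numpy as np

def seagrass_sway(rest_pos, t, amp=0.02, freq=0.6):
    """Slow, controlled sway, phase-shifted per ball so the motion stays
    loose without losing seagrass-like coherence. Values invented."""
    phase = rest_pos @ np.array([2.1, 1.3, 1.7])
    return amp * np.sin(freq * t + phase)[:, None] * np.array([1.0, 0.2, 0.5])

def blend(rest_pos, sim_pos, t, action=0.0):
    """action=0 gives the calm procedural base; action=1 hands the balls
    over entirely to the simulation for more energetic shots."""
    procedural = rest_pos + seagrass_sway(rest_pos, t)
    return (1.0 - action) * procedural + action * sim_pos
```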
Lighting Exo-Morpheus was tricky, because the hundreds of small, reflective, moving balls generated a lot of visual noise. “As soon as you add a few point light sources to the scene, they’re all going to be reflecting the same thing and it starts to get a bit visually messy. So we couldn’t just rely on HDRIs that were captured on set, we had to creatively adjust quite a lot of our shots in order to get him looking like he’s sat in the scene, but also not visually distracting. It was a fine balancing act,” says Huw. The lighting team used a physical stand-in head covered in ball bearings on set as a lighting reference. It tended to have a disco-ball effect, which had to be toned down.
It was important to have a presence for Exo-Morpheus on set so that eyelines and interactions with other characters would be captured correctly, but this generated plenty of work for the prep team. Since there are gaps between the ball bearings, a clear view of whatever was behind him was needed at all times – so the stand-in had to be removed from every shot.