4 Reasons the VFX in ‘Kingdom of the Planet of the Apes’ Deserve an Oscar
[Editor’s note: Massive spoilers for “Kingdom of the Planet of the Apes” below.]
Apes rule in “Kingdom of the Planet of the Apes,” which kicks off director Wes Ball’s post-Caesar saga some 300 years later. Wētā FX leveraged its tech from the previous “Apes” trilogy and the Oscar-winning “Avatar: The Way of Water” to push its performance-capture animation and VFX toward greater photorealism. That matters because the apes are chattier and the CG action set pieces are bigger (33 minutes of the film are entirely digital, a franchise first). It could all add up to the VFX Oscar the franchise has so far been denied.
After achieving great on-set facial capture in the rain and snow for director Matt Reeves’ “Dawn” and “War” films, Wētā embraced more expansive environments for “Kingdom,” along with more active and expressive apes. In the film, simians have begun building their own civilization in villages spread throughout the overgrown Pacific Northwest (shot in New South Wales, Australia), with decrepit skyscrapers in the distance. Ape cultural identity comes through more strongly in face paint and costume, showcased in the natural woven fibers and colorful designs of the central Eagle Clan as well as the metal and armor of militant ruler Proximus Caesar (Kevin Durand) and his marauders.
Wētā split its crew into four teams (overseen by production VFX supervisor Erik Winquist) to move through digitally enhanced and fully digital environments and to handle interactions with live-action performers such as Freya Allan’s mysterious human, Mae. Wētā designed and built 11 new high-resolution digital apes and shot facial capture in a variety of locations.
“Kingdom” is led by teenage hunter Noa (Owen Teague), who, unable to complete his rite-of-passage bonding with eagles, is forced to rescue the rest of his villagers, held captive by Proximus. There are also several secondary digital apes, including the wise orangutan Raka (Peter Macon), and some scenes required a few hundred apes, such as those set at Proximus’ encampment, which looks like a rusted shipyard turned military base.
“The thing that I found really appealing about the project, apart from the possibilities for visuals, is what the world looks like that far in the future, and how much of a tonal departure it is from the previous trilogy,” Winquist told IndieWire. “The Caesar trilogy focuses on this character who’s got the weight of the world on his shoulders. He needs to try and free his apes and find a new home. And this is a complete break from that where it’s just this fun adventure tale.”
The VFX supervisor also thinks Ball was perfectly suited to direct “Kingdom,” given his passion for the franchise and his previous introduction to virtual production and performance capture on his shuttered “Mouse Guard,” based on David Petersen’s graphic novel (a casualty of the Disney/Fox merger). “Then he was asked to do an ‘Apes’ film, and he was already prepared in terms of the technology,” added Winquist. “The way we captured it, the way we shot it, it’s all very much the same kind of production approach that we did on the previous ‘Apes’ films. But we were able to leverage a few techniques that we didn’t have on ‘War.'”
New and Improved Ape Performance Capture
For “Kingdom,” Wētā applied its recent tech advancements in new ways, including dual-camera facial rigs that capture the detail and emotion of a performance with greater fidelity than the previous films in the franchise. The studio also captured depth data on set, pairing it with Simul-cam tech from the mocap stage to inform the integration of digital characters when shooting clean plates, and used data from the location to aid the final composite.
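As a rough illustration of why per-pixel depth from the plate helps when placing a digital character, the short Python/NumPy sketch below does a bare-bones depth merge: at every pixel, whichever element sits closer to camera wins. It is a generic z-compositing example with invented values, not the Simul-cam toolchain itself.

```python
import numpy as np

def depth_merge(plate_rgb, plate_depth, cg_rgb, cg_depth):
    """Per-pixel depth composite: show the CG element wherever it is closer
    to camera than the live-action plate, otherwise keep the plate.
    RGB inputs are H x W x 3 arrays; depth inputs are H x W distances."""
    cg_in_front = (cg_depth < plate_depth)[..., None]  # broadcast over RGB
    return np.where(cg_in_front, cg_rgb, plate_rgb)

# Toy 1x2-pixel frame: the CG element is in front only in the second pixel.
plate = np.array([[[0.2, 0.5, 0.1], [0.2, 0.5, 0.1]]])
cg    = np.array([[[0.6, 0.4, 0.3], [0.6, 0.4, 0.3]]])
print(depth_merge(plate, np.array([[3.0, 3.0]]), cg, np.array([[5.0, 1.0]])))
```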
Crucially, Wētā refined its tools for muscle simulation, facial animation, grooming, water simulation, texture painting, shaders, and rendering to create more realistic apes. It did so while adhering to the same FACS-based blend-shape facial solver rather than converting to APFS (Anatomically Plausible Facial System), the game-changing muscle-based facial system created for “The Way of Water.” That system is more animator-friendly because artists manipulate the muscles directly on the model to get more nuanced performances, but there was neither the time nor the budget to switch for “Kingdom.” Wētā expects to see greater dividends from it on future “Apes” films.
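For context on the distinction, a blend-shape solver resolves a captured expression into weights on a library of sculpted face shapes, which are then summed on top of a neutral mesh, while a muscle system drives the skin from simulated anatomy. Below is a minimal, generic sketch of the blend-shape side in Python with NumPy; it illustrates the general technique, not Wētā's solver, and the shape names are invented.

```python
import numpy as np

def evaluate_blend_shapes(neutral, shape_deltas, weights):
    """Classic blend-shape evaluation: start from the neutral face mesh and
    add each sculpted shape's per-vertex offset, scaled by its weight.
    neutral      : (V, 3) vertex positions of the resting face
    shape_deltas : dict of shape name -> (V, 3) offsets from neutral
    weights      : dict of shape name -> float, typically in [0, 1]
    """
    mesh = neutral.copy()
    for name, delta in shape_deltas.items():
        mesh += weights.get(name, 0.0) * delta
    return mesh

# Toy example: a three-vertex "face" with one hypothetical "jaw_open" shape.
neutral = np.zeros((3, 3))
deltas = {"jaw_open": np.array([[0.0, -1.0, 0.0],
                                [0.0,  0.0, 0.0],
                                [0.0,  0.0, 0.0]])}
print(evaluate_blend_shapes(neutral, deltas, {"jaw_open": 0.5}))
```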
Still, the results are far superior to “War,” with dramatic improvements in animating the apes, who have evolved significantly in the sequel: they are more upright with longer legs, ride horses, display a wider range of locomotion, and are far more articulate. Wētā achieved this through, among other things, stereo facial capture and the complex muscle network driven by its deep learning facial solver.
“This is the first ‘Apes’ film where we’ve been able to leverage the technology from Thanos [in ‘Avengers: Infinity War’ and ‘Endgame’] and Will Smith in ‘Gemini Man,'” animation supervisor Paul Story told IndieWire. “It just gives that extra layer of realism. And it gives a little delay to the deeper tissue areas in the face, which helps to break up the A to B feel of blend shapes sometimes.”
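To picture what that delay does, the toy Python sketch below lags an animated blend-shape weight with a simple exponential filter, so a hard A-to-B jump in the driving curve arrives as a softened, slightly later response, the kind of effect deeper-tissue simulation layers on top of the shapes. It is a loose illustration only, not Wētā's deep learning solver.

```python
def lag_weight_curve(weights, smoothing=0.25):
    """Exponential lag over a per-frame weight curve: lower smoothing means
    more delay and softness; smoothing of 1.0 reproduces the input exactly."""
    out, current = [], weights[0]
    for target in weights:
        current += smoothing * (target - current)  # ease toward the target
        out.append(round(current, 2))
    return out

# A hard A-to-B jump in the driving weight becomes a delayed, softened response.
drive = [0.0] * 5 + [1.0] * 10
print(lag_weight_curve(drive))
```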
Plus, they had to deal with a lot more lip-syncing. Fortunately, the talkative Bad Ape (Steve Zahn) from “War” paved the way. “Paul’s team had to figure out what that looked like,” Winquist added. “They definitely had to explore the ranges of how, comparing how humans talk and then comparing that to how big an ape’s muzzle was, and you’d change those ratios to make it believable on the actual ape. That’s part of our whole mapping when we go from the actor to the ape as we transfer that motion. Andy Serkis [Caesar] generously gave his time to give the actors some pointers on their movement over Zoom. But also that process of talking through what worked for him with Caesar in terms of where his voice was coming from down deep in his diaphragm.”
The Siege of Noa’s Village
When Noa returns to his village at night, he finds it under attack by Proximus Caesar’s marauders. It was shot by Ball’s go-to cinematographer, Gyula Pados, mostly as a hand-held one-shot, where we follow Noa staggering around his village in shock, surrounded by flames, trying to elude the marauders.
“That involved a lot of time,” Winquist said. “Wes and Gyula had been kicking ideas around about wanting to do a big, ‘Saving Private Ryan’-esque character stumbling through action sequence. So Gyula had actually prevised that in Unreal Engine over a weekend. That became the blueprint for what we started talking about with the stunt team and how this might work on our location, which was a private location, to hit the beats.
“He’s going to get hit by a horse going that way,” Winquist continued. “They walk around, he stumbles this way. Another horse comes this way. Another horse goes that way. Working out all of these performance beats and then figuring out where the camera needed to be at any given point. So we ran through rehearsals four or five times a day at that location, which was a private property. And we waited for the light to get it right, when we did the whole thing again, clean.”
By this point, the rehearsals were about building muscle memory for camera operator Ryan Weisen; the special effects team then triggered flame bars so Weisen could run the continuous Steadicam move, hitting all the beats in a completely empty environment. “That gave us our clean plate,” added Winquist. “And so our department built for us one story’s worth of the first tower, and then that was going to be a digital extension, and then the back tower that we saw on fire is completely digital. But it gave us all the rest of the environment that was there. And it was real for as much as we could do, which freed us up to digitally augment the rest in post.”
Recreating ‘The Hunt’
There are many homages to the original “Planet of the Apes” (1968), but the recreation of “The Hunt” provides the most nostalgic thrill, including composer John Paesano’s shout-out to Jerry Goldsmith’s eerie theme. In the original, Charlton Heston and his fellow astronauts are hunted and captured by talking gorillas on horseback. In the reworking, marauding apes hunt and kill mute humans, with Mae resourcefully fleeing capture in the tall grass and Noa coming to her rescue on horseback.
The horseback scene was shot across three different locations, which in some places meant putting CG trees off in the distance and adding CG grass, both for interaction and for continuity. “It was also challenging to pull off, partly because we were doing on-set performance capture, but the further we would spread ourselves out in a space, the harder that becomes. So we needed to lean on different techniques for the capture process.
“When we’re in our mocap stage, in a nice controlled stage, we’re using our standard mocap performance capture with passive markers on the suits,” Winquist added. “When we’re doing stuff outdoors, we need to combat the sunlight, so now we’re using active markers, which actually emit infrared light that is picked up by the cameras. But when we’re on the move, it takes time to set up a mocap volume outside and calibrate it and maintain it. That’s when we use a third technique called FOCAP, where actors wear black-and-white checkered bands on their suits as well. That’s just using witness cameras and piecing those together. We needed to lean on that technique a lot because we were running down rivers, riding horses down rivers, and riding horses through big fields.”
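For a sense of how footage from ordinary witness cameras can be turned back into 3D marker positions, the sketch below shows textbook linear triangulation in Python with NumPy: given two calibrated camera matrices and the pixel position of the same checker band in each view, it solves for the 3D point. The cameras and numbers are invented for illustration; this is the general principle, not Wētā's FOCAP solver.

```python
import numpy as np

def triangulate_marker(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one marker from two calibrated witness
    cameras. P1, P2 are 3x4 projection matrices; x1, x2 are the (u, v) pixel
    positions of the same checker band as seen in each view."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # back to ordinary 3D coordinates

# Two invented cameras one unit apart, both looking down +Z.
K = np.array([[1000, 0, 960], [0, 1000, 540], [0, 0, 1]], float)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 5.0, 1.0])              # marker 5 m from camera
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate_marker(P1, P2, x1, x2))             # ~[0.2, 0.1, 5.0]
```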
The Rushing Water Sequence
Another big challenge for Wētā was water simulation (a river sequence alone required 1.2 petabytes of disk space), particularly the immense rushing water that floods a weapons silo during an exciting action sequence. Fortunately, the studio was able to make use of Loki, the multiphysics simulation workflow developed for “The Way of Water.”
“Knowing what we had been working on in terms of simulation and technology and improvements for that film, it was clear that there was at least a pathway forward for what we needed to do for this movie,” Winquist said. But it was a godsend that the water wasn’t crystal blue and that they didn’t have to do underwater performance capture.
“It’s all very dirty, churned up, foamy, frothy, angry,” continued Winquist. “That required the same toolset but a different focus. But the nice little fleeting glimpses — it’s chaos. And so, it meant that we didn’t have any nuanced, real performance there that we had to capture. But we definitely did some stuff for our hero performances when they’re floating and bobbing at the surface.
“We just had [10-pound sandbags] on the actors’ legs to simulate wading through water and dragging them around office chairs to at least give them some way to sort of struggle against something,” Winquist added. “This worked out so well as they’re chasing each other, and pushing off through obstacles to keep from getting to Noa.”