Feature

The Future of Digital De-Aging Has VFX Artists “Excited” and “Terrified” All at Once

Inverse speaks to two experts about the challenges (and future) of digital de-aging.

by Logan Plant
Lais Borges/Inverse; Walt Disney Pictures; Getty
The Summer Blockbuster Issue

It’s easy enough to spot the CGI when Harrison Ford shows up in a movie and the 80-year-old actor suddenly looks like a spry 45 — a very real thing that happens in the upcoming Indiana Jones and the Dial of Destiny — but “digital de-aging” has seeped surprisingly deep into the bedrock of Hollywood.

“There are many instances of de-aging on television shows that most aren’t even aware of,” VFX artist Yoshi Vu tells Inverse. “Not for the sake of flashbacks but simply to make the actor look younger or more attractive, the same way makeup is used.”

Then again, sometimes it’s very noticeable. Decades after we first saw characters like Indiana Jones and Luke Skywalker on the big screen, their original actors are still portraying them in sequels and spin-offs. And in some cases, Harrison Ford and Mark Hamill appear as they did back in the days of the originals.

To understand the art of digital de-aging, and how it’s colliding with the unprecedented pressures facing special effects artists today, Inverse interviewed two experts about the process, the challenges, and how rapidly advancing technology could overhaul the industry entirely.

How Digital De-Aging Works

A digitally de-aged Mark Hamill in The Book of Boba Fett.

Lucasfilm

“People believe it’s simply a button-click process,” says Vu, who’s worked in VFX since 2010. (His credits include The Walking Dead, League of Legends Odyssey, and several Marvel and Star Wars projects.)

That couldn’t be farther from the truth. “While it is true that software technology makes the job easier with each advancement, it’s still down to artistic talent and skill,” he says.

De-aging doesn’t happen in a vacuum, either. While the process varies from movie to movie and actor to actor, one thing is generally true: It starts with a conversation between the VFX artists and the filmmakers.

“We first work with production to zero in on a target age,” says Trent Claus, a supervisor at Lola VFX with over 16 years of experience in digital age manipulation (including credits on The Curious Case of Benjamin Button, Captain America: The First Avenger, and several more Marvel movies).

After meeting with filmmakers to set de-aging goals, Claus’ team uses old footage of the actor as a reference. They’ll run multiple tests to see what age looks best, with factors from the movie like movement, lighting, and the environment playing a crucial role in what they can get away with. A character making rapid movements or a scene set in stormy weather creates more work for the visual effects team, for example. The more that’s going on in a scene, in general, the harder it is to make a de-aged actor look natural and real.

Samuel L. Jackson in Captain Marvel.

Marvel Studios

The desired age also factors in heavily.

“On the artistic side, certain age ranges are harder than others,” Claus says. “Trying to take someone from 25 to 15 is much harder than trying to take someone from 35 to 25.”

Then again, some actors simply have better faces for de-aging than others.

“People age very differently from one another,” he adds. “So some people are simply harder than others.”

“We want to make sure the work we do never gets in the way of the performance and the storytelling.”

Each shot presents a different challenge, often requiring a mix of computer-generated face replacement, 2D projection tricks, digital frame-by-frame painting, and deepfake technology. However, it turns out the most difficult features to de-age are the eyes.

“Eyes are usually the toughest,” Vu says. “I’ve generally seen the process done in a way that tends to keep the actor’s eyes in the final composition when possible or as an option to fall back on.”

But above all else, the special effects need to serve the movie itself.

“We want to make sure the work we do never gets in the way of the performance and the storytelling,” Claus says.

VFX Crunch and Avoiding the “Uncanny Valley”

Al Pacino in The Irishman (2019).

Netflix

In 2022, as Marvel’s cinematic assembly line began to show its first signs of wear and tear (including some subpar CGI), VFX artists started speaking out about the brutal working conditions at special effects studios stretched too thin and pushed too hard. Tales of crunch, overtime, and unrealistic expectations from Marvel, in particular, have led to MCU projects that don’t look as visually impressive as fans expect. (Not to mention what’s happening at the studio’s biggest competitor.)

With de-aging, the final product has to be essentially perfect for audiences to buy in. As Vu puts it, if something is even “one percent off here, and two percent off there,” it creates an “uncanny valley” effect that can pull viewers out of the experience.

“The amount we have to squeeze out of our brains like a sponge can be draining at times.”

Both Claus and Vu believe the biggest hurdles facing teams working on digital de-aging are the same ones confronting the rest of the VFX industry: planning, time, and cooperation.

"Without proper planning, or if deadlines are unrealistic, it can be very difficult sometimes," Claus says. "But our artists can achieve almost anything if given the proper support."

Vu agrees: “The bar keeps getting raised, but the resources and time don’t seem to scale along with it. The amount we have to squeeze out of our brains like a sponge can be draining at times. Staring at one set of wrinkles by a mouth, or the way the eyebrows protrude from the skin, can be taxing when done for extended periods of time.”

Sean Young as Rachael in Blade Runner 2049.

Warner Bros. Pictures

And then there’s the rise of artificial intelligence. While the rapidly evolving technology is a concerning development in a number of creative fields, Vu and Claus both believe more advanced AI could significantly improve the digital de-aging process.

Claus thinks AI could be used to fill in the blanks when artists don’t have enough reference material for what a younger version of an actor looks like. “It’s likely we’ll see strides in the AI’s ability to guess when we aren’t able to provide enough data,” he says. But even then, it will still take a human artist to make the final product look good.

“I’m not sure if I’m excited or terrified by that, but I’m definitely curious.”

Vu isn’t so sure there are any limits to the power of artificial intelligence in the long run. Noting that “deepfake” technology has already given him unprecedented new tools, he argues that as AI evolves, its potential for VFX could approach the realm of science fiction.

"With a lot of the newer advances, digital tools are becoming simpler and more straightforward,” he says. “I personally think that filmmaking in the future won't be too far off from how [Star Trek's] holodeck works.”

“I’m not sure if I’m excited or terrified by that, but I’m definitely curious.”
