Avatar: The Way Of Water – how gaming tech helped bring Oscars nominee to life, and could change filmmaking forever

From the Indiana Jones-esque adventures of Lara Croft to the increasingly Pixar-quality cartoon visuals of Super Mario, video games have long looked to Hollywood for inspiration.

But recent years have shown that the inspiration increasingly flows both ways.

While you don’t have to look far these days for a film or series based on a popular video game (The Last Of Us and Sonic The Hedgehog are just two, with Mario himself in cinemas soon), it goes much deeper than you might think.

“These worlds have been converging for a decade now,” says Allan Poore, a senior vice president at Unity, a video game development platform increasingly turning its hand to films.

“And for the most part, the core principles are actually the same.”

Indeed, modern video games look so good that the technology behind them is quite literally changing the way blockbusters are made – including the very biggest of them all.

Avatar: The Way Of Water was comfortably the highest-grossing film of 2022 – fitting, given it’s the sequel to the highest-grossing film ever made.

James Cameron’s latest blockbuster is up for best picture at Sunday’s Academy Awards – and success in technical categories like visual effects seems all but assured.

Image: Avatar is up for best picture and numerous technical awards. Pics: 20th Century Studios

The tech behind Avatar

Many of the tools used to bring The Way Of Water to life came from Unity’s Weta Digital division.

Unity bought the tech assets of Weta, the New Zealand-based visual effects firm founded by Lord Of The Rings director Peter Jackson, for some $1.6bn in 2021. Jackson still owns the now separate WetaFX, a more traditional visual effects house that, somewhat confusingly, also worked on Avatar.

But what Unity’s deal did was bring a team of talented engineers, used to working on films, under the umbrella of a company best known for its accessible video game engine.

Think of a game engine like a recipe kit – it contains everything you need to make a game. Some are designed to help build specific types of games – like a shooter or a sports title – while others are more broad-brush.

Unity has been used on everything from indie titles to entries in the Call Of Duty and Pokemon franchises.
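
To make the "recipe kit" idea a little more concrete, here is a deliberately toy sketch in Python. None of it is Unity’s actual code or API (real engines are written in languages like C++ and C#, and the class names below are invented); it simply shows the division of labour, with the engine supplying the reusable ingredients and the developer writing only the rules of their particular game.

```python
# A toy sketch of the "recipe kit" idea. The class and method names are
# hypothetical, not Unity's real API: the point is only that the engine
# supplies the reusable ingredients (input, timing, drawing) while the
# developer writes just the rules of their particular game.
import time


class ToyEngine:
    """Stands in for the reusable parts a real engine provides."""

    def poll_input(self):
        # A real engine would read a controller or keyboard here.
        return "move_right"

    def draw(self, scene):
        # A real engine would render graphics; we just print the state.
        print(f"frame drawn: player at x={scene['player_x']}")

    def run(self, game, frames=3):
        # The core loop: read input, update the game, draw the result.
        for _ in range(frames):
            action = self.poll_input()
            game.update(action)
            self.draw(game.scene)
            time.sleep(1 / 60)  # aim for roughly 60 frames per second


class ToyPlatformer:
    """The only part the developer writes: this game's own rules."""

    def __init__(self):
        self.scene = {"player_x": 0}

    def update(self, action):
        if action == "move_right":
            self.scene["player_x"] += 1


if __name__ == "__main__":
    ToyEngine().run(ToyPlatformer())
```

Broad-brush engines such as Unity or Unreal leave the game-specific part almost entirely open to the developer, which is why the same kit can power anything from an indie puzzle game to a blockbuster shooter.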

Jackson said the fusion of expertise, known as Weta Digital, would be “game-changing” for creators.

What makes video games tick is that the rendering of the worlds players explore is done in real time. That’s because a game can play out differently depending on what the player does – it’s not fixed like a film or TV show.

Just think of the scene in The Wrong Trousers where Gromit lays the train track as he speeds along it and you will get the idea.

That’s hugely different to how films have traditionally handled visual effects, where the rendering all happens during post-production – it’s why you’ll see behind-the-scenes footage of actors standing in big green rooms, or talking to tennis balls on the ends of sticks. All the computer wizardry is added after the fact.
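
To see why that difference matters, here is another deliberately simplified Python sketch, with invented function names rather than any studio’s real pipeline. The game loop has to produce each image the instant the player acts, while the film pipeline can grind through a fixed list of shots long after the cameras have stopped rolling.

```python
# A toy contrast, with invented names rather than any real pipeline:
# a game must render each frame on the spot because it cannot know what
# the player will do next, while a traditional film renders a fixed list
# of shots slowly, long after the shoot has wrapped.
import random
import time


def render(description):
    """Stand-in for the expensive work of producing one image."""
    return f"image of {description}"


def real_time_game(frames=3):
    # Each frame reacts to input that only exists at that moment, so the
    # image has to be ready within a fraction of a second.
    position = 0
    for _ in range(frames):
        player_input = random.choice(["left", "right"])  # unknown in advance
        position += -1 if player_input == "left" else 1
        print(render(f"hero at position {position}"), "(made instantly)")
        time.sleep(1 / 60)


def offline_film(shot_list):
    # Every frame is fixed by the script and the edit, so each one can take
    # minutes or hours on a render farm; nobody is holding a controller.
    return [render(shot) for shot in shot_list]


if __name__ == "__main__":
    real_time_game()
    for frame in offline_film(["a banshee in flight", "the reef at dusk"]):
        print(frame, "(made later, at leisure)")
```

The breakthrough behind virtual production is that the "instant" half of that sketch now looks good enough to stand in for the finished frame.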

Image: James Cameron on the set of Avatar, which featured underwater motion capture. Pics: 20th Century Studios

‘How do you speed up filmmaking?’

And while The Way Of Water still leaned heavily on those traditional methods, parts of the production were powered by new real-time tools that let Cameron and his cast and crew see a picture of the finished product as they worked on set.

“How do you speed up filmmaking? You do it by showing artists and directors, as quickly as you possibly can, a representation of what that frame is going to look like,” says Poore, who worked on the hit animated films Ratatouille, Incredibles 2, and Coco during his time at Pixar.

“Directors will use a screen that is actually showing real-time components, so they can see what the scene and surroundings will look like as they film.

“Hopefully they’re going to help make film production smoother, easier, and faster.”

With Avatar 3 less than two years away, rather than the 13-year gap that separated the first two films, that assessment may well prove correct.

A galaxy far, far away…

Unity’s rivals have also taken advantage of just how photorealistic real-time visuals have become to make their own moves into filmmaking, in some cases taking things even further.

The Mandalorian, the hit Star Wars show that returned for its third series this month, uses an immersive soundstage called The Volume to put its actors into whatever fantastical scenarios its writers can dream up.

Image: Obi-Wan Kenobi and The Mandalorian used gaming tech for their virtual sets. Pic: Lucasfilm

Rather than relying solely on green screens, with the effects added during post-production, The Volume boasts an enormous wall of LED screens that display digital environments built with Epic Games’ Unreal Engine (which powers the popular shooter Fortnite) in real time.

It means the actors know where their characters are supposed to be, and changes can be made on the fly.

Two recent comic book films have also used the technology – last year’s The Batman and last month’s Ant-Man threequel.

Star Wars actor Ewan McGregor worked in The Volume during his return to the franchise last year, and hailed its transformative impact compared to the prequel films he made 20 years ago.

“It was so much blue screen and green screen, and it’s just very hard to make something believable when there’s nothing there,” he said. “And here we were [on Obi-Wan Kenobi] in this amazing set where if you’re shooting in the desert, everywhere you look is the desert, and if you’re flying through space, the stars are flying past you. So cool.”

Image: Ewan McGregor was complimentary about the difference the tech made to filming. Pic: Lucasfilm

‘It’s a huge change’

While Poore doesn’t see the need for traditional digital effects techniques evaporating any time soon, he believes the idea of a “virtual production space”, where visuals can be generated on the fly, is only going to grow.

At the UK’s National Film and Television School, there’s already an entire course dedicated to just that.

Ian Murphy, head of the school’s visual effects MA, says: “The main change that’s really exciting is it takes what was post-production, firmly at the end of the process, and gets us involved right at the beginning.

“VFX people are quite techy, but this pushes them into having conversations with production designers and cinematographers on set – and that’s a huge change.

Image: Students at the National Film and Television School go hands-on with Volume-style virtual production. Pics: NFTS

“If you’re shooting on green screen, you’re having quite odd, nebulous conversations. The idea of this tech is the changes are fairly instant. And they might not be the finished pictures, there’s still visual effects work to do, but something from that process is sort of a blueprint that takes you into full production.

“And with the images you get from a game engine now… the trajectory is certainly all moving towards it eventually being the actual images people see in the cinema.”

We’ve certainly come a long way from Pong.

You can watch the Academy Awards on Sunday 12 March in the UK from 11pm exclusively on Sky News and Sky Showcase. Plus, get all the intel from our Oscars special Backstage podcast, available wherever you get your podcasts, from Monday morning.