
The metaverse builds a whole new way of making movies

Author: Old Yuppie

Source: The Conversation, by Darren Paul Fisher

Film production has been hit hard by the COVID-19 pandemic, with delays in screenings and production interruptions or cancellations.

How can the metaverse help us keep producing our favorite big-screen content during the COVID-19 pandemic?

What metaverse tools can do now

Traditional production requires actors and crew to work in the same location at the same time. The past two years have shown a real need for filmmaking methods that let actors and crews shoot from different locations, or for productions in which part or all of the space is rendered virtually (as in the remake of The Lion King).

Current metaverse-adjacent tools include deepfake technology, which can seamlessly stitch anyone in the world into videos or photos, and software that creates virtual locations and avatars, such as Unreal Engine.

The Volume, the virtual production stage behind Disney's The Mandalorian, uses this latest technology to great effect. On The Mandalorian, high-definition digital screens mounted on the walls and ceiling provide the background, along with perfect viewing angles and lighting; the final film is a blend of live-action footage and entirely computer-generated imagery.

The important question is how to deploy these technologies to address two of the industry's most pressing production problems as we learn to live with COVID-19 for the long term.


The Mandalorian (Pedro Pascal) and the Child in Season 2.

Problem one: The director is in one place, and the actors and crew are in another

For a production like the remake of The Lion King, director Jon Favreau could remotely access the virtual environment from his home media room using a VR headset. For other kinds of shoots, the director could interact seamlessly with the actors through AR glasses.

This points toward a multifunctional media room equipped with an array of cameras and displays. It is already happening, and it is exactly what the big tech companies are racing to accelerate. Products such as Microsoft's Mesh for Teams use mixed reality to enable 3D holographic interactions for meetings and collaboration.

Problem two: The director, the lead actor, and the co-star are all in different places

Relying on metaverse technology, we could:

(a) Shoot each actor separately in front of a green screen with a separate crew, then composite in the background (but the actors will not interact).

(b) Use AR glasses so the actors can see each other performing, then digitally remove the glasses afterward.

(c) Use stunt doubles and apply deepfake techniques to put the star's face on them. This approach is preferred when physical contact between actors is required.

However, all of these approaches share one fundamental drawback: the actors' performances suffer.

Unless we can perfect both the realism and the performance of digital characters, the metaverse will never reach its full potential as a true alternative for screen production.

The young Luke Skywalker seen in recent Star Wars television was created by combining a stunt double with deepfake technology. He looks physically perfect, but not once "Luke" starts talking, which meant most of his dialogue had to be delivered off-camera. There is also a strong uncanny valley effect in the performance.


Mark Hamill, who plays Luke, in Disney's The Book of Boba Fett.

Future-proof

The day of the perfect digital human may soon arrive. Novelist and futurist Michael Crichton foresaw this, not in Westworld or Jurassic Park, but in his obscure 1981 film Looker. The story involves a technique for scanning and animating actors, allowing them to retire and simply manage their image rights.


When that day truly arrives, every movie could be made in a virtual environment, like The Lion King.

Actors will remotely control their avatars from their media rooms. In the future, a star could command two different rates: one for appearing in person, the other for licensing a digital avatar with programmed acting skills.

Technology can do it, but do we want it to?

History tells us that new technologies are rarely adopted wholesale, and that old technologies never completely die out. Many productions will make full use of the metaverse, allowing them to keep shooting through real-world disasters.

Perhaps a whole new hybrid genre will emerge. Movies that would once have been animated can now have a photorealistic feel; call them "live animation."

In the future, cheap fast food may use only lab-grown simulated meat, while top restaurants serve the real thing. The same may hold for screen production: eventually, live-action footage will become the premium, old-school option.
