Virtual Production: Unlocking Creative Vision
WHAT IS VIRTUAL PRODUCTION?
Every director and visual effects (VFX) professional will define virtual production slightly differently, but at its core, virtual production is modern content creation:
“It is an agile process characterized by starting VFX earlier and leveraging technology throughout the entire production life cycle to enhance the way content is created.”
Traditional production is highly linear: Directors and cinematographers plan scenes using storyboards and shot lists; actors are filmed on sets, on location, and against green screens; and editorial and VFX development starts and finishes after filming is complete.
This one-way procession through pre-production, production, and post-production can encourage negative outcomes, such as a “fix-it-in-post” mentality, destructive or duplicative VFX labor, and expensive reshoots.
“VIRTUAL PRODUCTION IS ITERATIVE AND CREATIVE”
Beginning VFX in pre-production makes digital assets available for planning and shooting, making it easier to continuously refine the final look and feel throughout the course of production.
Virtual production is an expansion of the traditional film-making playbook, enabling studios to pursue greater experimentation while controlling the time and cost of production.
There are four key use cases that fall under the umbrella of virtual production:
VISUALIZATION:
Using 3D VFX assets to visualize and plan a shot. There are many types of visualization (for example, pitchvis, techvis, and postvis), but the preeminent form is previs: planning a scene in 3D before shooting.
While experienced VFX professionals may say previs has been commoditized, only a handful of VFX studios have developed a reputation for providing visualization services.
MOTION CAPTURE (MOCAP):
Capturing the movements of people to animate VFX assets. Mocap systems have been used in film since the 1980s, but over time, hardware form factors have shrunk, and software has become increasingly automated.
Excellent motion capture is still technically and artistically challenging, but it can be critical for realistic animations of digital humans and creatures.
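At its simplest, optical mocap reduces to geometry on tracked marker positions. The sketch below computes a joint angle from three markers (shoulder, elbow, wrist); the function name and marker layout are illustrative, not taken from any vendor's solver:

```python
import math

def joint_angle(a, b, c):
    """Interior angle (degrees) at marker b, formed by markers a and c.

    a, b, c are (x, y, z) marker positions, as reported by an optical
    mocap system. This is the standard vector-angle computation, shown
    here only to illustrate the kind of math a mocap pipeline performs.
    """
    u = tuple(ai - bi for ai, bi in zip(a, b))  # vector b -> a
    v = tuple(ci - bi for ci, bi in zip(c, b))  # vector b -> c
    dot = sum(ui * vi for ui, vi in zip(u, v))
    nu = math.sqrt(sum(ui * ui for ui in u))
    nv = math.sqrt(sum(vi * vi for vi in v))
    cos_t = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp for safety
    return math.degrees(math.acos(cos_t))

# A fully extended arm along the x-axis reads as 180 degrees:
print(joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0)))  # 180.0
# A right-angle elbow bend reads as 90 degrees:
print(joint_angle((0, 0, 0), (1, 0, 0), (1, 1, 0)))  # 90.0
```

Real systems solve this across dozens of markers per frame, filter noise, and retarget the result onto a digital character's skeleton; the angle computation above is just the first step.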
HYBRID CAMERA (SIMULCAM, GREEN SCREEN HYBRID):
Compositing digital VFX with live-action camera footage in real time. Simulcam was developed, and the term coined, by Weta Digital and James Cameron for the first Avatar movie in 2009.
It is a direct improvement over shooting against green screens, because visualizing the digital and physical simultaneously with accurate parallax helps directors gain a better spatial understanding of the scene.
Actors also benefit from seeing a preliminary view of the visual effects instead of acting against a green wall.
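The blend at the heart of this kind of real-time compositing is the standard Porter-Duff “over” operation: the rendered VFX pixel is weighted by its alpha (coverage) and laid over the camera pixel. A minimal single-pixel sketch, with illustrative names rather than any production tool's API:

```python
def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over': lay a rendered VFX pixel onto a camera pixel.

    fg_rgb and bg_rgb are (r, g, b) tuples in [0, 1]; fg_alpha is the
    render's coverage at this pixel. A real simulcam pipeline runs this
    per pixel on the GPU, driven by live camera tracking so the digital
    layer keeps correct parallax; this scalar version shows only the
    core blend.
    """
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha)
                 for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent red render element over a mid-gray camera pixel:
print(over((1.0, 0.0, 0.0), 0.5, (0.5, 0.5, 0.5)))  # (0.75, 0.25, 0.25)
```

The hard part of simulcam is not the blend itself but feeding it: the virtual camera must match the physical camera's position, lens, and timing frame by frame so the composite lines up.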
LED WALLS:
Replacing shooting against green screens with shooting against LED panels that display final-quality VFX.
LED is a natural progression from the technique of 2D video screen projection, and LEDs’ ability to cast light for accurate reflections is a significant benefit to post-production.
Virtual production isn’t exactly new, and some use cases have already seen significant adoption. However, a combination of industry, technological, and macro-level developments is accelerating Hollywood’s interest in virtual production:
THE POPULARITY OF VFX-HEAVY GENRES AND RECENT VIRTUAL PRODUCTION BREAKTHROUGHS:
The highest-grossing films of the past decade generally fall into VFX-heavy genres like action/adventure and sci-fi/fantasy.
Virtual production techniques have been a critical enabler for both feature films and episodic content.
In late 2019, The Mandalorian was released and widely lauded as the world’s first show to be virtually produced using an LED volume: a semicircular LED wall and ceiling spanning an entire soundstage.
“The technical difficulty and potential benefits associated with this technique have helped reinvigorate conversations about virtual production.”
INCREASING ACCESSIBILITY AND JOCKEYING BETWEEN GAME ENGINES AND STUDIOS:
Development in consumer augmented and virtual reality has brought down costs and reduced barriers to experimenting with virtual production.
Game engines provide the real-time rendering that powers virtual production; Epic Games’ Unreal Engine 4 (UE4) was used for The Mandalorian, and Epic has stated that virtually all the functions used for that show will soon be built directly into the engine.
At the same time, leading VFX studios like ILM and MPC are going to market with their proprietary virtual production platforms, StageCraft and Genesis, respectively.
COMPETITION BETWEEN STREAMING PLATFORMS AND FILM STUDIOS:
Episodic content and feature films are converging in terms of budget, talent, and VFX quality. In 2019, Netflix spent $15B on original content creation, roughly the same amount as ViacomCBS, the parent of Paramount Pictures.
These players are competing not only for customers, but also for content production resources.
COVID-19, PHYSICAL PRODUCTION, AND AVOIDING A “CONTENT DESERT”:
The global pandemic has halted film and episodic production, and this lapse in content generation could lead to a “content desert” further down the road.
Governments around the world are cautiously looking to reopen with new safety measures in place, and virtual production offers one such path back to filming: Virtual sets shift crews from the risks of travel and on-location shooting to the controlled environment of a soundstage. Taking it a step further, digital humans and motion capture could enable completely remote productions.