Too Much Volume? The Tech Behind 'The Mandalorian' and 'House of the Dragon' Faces Growing Pains
Ever since the Disney+ series The Mandalorian began making headlines with its ingenious use of LED stages and virtual production to capture and combine effects in real time, along the way earning Emmy nominations for visual effects and cinematography in 2022 as well as an Engineering Emmy awarded to Industrial Light & Magic for its system development, this type of stage has arguably been the fastest-growing area of visual effects and production technology. But watchers warn that with the explosive growth of LED stages, the business, technology and creative models for these costly installations need to be better understood before their full potential can be realized.
“We're tracking about 300 stages, compared with three in 2019,” Miles Perkins, industry manager for film and television at Epic Games, maker of the Unreal Engine used in virtual production, said of the growth, which has been fueled by investment from studios, stage complexes and VFX companies.
Those facilities include the ILM StageCraft volume at Manhattan Beach Studios used for Lucasfilm's The Mandalorian and the upcoming Ahsoka. Marvel used a bespoke StageCraft stage in Sydney for Thor: Love and Thunder and leaned heavily on a stage at Pinewood Studios in London for Ant-Man and the Wasp: Quantumania. Pixomondo's LED stage in Toronto is on a long-term lease to CBS and has been used for series such as Star Trek. On the feature side, use of the technology has ranged from Francis Ford Coppola's Megalopolis and Sony productions that shot last summer to Warner Bros.' Shazam! Fury of the Gods, due out in March.
However, while many stakeholders are touting the promise of these virtual production stages, others worry that there are too many of them on the market right now, especially amid rising demand for traditional soundstages. “It was a huge learning curve, and I think at times it worked really well and at other times less so,” said HBO's House of the Dragon director Clare Kilner. “It's great for endless sunrises and sunsets, for example.” In fact, the virtual production stage at Warner Bros.' Leavesden studio outside London, which opened in 2021 and was used on Dragon, has since closed, The Hollywood Reporter has learned.
“Due to high demand for studio production space, the virtual stage at Warner Bros. Studios Leavesden will return to being a traditional soundstage to give our clients greater flexibility,” a Warners spokesperson said. “Productions will continue to use virtual production technologies as needed.”
Oscar-winning visual effects supervisor Ben Grossmann, whose company Magnopus was instrumental in developing the virtual production process for Jon Favreau's The Jungle Book and The Lion King, estimated that the cost of building an LED stage for virtual production runs between $3 million and $30 million, depending on the size of the LED wall and the design features (air conditioning and power, support for the weight of the wall, additional lighting and camera-tracking technology, and so on) required to outfit a stage for this use. On top of that, the content projected onto the LED wall during filming can be the most expensive part of the process; these complex, fully computer-generated environments can require four to six months of work from visual effects artists. “You can start spending a lot of money on content very quickly,” Grossmann notes.
As with any new technology, some are quick to embrace it while others remain wary. “That's not the point,” said director-producer Jay Holben, who recently led a short test film (known as Standard Evaluation Material 2) for the American Society of Cinematographers that involved a virtual production stage. He sees great potential in the process but notes that various elements, including color and lighting, still need to be worked out. “Current LEDs don't have a full color spectrum,” he offered as an example. “That's changing, but it's not on the market yet.”
“A lot of people come to this thinking they can just turn on the camera and get the shot,” Holben said. “But if the lighting isn't right and the color balance isn't adjusted properly ... these things can look bad.”
“I think there is an oversupply of LED stages right now; everyone wants one, but they don't yet feel comfortable using them,” Grossmann said. “In the long term, I think the industry will bridge the gap between understanding the budgeting and increasing utilization.”
One source, who wished to remain anonymous, was blunt: “Some [shows] have been very successful. Others are bleeding because people are not ready.”
For early adopters, LED stages can help control production costs, schedules and the complexities of location shooting, while creating a sandbox for creative experimentation. But overusing the technology can have its downsides, and understanding how and when to use it is key.
The Lucasfilm series The Book of Boba Fett and Obi-Wan Kenobi, both of which made extensive use of the volume technology, have been criticized by parts of the fan community for relying too heavily on it, in contrast with the more recent Andor, which was mostly filmed on location.
Holben notes: “Being there and being surrounded by the real thing is always very valuable. There is nothing better than being in a real canyon, or going to Ireland and being in a real castle. When you're actually in the place, there are many discoveries waiting for you.”
Cinematographer Greig Fraser, who won this year's Oscar for Denis Villeneuve's Dune and used LED stages on the first season of The Mandalorian and parts of The Batman, said the virtual stage “doesn't play very well at noon or in the afternoon sun,” pointing to sunrise and sunset scenes from The Batman as an example of where the approach is useful.
From a creative standpoint, he elaborates, “If you're doing something at dawn or dusk, like we did in The Batman on a construction site overlooking Gotham, it works really well because you're dealing with that sweet light. For The Batman especially, it was great because it's a long scene, and normally, if you want to shoot something at dawn or dusk, you have very little time.”
“There is a tendency to think the [LED] volume solves all the logistical problems associated with filming on location,” Fraser added. “The danger, when people don't really understand what it's good for and what it's not, is that they tend to put things in it that shouldn't be there. And when you watch it, it's just not right, which can make virtual production look bad.”
The source indicated that a virtual production needs to start with up-front planning that takes into account factors such as schedule, budget and creative goals, and that involves multiple departments, including the virtual art department. “We did a cost analysis,” said Janet Lewin, senior vice president of visual effects at Lucasfilm and general manager of ILM. “It's a difficult metric for people to develop on their own.”
To help filmmakers understand and use the technology, Epic Games created what it calls the Unreal Fellowship, a 30-day virtual production course that has trained almost 2,000 professionals since its launch nearly two years ago. Participants apply for spots, and those who are accepted are paid a $10,000 stipend by Epic to take the course. Epic's education efforts also include partnerships with the American Society of Cinematographers and the Art Directors Guild.
While many use the term “virtual production” as a synonym for the LED stage, the phrase has a broader meaning that can include areas such as previsualization and performance capture. Notable milestones include Favreau's production of The Lion King, which allowed filmmakers to explore and experience the computer-generated African locations while wearing VR headsets. Many trace the beginning of what is now considered virtual production to James Cameron's Avatar in 2009.
In an effort to help filmmakers use a common vocabulary, the Visual Effects Society launched an online virtual production glossary earlier this year. There, virtual production is defined as a technique that “uses technology to connect the digital world with the physical world in real time. This allows filmmakers to interact with the digital process in the same way they interact with live-action production.”
This definition also paints a picture of what the future of entertainment could look like. “We see virtual production as a bridge that helps create film and TV content in a way better suited to the metaverse or immersive experiences, as well as engaging the audience in that experience,” said Magnopus' Grossmann, noting, “If we can put a cinematographer and a crew on a set surrounded by a wall of LEDs, then we can take that content and show it to viewers in their homes through VR headsets.”
Epic Games' Perkins added: “Once a team has iterated on and finished an asset, they can use it in a variety of environments: linear content, experiential content, games, live events and so on. With a real-time game engine like Unreal, everything is built so that there is no longer a difference between the requirements of linear and experiential entertainment. This means that virtual production is essentially preparing us for a new era of entertainment.”
James Hibberd and Alex Ritman contributed reporting.
This version of the story first appeared in the October 19 issue of The Hollywood Reporter.