Virtual Production: Real Time, Right Now [The Graphic Masters Series]
11 March 2016
As technology advances to meet the needs of bigger and more adventurous VFX blockbusters, the movie industry has found itself in an age of virtual production. New real-time film-making techniques enable artists to design and explore complete virtual environments in real time. Here we look at some of the most recent developments in virtual production tools.
So what is virtual production?
For many people, the birthplace of modern virtual production was Avatar in 2009. Although motion capture was already widely used in visual effects work, the real-time retargeted previews of mocap data used on Avatar enabled director James Cameron to view his CG characters performing against virtual sets live on the motion-capture stage. The animation was then refined by VFX house Weta Digital, and the characters rendered for final output using Weta’s farm of 2,176 HP ProLiant BL2x220c blade servers.
The term ‘virtual production’ is sometimes used to refer to motion capture and real-time rendering. But more generally, it describes the erosion of the boundaries between pre-production and final effects, enabling artistic decisions to be made earlier in the film-making process. In either case, a key aim of such workflows is to enable artists to interact with digital assets in real time. Creative decisions can be made quickly, and changes saved for use further down the pipeline.
Tomorrow's tools, today
Lightstorm Entertainment, Cameron’s production company, developed much of its own technology for Avatar. But in 2012, the firm announced a partnership with Autodesk and Weta to develop a new generation of off-the-shelf virtual production tools – capabilities that are being rolled out across the applications of Autodesk’s Entertainment Creation Suite.
“Autodesk provides software and services to both the film and games markets,” says Autodesk senior director of industry relations and business development David Morin. “So it makes sense for us to partner with the companies leading the charge in this area. This helps develop the production expertise of our service group and push the development of software like Maya and MotionBuilder for our customers.”
“Virtual production is at the leading edge of innovation in the motion picture industry. It is a touchpoint between linear and interactive media.” David Morin – Autodesk
Used on the set of Avatar to visualise motion-capture data in real time, MotionBuilder enables artists to set up detailed interactive previews of virtual performances. Its support for HD SDI video output gives production staff and camera operators more accurate, real-time feedback on virtual camera work. So directors can record multiple takes in rapid sequence, enabling actors to work uninterrupted.
But real-time workflows are not limited to high-budget feature films. Included in Maya since the 2011 release, Viewport 2.0 introduced artists to a ‘what you see is what you get’ notion of preview rendering. This harnesses the processing power and on-board RAM of professional graphics cards like those of NVIDIA’s Quadro series to display even large production assets in real time. Since its introduction, Autodesk has steadily been expanding the list of features Viewport 2.0 supports, which now includes ambient occlusion, antialiasing, particles, nHair, nCloth and Fluid Effects. With such sophisticated visual feedback, artists no longer need to break their creative flow by waiting for offline renders.
Another tool embracing the power of NVIDIA’s graphics cards to preview movie assets accurately in real time is Mari, The Foundry’s texturing software. Originally developed at Weta Digital during the making of Avatar, Mari enables artists to paint directly onto 3D models, even those running to millions of polygons and hundreds of 4K textures. This workflow minimises the time that artists spend wrangling technical issues and maximises the time they spend actually creating textures.
Learning from games
Another important trend in virtual production is the incorporation of game engines into film-making tools. These engines allow artists to manipulate ever-more complex assets without having to wait for offline renders to check their work.
A key player in this sector of the market is Crytek, which has been developing its own in-house game engine, CryEngine, for over a decade. The studio recently launched Cinebox, a standalone version of CryEngine with a focus on cinematography: a combination of tools that Crytek dubs a ‘film engine’. Cinebox provides real-time previews of motion-capture data within a high-quality rendered environment, giving artists a much more accurate idea of what finished projects will look like.
Using a virtual camera in conjunction with the software, directors can see their actors and environments in real time, enabling them to find the perfect camera moves: either while shooting live on a motion capture stage, or while navigating shots that have already been recorded.
Crytek has used Cinebox on its recent game Ryse: Son of Rome to help with viewing virtual sets and on-set performance capture. Outside of the studio, the technology is in use at leading UK motion-capture facility The Imaginarium, and was also used on the movies Dawn of the Planet of the Apes and The Maze Runner by LA-based firm CNCPT.
What you see is what you play
“When we started CryEngine in 2001, there wasn’t anything out there that enabled us to design worlds where you could jump into that world with the click of a button,” says Cevat Yerli, co-founder, CEO and president of Crytek.
With CryEngine, Crytek developed a concept called ‘What You See Is What You Play’. “We introduced the concept of 3D editing, which focused on the users’ point of view; how they would see the world,” says Yerli. “One example of the benefits of these new ways of working is the launch of Far Cry in March 2004. It took only three years [to develop], from the first ideas to the street date. That was already impressive by itself, but we also had a team – myself included – who had never made a game before.”
“We are now in our fifth iteration of these pipelines and we are constantly improving to maximize our results,” continues Yerli. “Real-time WYSIWYG [workflows] are transformative and, most importantly, accelerate learning. You can ‘fail’ and learn faster as you see results immediately.”
“With Cinebox, we are taking steps to push real-time CG to new heights and to offer one solution for both game and film makers with the goal of creating better worlds” Cevat Yerli, Crytek
Yerli believes that the convergence of game and film-making tools has much to offer artists working in both industries. “Once the film industry started to use motion capture – and later introduced the idea of performance capture, virtual production and other derivative techniques – we were able to do pretty much the same. With Cinebox … we are pushing real-time CG to new heights… to offer one solution for both game and film makers with the goal of [creating] better worlds and better drama.”
A wire to Jim Cameron's brain
From new tools like Cinebox to established packages like Maya and MotionBuilder, real-time technologies are playing an increasingly important role in visual effects production. Thanks to the work of software developers like Crytek and Autodesk, and the increasing power of graphics hardware from manufacturers like HP and NVIDIA, artists can work faster and see results quicker than ever before. Such workflows break down barriers to creativity and empower directors to work at the speed of the imagination.
As Godzilla director Gareth Edwards put it in an interview with the Los Angeles Times recently: “Filmmaking sometimes seems like punishment, but then the movie turns up at the end, and you go, ‘Oh, yeah, that’s the reward.’ I wish you could just plug a wire into your brain for two hours and press record. I think James Cameron is working on that.”
Read more in the Graphic Masters Series.