VAM Summit: Accelerating the adoption of new-age tech

Technology is a game changer in every sector; whenever a new technology arrives, industry experts notice a paradigm shift, and VFX is no exception. A remarkable curve of new developments is allowing the industry to accelerate production. On Day 2 of the VAM Summit, a panel discussion on ‘Accelerating The Adoption Of New-Age Tech’ saw the speakers shed light on how the right kind of new-age technology has amplified content production. The eminent speakers of the session were Famous Studios creative head FHOA Jayant Hadke, phantomFX technology lead Vinay Gedan, DNEG Mumbai head of VFX Jigesh Gajjar, and virtual production and VFX supervisor Scott Squires. The session was moderated by DNEG VFX supervisor Tim McGovern.

Since work-from-home culture has become the new norm at VFX facilities, a lot of production has been happening virtually. McGovern shared, “In the early virtual production years, on Avatar in 2009, it was a tool for behind the camera. Nobody could use the imagery; we were not actually seeing through the camera, but we were seeing the process. It didn’t drive real-time rendering, as that hadn’t happened yet. It was a great way to integrate motion capture performances and see how they worked inside a shot that might have thousands of people.”

He further added, “Moving on to 2017-2018, they developed a half-cylinder LED wall for First Man, which was one of the first big walls of LED screens. The imagery had to be rendered at a very high resolution, so it was pre-rendered material. Various set pieces were put in front of the screen, with the screen placed outside the window so that it reflected off the objects.”

He further shared an example of how they captured the atmosphere of the Earth and the capsule: the light coming through the capsule window reflected off the objects inside, where the astronauts sat with helmets and suits on. “Honestly, green screen seems like the stone age compared to this, where you can actually light the actors and the set pieces with the actual thing that would have been there. Therefore, these things had to be done in advance,” he added.

Next, he moved on to the latest iteration, real-time rendered imagery, on which Squires shared, “The whole point with virtual production is that you are not just creating a background, you are also creating the light. You get correct lighting on the character and whatever you have in front, and you get reflections. It doesn’t look like a still or pre-captured image; you actually have a whole 3D background, so the background moves correctly when the camera moves.”
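What makes that possible is that the wall imagery is re-rendered every frame from the tracked camera’s pose, keeping the parallax consistent. Here is a minimal sketch of the underlying idea using a plain pinhole camera model; the function name and parameters are illustrative, not any engine’s actual API:

```python
import numpy as np

def project_points(points_world, cam_pos, cam_rot, focal_px, cx, cy):
    """Project 3D background points into the tracked camera's image.

    points_world: (N, 3) points in world space.
    cam_pos:      (3,) camera position reported by the tracking system.
    cam_rot:      (3, 3) world-to-camera rotation (assumed orthonormal).
    focal_px:     focal length in pixels; cx, cy: principal point.
    """
    # Transform world points into camera space.
    p_cam = (points_world - cam_pos) @ cam_rot.T
    # Keep only points in front of the camera, then perspective-divide.
    p_cam = p_cam[p_cam[:, 2] > 0]
    u = focal_px * p_cam[:, 0] / p_cam[:, 2] + cx
    v = focal_px * p_cam[:, 1] / p_cam[:, 2] + cy
    return np.stack([u, v], axis=1)

# Re-running this with each new tracked pose is what keeps the on-wall
# background parallax consistent as the physical camera moves.
```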

Squires shared the example of The Mandalorian, where they had a very large set and could switch backgrounds easily, but that required having the time to build all of those backgrounds in Unreal or some other 3D tool beforehand. He further shared insights on camera tracking, art direction, pre-viz and so on. He said that when you shoot with a physical floor that has to blend into the virtual floor, matching the colour and texture across the shot is a challenge; the sky is easier, as the top screen tends to be lower resolution with fewer elements. He also noted that accent lights are still used despite the LED screens at the back, and explained how the camera’s view frustum determines how the rendered imagery finally composites with the objects in frame.
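To unpack the frustum point: in a typical LED-volume setup, only the part of the wall that falls inside the shooting camera’s view frustum needs the full-quality, camera-tracked render, while the rest of the wall mainly contributes lighting and reflections. A rough sketch of that inside-the-frustum test follows, with simplified geometry and illustrative names, not Unreal’s actual nDisplay API:

```python
import numpy as np

def in_camera_frustum(point, cam_pos, cam_forward, cam_up,
                      fov_h_deg, fov_v_deg):
    """Return True if a world-space point on the LED wall lies inside the
    shooting camera's view frustum (near/far clipping ignored).

    cam_forward and cam_up are assumed to be orthonormal unit vectors.
    """
    cam_right = np.cross(cam_forward, cam_up)
    d = np.asarray(point) - cam_pos
    z = d @ cam_forward            # depth along the viewing axis
    if z <= 0:
        return False               # behind the camera
    x, y = d @ cam_right, d @ cam_up
    half_h = np.tan(np.radians(fov_h_deg) / 2)
    half_v = np.tan(np.radians(fov_v_deg) / 2)
    return abs(x / z) <= half_h and abs(y / z) <= half_v

# Wall regions that pass this test get the full-quality, perspective-correct
# render; the remainder can use a cheaper environment render for lighting.
```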

Further, Gajjar shared how, despite working remotely, technology helped DNEG scale up production. “We are doing various different things to develop the virtual production pipeline. Recently, DNEG joined forces with Dimension Studio, and we are doing a lot of the LED work with them. For a lot of the shows that we are currently working on, we are partnering with them for content; they are servicing the LED walls for independent as well as DNEG projects. The latest one that we have worked on is The Matrix. The dojo sequence was entirely done in Unreal, and a lot of our teams here are working with Epic.”

Since Famous Studios works on many commercial projects, it has been able to experiment with a lot of interesting ones. According to Hadke, last year they were the first to do virtual LED production in their studio: they set up an LED wall for two months as a pop-up lab, using it to shoot multiple projects and to experiment. They invited many directors and DOPs to utilise it for various shots, and it became an eye-opener for studios on how this new technology can add value to projects. They have done a decent amount of work using Unreal; recently, they delivered the Dhoni commercial for Unacademy.

Hadke shared, “All the backdrops were created in Unreal, and without Unreal it would have been difficult for us. In the Dhoni commercial we used various backgrounds, and the whole thing was ready in real time.” They have also set up a chroma floor and are experimenting with it on various projects.

Next, the panellists spoke about machine learning and artificial intelligence. Squires shared, “The key thing over the last 20 years is the incredible advancement in vision algorithms which, combined with computing power, enabled us to do things like stitching photos together and doing photogrammetry, where the computer lines up multiple images and calculates them into 3D. We achieved all of that, and we are still building on it now that we have started enabling AI and machine learning.”
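The “lining up” Squires describes is feature matching followed by pose recovery and triangulation. A minimal two-view sketch of that pipeline using OpenCV is below; the image file names and the intrinsics matrix K are placeholders you would replace with calibrated values:

```python
import cv2
import numpy as np

# Two overlapping photos of the same object (placeholder file names).
img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)

# 1. Detect and describe features in each image.
orb = cv2.ORB_create(5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# 2. Match features: this is the "lining up" of the images.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 3. Recover the relative camera pose (K comes from calibration;
#    the values here are placeholders).
K = np.array([[1000.0, 0, 640], [0, 1000.0, 360], [0, 0, 1]])
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 4. Triangulate the matched points into 3D.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
pts3d = (pts4d[:3] / pts4d[3]).T   # homogeneous -> Euclidean
```

Production photogrammetry repeats this across many views with bundle adjustment, but the core idea is the same.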

Today, the phone in our pocket does quite a bit of machine learning: the neural networks behind facial recognition, photo processing and so on are all part of AI and machine learning. He further shared how machine learning models apply pose recognition to a large database of photos, identifying which way a person is facing and some characteristics of their expression. In fact, one can calculate matchmove data frame by frame, and this way you can motion capture without any markers or dedicated cameras in some cases. With just a photograph, a computer can create an avatar or digital double of something. He also pointed to the machine learning tools now built into Nuke, the node-based compositing tool at the heart of the Nuke family.
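As a concrete illustration of that markerless, frame-by-frame approach, here is a minimal sketch using the open-source MediaPipe pose estimator; this is our choice of tool for illustration, not one named by the panel, and the video file name is a placeholder:

```python
import cv2
import mediapipe as mp

# Markerless pose capture from ordinary video (placeholder file name).
cap = cv2.VideoCapture("performance.mp4")
pose = mp.solutions.pose.Pose(static_image_mode=False)

frames = []  # per-frame list of (x, y, z) landmark tuples
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV decodes frames as BGR.
    result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.pose_landmarks:
        frames.append([(lm.x, lm.y, lm.z)
                       for lm in result.pose_landmarks.landmark])

cap.release()
pose.close()
# `frames` now holds a rough, marker-free motion track that could be
# retargeted to a rig, in the spirit of what Squires described.
```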

Gedan added, “In terms of machine learning, we have been trying to create digital humans from photographs. It’s a two-part thing: it reduces the manpower, the number of people required for the background, and the second thing is licensing those images. We create virtual people out of those images, which helps us populate any kind of environment, and this is done with the help of machine learning. It consumes a lot of resources to create one digital human, but at the end of the day it saves a lot of man hours spent directing someone to do something for a particular duration of time.”

Overall, the discussion offered valuable insight into the benefits of new-age technologies and their scalability prospects. We hope it will help the VFX industry prosper going forward.