What role does the Cinematographer play in Virtual Production?

Dr. Jodi Nelson-Tabor
12 min read · Jun 21, 2021

Written by Peter Bathurst

Virtual Production (VP) is a hot topic right now and one that seems to have captured the imagination of the film industry, especially since the release of The Mandalorian.

The objective here is to explore and explain some of the basic tenets of the process, with particular focus on my recent experience as Director of Photography on the University of Greenwich/Mo-Sys collaboration How To Be Good (H2BG), which was funded by StoryFutures Academy and written, co-directed and produced by Dr. Jodi Nelson-Tabor.

Every production comes with its own unique requirements and challenges, and these frame both the approach to the project and its ultimate outcome. In this case the film itself was being created as a device to explore the learning processes attendant on VP. It was funded with a view to establishing learning pathways for those aiming to join the legion of workers needed to satisfy the demand for this exciting new approach to production.

Peter Bathurst, DoP sets up the shot for lead actress, Grace Enever on the set

Volume LED vs Green Screen

In order to make the most of the resources at our disposal and deliver the film within the timeframe, we decided early on to use the green screen studio at Stockwell Street Studios at the University of Greenwich. The main impact of this (largely budget-based) decision was to steer our process away from ‘Volume LED’ screen-based capture and towards the ‘hybrid’ style of VP capture instead.

Volume LED uses large video walls to display the computer-generated virtual world (most often created using Epic’s Unreal Engine). The easiest comparison is to historic back-projection methods, which place a moving backdrop behind the action and allow live action to be recorded in front of it.

ARRI Volume LED studios, London

The breakthrough is that the ‘backdrop’ can now respond intuitively to the movement of the camera and can be adapted and adjusted to reflect the needs of the script and the scene. The optimum result of this is that the virtual elements blend seamlessly with the real elements creating an unprecedented visual fidelity both on set and beyond.

It’s the real-time rendering of the virtual image that is the real game changer, along with the fact that lighting originating in the CG world shines out from the LED screens and reflects onto the actors and objects within the real set. This is an exciting development from the perspective of cinematographers and image originators, and it adds a welcome new sense of the environment for performers too.
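
As a mental model of that real-time loop (every object and function name below is a hypothetical placeholder, not real tracking or engine code), each frame boils down to: read the real camera’s tracked pose, move the virtual camera to match, render, display.

```python
# Conceptual sketch only: tracker, engine and led_wall are imaginary stand-ins
# for a camera tracking system, a real-time engine and an LED volume.

def run_led_volume_loop(tracker, engine, led_wall):
    """Drive the virtual camera from the physical camera's tracked pose."""
    while True:
        # 1. Position and rotation of the real camera, reported by the tracker.
        pose = tracker.read_tracked_pose()

        # 2. Move the virtual camera to the same pose so perspective and
        #    parallax on the wall stay correct from the real camera's viewpoint.
        engine.set_virtual_camera(position=pose.position,
                                  rotation=pose.rotation,
                                  focal_length=pose.lens_focal_length)

        # 3. Render the view and put it on the wall in real time; the region
        #    the camera actually sees (the inner frustum) gets full quality.
        frame = engine.render_view()
        led_wall.display(frame)
```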

It also means that, unlike traditional green screen capture, you no longer have to second-guess the lighting of the CG world and how it might interact with the action. Nor do you have to fight the threat of ‘spill’, where reflected light the colour of the chosen chroma key undermines the integrity of the key that can be pulled and results in unwanted interference or gaps in the recorded image.
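
To see why spill is such a menace, here is a deliberately naive keyer in Python/NumPy (a toy, nothing like a real Ultimatte): it keys out any pixel where green dominates, so green light bouncing onto the subject can push genuine foreground pixels over the threshold and punch holes in the matte.

```python
import numpy as np

def toy_green_matte(image, threshold=0.15):
    """Naive green-screen matte: 1.0 = keep (foreground), 0.0 = key out.

    image: float array of shape (H, W, 3), RGB values in 0..1.
    A pixel is keyed out when green exceeds the average of red and blue
    by more than `threshold`.
    """
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    greenness = g - (r + b) / 2.0
    return np.where(greenness > threshold, 0.0, 1.0)

# A grey jacket lit cleanly, versus the same jacket with green spill bounced onto it.
clean_pixel = np.array([[[0.40, 0.42, 0.40]]])    # kept as foreground
spilled_pixel = np.array([[[0.40, 0.62, 0.40]]])  # spill pushes it over the threshold

print(toy_green_matte(clean_pixel))    # [[1.]] -> kept
print(toy_green_matte(spilled_pixel))  # [[0.]] -> wrongly keyed out: a hole in the subject
```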

Part of what makes the idea so compelling is that all those involved in the production can see the world they are interacting with instead of being forced to imagine what will be happening on the green screen later.

Jodi Nelson-Tabor, Writer/Producer/Co-Director demonstrates to the cast and crew what the final shot will look like in the immediate playback on the green screen set.

The holy grail of this kind of VP capture is ‘Final Pixel’ where all the information recorded on set is of a sufficient standard to be treated as the final output.

Whilst LED Volume capture is undoubtedly impressive, it still comes with its own inherent technical challenges, such as moiré and pixel pitch, but that discussion is beyond the scope of this article.

In the hybrid green screen version of VP that we were testing for H2BG, the technological leap was being able to design and then composite the virtual world with the real-world capture using the Mo-Sys StarTracker camera tracking system. The live composite could then be viewed on monitors and recorded simultaneously.
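
Schematically, the hybrid pipeline runs the same tracking-driven loop as an LED volume, except the render becomes a background plate that is keyed and composited live rather than displayed on a wall. In very rough terms (every object and call below is a hypothetical placeholder, not the actual Mo-Sys, Ultimatte or Unreal API):

```python
# Schematic of the hybrid green-screen VP loop, per frame. All objects here
# are imaginary stand-ins for the tracker, engine, camera, keyer and recorder.

def composite_frame(tracker, engine, camera, keyer, monitor, recorder):
    # 1. Tracked pose of the real camera (StarTracker in our case).
    pose = tracker.read_pose()

    # 2. Render the virtual background from a matching virtual camera.
    background = engine.render(position=pose.position,
                               rotation=pose.rotation,
                               focal_length=pose.focal_length)

    # 3. Grab the live green-screen frame and pull the key.
    foreground = camera.capture_frame()
    matte = keyer.pull_key(foreground)  # 1.0 = subject, 0.0 = green

    # 4. Composite: subject where the matte holds, virtual world elsewhere
    #    (matte broadcast across the colour channels).
    composite = foreground * matte + background * (1.0 - matte)

    # 5. View on monitors and record the live composite simultaneously.
    monitor.show(composite)
    recorder.write(composite)
    return composite
```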

It’s hard to make a case that this is as impressive as the Volume LED version, but it is still a significant step forward from traditional green screen capture. The Director, actors and HoDs (Heads of Department) can still see and work with the ‘live’ background and make informed decisions about how performance, interactions and lighting will blend together to create a compelling narrative.

Mo-Sys’ Juliette Thymi running their Ultimatte and StarTracker system set up on location

In conversation with Mo-Sys we made a few key technical decisions to make sure we could create the film as scripted within the tight three shooting days available. One of the key objectives for me was to be able to move the camera in shot and showcase the way the motion tracking system would interact with the frustum of the virtual camera.

Peter Bathurst, DoP works with students who assisted on crew

We wanted to work with a small (Covid-friendly) crew, and we wanted to minimise changing from one camera rig to another. We opted for a Ronin 2 rig as a quick and effective way of developing shots in camera and adjusting our position according to narrative and physical requirements.

We were also encouraged by Mo-Sys to shoot with zoom lenses instead of primes to minimise the downtime needed for lens changes and calibrations. We settled on two matching Arri Alura zooms (15.5-45mm and 30-80mm) and sent them to Mo-Sys the week before so they could be calibrated to work with the StarTracker camera tracking system.
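
Part of why that lens data matters is that the virtual camera’s frustum has to follow the real lens exactly: as the zoom changes focal length, the angle of view changes, and the rendered background must change with it. The basic relationship is simple; as a rough illustration (the sensor width below is an assumed Super 35-ish figure, not our exact camera spec):

```python
import math

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm):
    """Horizontal angle of view for a simple thin-lens camera model."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

SENSOR_WIDTH_MM = 30.7  # illustrative assumption, not a measured value

# The ends of the two Alura zoom ranges we carried:
for f in (15.5, 45.0, 30.0, 80.0):
    print(f"{f:5.1f}mm -> {horizontal_fov_degrees(f, SENSOR_WIDTH_MM):5.1f} degrees")
```

In practice calibration covers more than the simple angle of view (how distortion behaves through the zoom range, for instance), which is part of why the glass went off to Mo-Sys ahead of the shoot.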

There were, however, concerns over the payload of our setup on the Ronin 2. Red Epic Weapon, zoom lens, StarTracker, wireless focus control, battery and more (not to mention the loom of cables needed to bring the image to Mo-Sys’s StarTracker Studio system, which provided the brain behind our capture setup) all add up in terms of weight, but my key collaborator, Malgorzata Pronko, worked tirelessly to tune the rig and maintain a functional balance for the Ronin to deliver what we needed.

Malgorzata Pronko sets up the Ronin 2 rig with the StarTracker system, with master’s students assisting

Previs & Unreal Engine

But before getting into the nitty-gritty of the shoot itself, it’s worth quickly discussing the pre-production process. Across the industry this area has developed hugely in sophistication and significance over recent years as digital previsualisation (previs) techniques have gone from strength to strength. In the world of Covid, so much of what would ‘normally’ have taken place in the physical realm has been forced online, and VP is a ready home for these developments. Due to the nature of ‘world building’ in Unreal, most, if not all, of the typical location-facing conversations and scouting can be moved online.

In our case, we worked with an off-the-shelf Unreal world (from the Unreal Marketplace), and Jodi adapted the original script to suit it, changing locations and script elements to fit the parameters of the visual assets available. This preserved the integrity of the drama but moved us swiftly to a place where we had our virtual locations ready to shoot in a matter of a few weeks.

I was struck by how many of the real-life tasks usually involved in pre-production simply became virtual. Over a few sessions on Teams, we collaborated with our (wonderfully patient and talented) Unreal developer, Dr. Drew MacQuarrie, who also became the virtual location manager, to scout the spaces (digital assets) we had chosen on the Marketplace for our shoot. From there, we would work out where we would put the camera and lighting in the digital realm, and thus envision what we would see as a result.

The HoDs collaborate remotely via Teams with Drew MacQuarrie, who demonstrates in Unreal Engine one of the scenes in the film

Cinematography & Virtual Production

It’s at this point that the process starts to come into its own in terms of cinematography within the virtual world. By dropping a virtual character into the space, you can start to plot and plan how shots will look and how the action will develop by using the virtual camera and lenses in Unreal. In doing so, you begin to understand which parts of the scene will be fully virtual and which will still need to be real. Interactions with real objects, however, are still central to the success of VP: without real-world props and foreground elements to ground what the audience sees, the result can quickly appear as merely an elaborate virtual backdrop.
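
Previs makes that kind of planning interactive, but the arithmetic behind it is worth seeing once. As a rough illustration (pinhole model; the sensor height is an assumed Super 35-ish figure, not a specific camera spec), here is how much of the frame an actor fills for a given lens and distance:

```python
def subject_frame_fraction(subject_height_m, distance_m, focal_length_mm,
                           sensor_height_mm=18.0):
    """Rough fraction of frame height a subject fills (pinhole model,
    valid when the distance is much greater than the focal length).
    sensor_height_mm is an illustrative figure, not a camera spec."""
    image_height_mm = focal_length_mm * (subject_height_m * 1000) / (distance_m * 1000)
    return image_height_mm / sensor_height_mm

# A 1.7 m actor standing 4 m from camera, on the two ends of a 30-80mm zoom:
print(subject_frame_fraction(1.7, 4.0, 30.0))  # ~0.71 of frame height: loose mid shot
print(subject_frame_fraction(1.7, 4.0, 80.0))  # ~1.89: tighter than full figure
```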

Once the virtual storyboarding was completed, all HoDs had a handle on what they needed to prepare. Drew now knew where to focus his attention in terms of developing the fidelity of our world and creating a look appropriate to the piece. That meant, for example, changing the original scene from day to night, dropping in virtual fencing to match our real fencing prop, taking out virtual props and elements like water splashes, oil drums and barricades that were part of the virtual asset, and changing out the train wagon to fit the narrative action we needed to execute the story.

The finished storyboards, printed out poster-size for visual aid on set for all HoDs to collaborate and execute (this was planned for and discussed during previs, which is crucial to principal photography)

There is vast opportunity here to develop the lighting within the virtual world, and even to use plug-ins that replicate real-world lighting fixtures, so that you can be confident the lighting in the real world will match what you have prepared in the virtual world. With time and budget, it is now also possible to create sophisticated lighting that adds compelling levels of depth and detail to the Unreal environment and harmonises and blends with the scene you stage for real, much like the approach to lighting and staging on any real-world location.

VP Production Process

Our production week began with a prep day for Mo-Sys, where they set up and attached motion tracking stickers to the studio ceiling and lighting rig to prepare their system to map the space. Day 2 gave us a pre-light and a chance to confirm how the lighting would work in terms of pulling our key. The script is set at night, with the main source of lighting within the Unreal world being ambient moonlight, so we needed to continue that theme on our set. I opted to risk turning our soft box top light blue with ½ CTB gels and hoped that this wouldn’t affect the purity of the key.

Mo-Sys putting up the ‘stars’ stickers for the StarTracker system

Overall, the idea was a success, but even with the sophistication of the Ultimatte keyer and the good work from Mo-Sys we still struggled where the green vinyl floor had been taped together with apparently ‘green’ tape. Green but not the same green as the rest of the floor.

Checking the location of the ‘stars’, where the visible ‘green’ tape is located across the floor

It was worthwhile having the key pulled live, as it meant a constant flow of conversation and adjustment of levels from both the camera side and the Mo-Sys side to get the best results. We were also fortunate to have our editor (Walter Stabb), VFX supervisor (Richie Oldfield) and Drew MacQuarrie on set, as this meant we could decide exactly how best to solve (and thus prevent) the many small challenges we came across as we went along that would otherwise have had to be fixed in post.

Foremost of these for me was the difficulty we had on our first shoot day of using real wire fencing panels to sit alongside virtual ones as our main character Lily squeezes through them and into the industrial wasteland. The wire of the fencing wasn’t an especially fine gauge, but when shot against the green the mesh largely disappeared and thus made the idea of having to squeeze through it seem unnecessary. One answer was to pour light onto the mesh from the back and side (mimicking our virtual sodium streetlight) to make the fence visible again, but this not only caused unwanted shadows on the actor but also spilled onto the green and affected the key.

Lead actress, Grace Enever walking through the scene where she has to cross through the real fencing.
In playback, the same scene, but as shown in Unreal Engine with the virtual background.

The plus side was that we were all able to view the live composite and consider which of the many elements in play we could adjust to achieve the desired results, but the ultimate frustration of shooting against green was still apparent, although it’s interesting to wonder whether the mesh would still have been problematic (moiré) against LED screens.

Flipping the set

The shoot itself basically boiled down to a series of vignettes, which set up real-world props and set dressing against the Unreal background and helped us on our journey. One of the ongoing demands of shooting in this way is that, when approaching reverse shots within a scene, instead of turning the camera around and shooting back the other way in physical space, you keep the camera in the same area, switch the Unreal world instead, and flip the lighting through 180 degrees to make the shots match.

Lead actor, Connor Creighton in his ‘home’ for one of the scenes, which had to be rotated an entire 360 degrees to get the reverse shots of the other actor in the scene (instead of the camera moving to location)

In this sense we were fortunate to have a small lighting package and only a few additional elements beyond our ambient moonlight in play and so flipping was straightforward.
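
For anyone wondering what ‘flipping’ amounts to mathematically: rotating the virtual world and its lighting 180 degrees about the centre of the playing area is just a point reflection of every position through that pivot, plus adding 180 degrees to each light’s yaw. A minimal sketch in plain Python (the coordinates are made up for illustration, not taken from our set):

```python
def yaw_180_about(point_xy, pivot_xy):
    """Rotate a 2D ground-plane position 180 degrees about a pivot.
    A 180-degree yaw is simply a point reflection through the pivot."""
    px, py = pivot_xy
    x, y = point_xy
    return (2 * px - x, 2 * py - y)

# Virtual dressing and lights placed relative to the centre of the playing area
# (illustrative coordinates in metres):
pivot = (0.0, 0.0)
moonlight_pos = (3.0, -2.0)
fence_pos = (-1.5, 4.0)

print(yaw_180_about(moonlight_pos, pivot))  # (-3.0, 2.0)
print(yaw_180_about(fence_pos, pivot))      # (1.5, -4.0)
# Light directions flip the same way: add 180 degrees to each light's yaw.
```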

Another interesting challenge presented itself when we came to shoot the scene in the train wagon. The plan here was to use only torchlight (both virtual and real) to explore the characters discovering each other and making their connection, but of course without lights on there is no chance of pulling a key, and with no key there is no way to generate the virtual world in the shot.

Lead actress, Grace Enever inside the train wagon during a scene where the only light source is to be a torch.

This meant we fell back on tried and tested film methods, where we had to ‘cheat’ the darkness and stylise the spaces to allow the drama to work but also allow the tech to work as it needed to. We’d built a couple of Dedo lights into our virtual set, so we manipulated them and allowed some light to spill down the insides of the wagon to give us something to work with. In the end we could still pull a clean key and the overall effect worked pretty well. The Mo-Sys StarTracker Studio system was impressive overall, and despite the ungainly and surprisingly heavy rig, the real and virtual cameras meshed well.

One of the pleasures of the virtual backdrop was that we could cheat depth of field to get the visible results that suited us best. Even if the real-world lens was set at T2.8, we could set the virtual lens to, say, T1.5 to throw the background further out of focus and emphasise that fall-off.
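
It’s easy to put rough numbers on that gain. Using the standard thin-lens depth-of-field approximation (the focal length, focus distance and circle-of-confusion figures below are illustrative assumptions, not values measured on our set, and T-stops are treated as f-numbers), opening the virtual lens from T2.8 to T1.5 roughly halves the zone of acceptable focus, so the background melts away faster:

```python
def depth_of_field_m(focal_length_mm, f_number, focus_distance_m, coc_mm=0.025):
    """Total depth of field in metres, from the standard thin-lens approximation.
    coc_mm is the circle of confusion; 0.025 mm is a commonly quoted
    Super 35-ish value used here purely for illustration."""
    f = focal_length_mm
    s = focus_distance_m * 1000.0  # work in millimetres
    hyperfocal = f * f / (f_number * coc_mm) + f
    near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
    if s >= hyperfocal:
        return float("inf")  # far limit runs to infinity
    far = s * (hyperfocal - f) / (hyperfocal - s)
    return (far - near) / 1000.0

# 45mm lens focused at 3 m: the real stop versus the wider virtual stop.
print(round(depth_of_field_m(45.0, 2.8, 3.0), 2))  # ~0.62 m at T2.8
print(round(depth_of_field_m(45.0, 1.5, 3.0), 2))  # ~0.33 m at T1.5
```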

H2BG was a great way of beginning the journey towards understanding VP and the associated learning attendant in the process. Ultimately, in this hybrid green screen format it is still very much like an elaborate green screen capture, but with all the benefits of live compositing and being able to understand the requirements of the background world both before and during the shooting itself.

The really exciting takeaway is the handshake between the real and the Unreal and, with time and budget, being able to design both elements to work as one. As with any traditional process, the more creative attention you can give to preparation, the better the results will be on the day. Without question this further enhances the relationship between cinematographer and post-production and makes for a much more holistic approach to the overall shooting process, which can only be welcomed.

Peter Bathurst is a cinematographer, writer, producer, director and educator working across traditional formats and increasingly in advanced technologies including 360 capture, immersive and VR.

A team of filmmakers and academics at the University of Greenwich have created a micro-short film entitled How To Be Good, in collaboration with industry leaders at StoryFutures Academy and Mo-Sys Engineering, to explore and document workflows in virtual production. In this first article of a series, principal investigator Dr Jodi Nelson-Tabor discusses what virtual production means in different contexts and how producing How To Be Good sheds important light on how VP can be managed and harnessed to create films that would otherwise be cost-prohibitive and complex to shoot.


Follow the series/thread at #H2BG on Twitter @GREdesignSchool.


Dr Jodi Nelson-Tabor is the Business Development and Training Manager for Final Pixel.