A technical look into Nitro Casino’s latest TVC

2020… a year of change, to put it lightly. The social distancing measures imposed brought many industries to a halt, & whilst Visual Effects (VFX) techniques have been commonplace in production for decades, the pandemic has accelerated not only the use of VFX, but virtual production as a whole. In fact, it has pushed many to step up & innovate. Video game engine technology is being integrated into big-budget movie workflows, a testament to just how photorealistic real-time rendering has become. From scaling back background talent on set to developing an entire concept in a virtual environment, the industry has leveraged this technology to its advantage. LED-wall virtual production alone is revolutionising how movies are made, & with the ever-decreasing cost of LED screens, the technique will only become more commonplace over the coming years.


Generally speaking, the view on remote working has shifted from being something that's possible to something that's productive. It benefits employees & employers alike through increased well-being, productivity & potentially reduced costs.

Bringing it closer to home, we used our boutique setup to our advantage. Within days of Betpoint Group approaching us to assist with the upcoming TVC for their Nitro Casino brand, we had the go-ahead for the project & the ball rolling on pre-production. Physically manufacturing the set we needed would have been unfeasible on a number of levels, so VFX was the natural choice, & opting for Rec Media Room in Mriehel as our filming location was a no-brainer given the size & range of options available within their cyclorama studio.

Everything was shot on green screen over the course of a single day with a minimal crew. Since our protagonist was wearing a blacked-out helmet, we were able to dub the dialogue in post-production, reducing the need for a sound engineer on this particular shoot. Post-production would be carried out mainly in Blender & its Eevee real-time renderer, so we logged all the shot information, everything from camera height to aperture values, to assist us with compositing the shots.
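As a sketch of what that on-set log can look like (the field names here are illustrative, not our actual schema), a few lines of Python are enough to keep the values structured & exportable alongside the footage:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ShotLog:
    """One logged take: the on-set values needed to rebuild the camera in post."""
    shot: str
    camera_height_m: float    # lens height above the studio floor
    focal_length_mm: float
    aperture_f: float         # f-stop, drives depth of field in post
    subject_distance_m: float

takes = [
    ShotLog("SC01_T03", 1.55, 35.0, 2.8, 3.2),
    ShotLog("SC02_T01", 1.20, 50.0, 4.0, 2.4),
]

# Serialise to JSON so the log travels with the footage
log_json = json.dumps([asdict(t) for t in takes], indent=2)
```

Anything captured this way can be typed straight into Blender's virtual camera later, rather than being eyeballed during compositing.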


Back in the studio, we started by creating the 3D environment that our character would be interacting with. Since all of Eevee's effects are screen-based, we used an irradiance volume to calculate & bake the lighting & reflection information. (Sidebar: an irradiance volume essentially approximates ray-traced bounce lighting in a far less computationally intensive manner.) RGB & zDepth passes were rendered so that 2D objects, in this case elements such as mists & atmospherics, could be placed within the 3D space. The zDepth pass was then used in After Effects (AE) to drive the depth of field. From there we moved on to compositing, keying (within AE), spill suppression & general clean-up of the footage. This granular approach gave us precise control over the placement of the character within the 3D environment.
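The idea behind driving depth of field from a zDepth pass can be illustrated with the thin-lens circle-of-confusion formula: each pixel's depth value is mapped back to metres, & its distance from the focus plane determines how much blur it receives. A minimal sketch, assuming a linear, normalised zDepth pass with known near & far clip distances:

```python
def coc_mm(focal_mm: float, f_number: float, focus_m: float, depth_m: float) -> float:
    """Circle-of-confusion diameter in mm for a point at depth_m when
    the lens is focused at focus_m (thin-lens approximation)."""
    f = focal_mm
    s = focus_m * 1000.0   # focus distance, mm
    d = depth_m * 1000.0   # subject distance, mm
    return (f * f) / (f_number * (s - f)) * abs(d - s) / d

def depth_from_z(z: float, near_m: float, far_m: float) -> float:
    """Recover metres from a normalised (0..1) zDepth pixel value,
    assuming a linear depth pass."""
    return near_m + z * (far_m - near_m)
```

A pixel exactly on the focus plane yields a zero-diameter circle of confusion (sharp), & the blur grows the further a pixel sits from that plane; this is, conceptually, what AE's depth-driven lens blur does with the pass.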

One of the client’s requests was that the Nitro Casino logo be shown prominently on the jacket. Given the relatively short lead time prior to filming, manufacturing one was not possible. To this end, we utilized the renowned Mocha plugin within AE to track the logo onto the jacket. Through the use of the puppet tool, we created a deform mesh & animated it using corner pin data. With this approach, we ended up with a logo that not only tracks but also deforms with the character’s movements. We also cut out the zippers & masked around the stitching for added realism.
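The deform-mesh idea can be pictured as a grid of corner pins: each cell maps a patch of the flat logo to four tracked points on the jacket. Below is a simplified bilinear version of a single cell (AE's corner pin proper is a full perspective warp, so treat this purely as an illustration of the mapping):

```python
Point = tuple[float, float]

def corner_pin(u: float, v: float,
               tl: Point, tr: Point, br: Point, bl: Point) -> Point:
    """Map (u, v) in the logo's unit square to screen space via a
    bilinear blend of the four tracked corner positions."""
    x = (1-u)*(1-v)*tl[0] + u*(1-v)*tr[0] + u*v*br[0] + (1-u)*v*bl[0]
    y = (1-u)*(1-v)*tl[1] + u*(1-v)*tr[1] + u*v*br[1] + (1-u)*v*bl[1]
    return (x, y)
```

As the tracker moves the four corners frame by frame, every interior point of the logo follows, which is why the graphic appears to stretch & fold with the fabric rather than float on top of it.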


Blender’s camera tracking, thanks to the match move function & the shot details we logged on set, sped up the process significantly, as Blender automatically calculates the scale of the scene, lens distortion & so on. Whilst the tracking markers do not necessarily have to be a specific shape or placed at set distances, in this case we opted to do both so that we could manually verify the scaling Blender calculated.
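That manual verification is simple arithmetic: because we knew the real distance between two markers, we could compare it against the distance between their reconstructed positions. A sketch (the marker coordinates here are hypothetical solver output, not values from this shoot):

```python
import math

def track_scale(solved_a: tuple, solved_b: tuple, measured_m: float) -> float:
    """Scale factor converting the solver's arbitrary units into metres,
    given two reconstructed marker positions & the tape-measured
    distance between them on set."""
    solved = math.dist(solved_a, solved_b)
    return measured_m / solved

# e.g. two markers reconstructed 2.0 solver units apart that were
# measured 1.0 m apart on the studio floor
scale = track_scale((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), 1.0)
```

If the factor Blender reports matches what this cross-check gives, the solved scene can be trusted for placing the character at true size.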

Finally, all the individual scenes were imported into After Effects for general colour correction, matching of black levels, motion blur, grain & other final touches, with sound design carried out in Ableton Live. Throughout the entire process, even as early as pre-production, the Frame.io service immensely facilitated both internal & external collaboration & communication, particularly through its app integrations.


Given the progress being made, it’s rather clear that virtual production will not only be a responsible way of working, but also a far more efficient one for a wide range of applications, with the capability to unlock new creative potential.

Interested in discussing how we can collaborate on your next project?

LET'S TALK
