Virtual Production at Model Scale

The Virtual Production at Model Scale (VPaMS) research and development project at Aardman Animations was formed with the goal of exploring how virtual production (VP) technologies can fit within a stop-motion context.

The emerging industry of VP is best known for its fancy LED volumes and motion-capture suits, but on VPaMS we expanded our horizons to anywhere the physical and digital worlds blend on a film set.

VPaMS consisted of multiple projects, each tackling a different problem set faced in the stop-motion world. Each project began with identifying key stakeholders within Aardman, followed by research to understand the approach we should take. All the projects were designed to be adaptive, so testing with stakeholders throughout and adjusting to their feedback was essential.

My role on the project was to lead the development of the two main mixed reality tools: 'Puppet Sculpt Viewer' and 'Blockout'. We also visited the new LED volume stage at MyWorld, Bristol, to experiment with shooting stop-frame on an LED wall. Continue reading below for details on each project...

Puppet Sculpt Viewer Tool

The Puppet Sculpt tool allows an artist or director to view a digitally sculpted puppet in VR, reviewing the asset in a more natural way than on a traditional screen. This was in direct response to feedback from the puppet department at Aardman that directors were finding it harder to engage with the digital puppet sculpts, usually only spotting mistakes once the sculpts were 3D printed - a process that would typically take several hours.

To cut down on the number of 3D print iterations, the question became whether viewing the digital sculpts in VR could give the directors the context they needed to spot mistakes.

The resulting tool did just that, while also giving the user a host of utilities to pull more information from the puppets and even review multiple puppets side by side for direct comparison. The response from the puppet department was extremely positive, with use-cases identified almost immediately.

[Project video coming soon...]

The Blockout Tool

The Blockout tool was in many ways an evolution of Puppet Sculpt. Just like Puppet Sculpt, users could interact with virtual puppets on a virtual table in VR. But this time they could also step onto a virtual set, frame up with an iPad acting as a Virtual Camera (VCam), or even shoot with a real Canon EOS-1DX Aardman camera. And the magic that allowed us to create all of these coherent windows into the same virtual world? Motion tracking.

We went with OptiTrack's infrared motion-tracking cameras to give us a shared coordinate space for all the objects operating in the virtual environment.

By placing reflective tracking dots on our cameras, iPad and VR headset, we could get their positions in the real world.

We set the tracking origin at the corner of a table, allowing us to then place a virtual table in the same location inside Unreal Engine. This table became the anchor point that sets and puppets could be placed upon. And by placing tracking dots on various physical props, we could track their positions too, using them to drive the movement of virtual props (such as puppets) inside Unreal.
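
Conceptually, the anchoring is a single transform composition: each rigid body's pose arrives relative to the tracking origin (the table corner) and gets re-expressed through the virtual table's world transform. Below is a minimal sketch in Unreal's Python editor scripting, where get_rigid_body_pose() is a hypothetical stand-in for the OptiTrack data (which in practice would arrive via Live Link) and the actor labels are illustrative:

```python
import unreal

def get_rigid_body_pose(rigid_body_name):
    """Hypothetical stand-in for the OptiTrack feed: returns a rigid body's
    pose relative to the tracking origin (the physical table corner).
    Here, a fixed pose 20 cm above the corner for illustration."""
    return unreal.Transform(location=[0.0, 0.0, 20.0])

def find_actor_by_label(label):
    """Look up a level actor by its editor label (labels are illustrative)."""
    for actor in unreal.EditorLevelLibrary.get_all_level_actors():
        if actor.get_actor_label() == label:
            return actor
    return None

table = find_actor_by_label("VirtualTable")   # placed to match the real table
puppet = find_actor_by_label("PuppetProxy")   # virtual prop driven by tracking

# Apply the table-relative tracked pose first, then the table's world
# transform, giving the prop's pose in Unreal world space (units are cm).
tracked = get_rigid_body_pose("PuppetProp")
world = unreal.MathLibrary.compose_transforms(tracked, table.get_actor_transform())
puppet.set_actor_transform(world, False, True)
```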

To allow creatives to explore the virtual environment inside Unreal, we first set up a VR workflow. We wanted VR to work within the Unreal editor rather than in a game instance, so that any changes made in VR would persist. To accomplish this we turned to Unreal's XR Creative Framework - see the video below for the result.

We were aware that not everyone feels comfortable in VR yet. Consequently, we also turned our efforts to building an iPad-based VCam using Unreal's VCam plugin as a framework. Everything was custom-made, partly due to the specific requirements of Aardman cameras and partly so that the VCam could interact with cameras spawned in VR and vice versa. We also built a suite of stop-motion animation tools using the Sequencer Scripting plugin. See it in action in the video below:
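
For a flavour of what those animation tools involved, here is a minimal sketch using Unreal's Python editor scripting (the same layer the Sequencer Scripting plugin exposes). It keys the puppet's current pose on the playhead frame with stepped interpolation, then advances one frame - the core stop-motion loop. All names are illustrative rather than the production code, and it assumes a level sequence is open in Sequencer with the puppet actor selected.

```python
import unreal

def setup_puppet_track(sequence, puppet_actor):
    """Bind the puppet into the sequence with a transform track whose
    single section spans the whole playback range."""
    binding = sequence.add_possessable(puppet_actor)
    track = binding.add_track(unreal.MovieScene3DTransformTrack)
    section = track.add_section()
    section.set_range(sequence.get_playback_start(), sequence.get_playback_end())
    return section

def capture_frame(section, puppet_actor):
    """Key the puppet's current transform at the playhead, stepped so the
    pose holds until the next key, then advance the playhead one frame."""
    playhead = unreal.LevelSequenceEditorBlueprintLibrary.get_current_time()
    frame = unreal.FrameNumber(playhead)
    xf = puppet_actor.get_actor_transform()
    loc, rot, scale = xf.translation, xf.rotation.rotator(), xf.scale3d
    values = [loc.x, loc.y, loc.z,
              rot.roll, rot.pitch, rot.yaw,
              scale.x, scale.y, scale.z]
    # A 3D transform section exposes nine float channels in this order.
    for channel, value in zip(section.get_all_channels(), values):
        channel.add_key(frame, value,
                        interpolation=unreal.MovieSceneKeyInterpolation.CONSTANT)
    unreal.LevelSequenceEditorBlueprintLibrary.set_current_time(playhead + 1)

# Usage (illustrative): bind once, then call capture_frame for every pose.
seq = unreal.LevelSequenceEditorBlueprintLibrary.get_current_level_sequence()
puppet = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
section = setup_puppet_track(seq, puppet)
capture_frame(section, puppet)
```

The stepped (constant) keys matter here: interpolated curves would smooth between poses and lose the characteristic stop-frame feel.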

The Blockout work also ran alongside another R&D unit, Digital Twins. Taking sets and props 3D scanned by their team, we integrated both mesh and Gaussian splat geometry into Unreal.

A successful use-case of this workflow was scanning the 1:5 scale 'whitecard models' produced early on in a production and then enlarging them to full size in Unreal, enabling directors and DPs to scout a scale model as if it were a real set. Where before DPs would use their phones to try to find shots on the miniature models, they could now accurately frame up with virtual Aardman cameras that had the correct filmbacks, lenses and depth of field.
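
As an illustration of that scouting workflow, here is a minimal sketch using Unreal's Python editor scripting: import a scanned whitecard mesh (the paths are hypothetical), scale the 1:5 model up to full size, and spawn a CineCameraActor whose filmback matches a real sensor - 36 x 24 mm being the full-frame back of the EOS-1DX mentioned earlier. The Gaussian splats went through a separate plugin path that isn't shown here.

```python
import unreal

# Import a scanned whitecard mesh (hypothetical paths for illustration).
task = unreal.AssetImportTask()
task.filename = "D:/Scans/whitecard_set.fbx"
task.destination_path = "/Game/Scans/WhitecardSet"
task.automated = True
task.save = True
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

# Drop the scan into the level and enlarge the 1:5 model to full size.
mesh = unreal.EditorAssetLibrary.load_asset("/Game/Scans/WhitecardSet/whitecard_set")
set_actor = unreal.EditorLevelLibrary.spawn_actor_from_object(mesh, unreal.Vector(0, 0, 0))
set_actor.set_actor_scale3d(unreal.Vector(5.0, 5.0, 5.0))

# A virtual camera with a real filmback, so framing, lens choice and depth
# of field behave like the physical glass (36 x 24 mm = full-frame EOS-1DX).
cam = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, -300.0, 160.0))
cine = cam.get_cine_camera_component()
cine.set_editor_property("filmback",
    unreal.CameraFilmbackSettings(sensor_width=36.0, sensor_height=24.0))
cine.set_editor_property("current_focal_length", 50.0)  # illustrative prime
cine.set_editor_property("current_aperture", 2.8)       # wide open for shallow DoF
```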

Overall, this hands-on experience of framing up and animating within a virtual environment was received very positively by our stakeholders and other Aardman creatives.

It was unanimously favoured over a traditional pre-vis pipeline thanks to the tool's tactile nature. The outputs from the Blockout tool were felt to be far more direct and true to the director's/animator's vision - since they were directing/animating the shots themselves - rather than being lost in translation to a pre-vis artist.

Consequently, Aardman are looking to invest in the tool to get it production-proven for the next series and features.