Mod was engaged by the Australian Film, Television and Radio School (AFTRS), under a Netflix-funded Indigenous Scholarship Fund, to collaborate with First Nations Traditional Owner Nicholas Thompson-Wymarra of Gudang Yadhaykenu country on the Cape York Peninsula, the northernmost tip of Australia, to produce a “3D Elders” holographic story pilot. The project allowed volumetric video and digital human performance-capture techniques to be tested and compared.
In the short term the project aims to empower stakeholders with real-time and virtual production - new methods of recording cultural heritage. The long-term goal is to increase digital capacity within a Gudang/Yadhaykenu Smart City framework.
Here is an overview of the key steps:
Over several months we listened to Gudang Yadhaykenu Traditional Owners, then researched and tested methods to best achieve their aims for the project. The final scoping report laid out a roadmap for an intensive one-day performance-capture session and delivery of a five-minute story. Recognising the importance of the project, we ended up delivering three stories, totalling 16 minutes of content.
On production day we began with Avatar Factory creating a full-body scan of Nick using their 172-camera rig, which delivered a photogrammetry-based 3D model.
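For readers new to photogrammetry, the underlying idea is triangulation: each camera that sees the same surface feature defines a ray in space, and the feature’s 3D position is estimated where those rays (nearly) meet. The toy C++ sketch below shows the two-ray case; the names and numbers are purely illustrative, and a production solve like Avatar Factory’s uses many calibrated cameras plus bundle adjustment.

```cpp
// Toy two-ray triangulation: return the midpoint of the closest approach
// between rays p1 + t1*d1 and p2 + t2*d2 (a stand-in for a matched feature
// seen from two calibrated cameras).
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3   operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3   operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3   operator*(double s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

Vec3 triangulate(Vec3 p1, Vec3 d1, Vec3 p2, Vec3 d2)
{
    const Vec3   w0 = p1 - p2;
    const double a = dot(d1, d1), b = dot(d1, d2), c = dot(d2, d2);
    const double d = dot(d1, w0), e = dot(d2, w0);
    const double denom = a * c - b * b;   // approaches 0 for parallel rays
    const double t1 = (b * e - c * d) / denom;
    const double t2 = (a * e - b * d) / denom;
    // Midpoint between the closest points on the two rays.
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2));
}

int main()
{
    // Two cameras a metre apart, both sighting a feature near (0, 0, 2).
    Vec3 p = triangulate({-0.5, 0, 0}, {0.25, 0, 1},
                         { 0.5, 0, 0}, {-0.25, 0, 1});
    std::printf("estimated point: (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z); // (0.00, 0.00, 2.00)
}
```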
The model was later used within the Unreal Engine Mesh to MetaHuman pipeline to create a rigged character customised in Nick’s likeness.
The performances, or “the shoot”, took place at The Electric Lens Co studio using their seven-camera volumetric video capture array. They delivered a high-quality geometry cache animation / video texture pair for each story, suitable for real-time playback in Unreal Engine.
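As a rough illustration of how such a deliverable might be wired up for playback, here is a minimal Unreal Engine 5 C++ sketch, assuming a project with the GeometryCache plugin and MediaAssets module enabled. The actor and its properties are hypothetical, not Electric Lens Co’s actual setup: a geometry cache component drives the animated mesh while a media player feeds the paired video texture into the character’s material.

```cpp
// Hypothetical actor pairing a geometry cache (animated mesh) with a media
// player (video texture) for real-time volumetric playback.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "GeometryCache.h"
#include "GeometryCacheComponent.h"
#include "MediaPlayer.h"
#include "MediaSource.h"
#include "VolumetricStoryActor.generated.h"

UCLASS()
class AVolumetricStoryActor : public AActor
{
    GENERATED_BODY()

public:
    AVolumetricStoryActor()
    {
        GeoCache = CreateDefaultSubobject<UGeometryCacheComponent>(TEXT("GeoCache"));
        RootComponent = GeoCache;
    }

    // Set in the editor: the imported geometry cache for one story.
    UPROPERTY(EditAnywhere, Category = "Volumetric")
    TObjectPtr<UGeometryCache> StoryCache;

    // Set in the editor: a media player/source pair whose output texture
    // is sampled by the character's material.
    UPROPERTY(EditAnywhere, Category = "Volumetric")
    TObjectPtr<UMediaPlayer> TexturePlayer;

    UPROPERTY(EditAnywhere, Category = "Volumetric")
    TObjectPtr<UMediaSource> TextureSource;

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        if (StoryCache)
        {
            GeoCache->SetGeometryCache(StoryCache);
            GeoCache->SetLooping(false);
            GeoCache->Play();   // start the mesh animation
        }
        if (TexturePlayer && TextureSource)
        {
            // With the player's "Play on Open" option enabled, the video
            // texture starts here, in step with the geometry cache.
            TexturePlayer->OpenSource(TextureSource);
        }
    }

private:
    UPROPERTY()
    TObjectPtr<UGeometryCacheComponent> GeoCache;
};
```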
Machine learning processes were also used in delivering the final sequence.
We used Electric Lens Co’s OptiTrack full-body motion capture system (13 cameras and a suit) together with a FaceGood Head Mounted Camera (HMC) to capture facial motion. OptiTrack markers were placed on Nick’s face to assist in the tracking process.
An Unreal Engine app was built for desktop or XR viewing, and for rendering video versions for Looking Glass Factory Holographic Displays.
For the Holographic Displays we used two Looking Glass Factory models.
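For context on what rendering for these displays involves: Looking Glass displays consume a “quilt”, a single image or video in which the scene has been rendered from many horizontally offset viewpoints across the display’s view cone and tiled into a grid. The self-contained sketch below illustrates that layout; the 5x9 grid of 45 views and the 40-degree cone are illustrative assumptions, not the exact settings used on this project.

```cpp
// Sketch of a "quilt": the scene is rendered from many horizontally offset
// viewpoints spanning the display's view cone, then tiled into one image.
#include <cstdio>
#include <vector>

struct QuiltView {
    int   column, row;    // tile position within the quilt grid
    float angleDegrees;   // horizontal camera offset for this view
};

std::vector<QuiltView> buildQuilt(int columns, int rows, float viewConeDegrees)
{
    const int total = columns * rows;
    std::vector<QuiltView> views;
    views.reserve(total);
    for (int i = 0; i < total; ++i) {
        // Views sweep evenly from -cone/2 (leftmost) to +cone/2 (rightmost).
        const float t = (total > 1) ? float(i) / float(total - 1) : 0.5f;
        views.push_back({i % columns, i / columns, (t - 0.5f) * viewConeDegrees});
    }
    return views;
}

int main()
{
    // An assumed 5x9 quilt (45 views) across an assumed 40-degree view cone.
    for (const QuiltView& v : buildQuilt(5, 9, 40.0f)) {
        std::printf("tile (%d,%d): %+6.2f deg\n", v.column, v.row, v.angleDegrees);
    }
}
```

Each view is rendered from a camera offset by its angle around the scene’s focal point; the display’s optics then route each tile toward its corresponding viewing angle.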
The project was unveiled at the inaugural SXSW Sydney.
The project provided all stakeholders with valuable experience of the creative, technical and commercial trade-offs between two different products: volumetric video characters (below left) vs ‘digital human’ rigged characters (below right).
Each has its unique uses and benefits.
Volumetric video (based on live-action cameras) can look incredibly lifelike - hence its use in film/TV VFX - but it requires significant amounts of camera footage and processing. Any revised or new performance requires another multi-camera live-action shoot and the processing of large amounts of new data. The fine detail captured of Nick’s face was possible because he held an almost completely static pose on set. The technique does not lend itself to more dynamic performances without losing fidelity, as the cameras would need to be placed further back.
A rigged character, by contrast - in this case built with a custom Epic Games MetaHuman workflow - is essentially a puppet that can readily be re-purposed for any new performance captured (possible even with a mobile phone!). That flexibility comes with significant up-front cost: building an avatar with a photorealistic likeness is a labour-intensive manual process, and the character rig must be able to support the delivery of any performance. In other words, producing a general-purpose puppet capable of matching the fidelity of a video-based likeness across any performance is often prohibitively expensive compared with building for a single shot or performance.
Digital human character processes are evolving fast. For this pilot we focused on producing a first-iteration rigged character bust (not a full body), which let us identify key issues in the MetaHuman pipeline for further development. These include support for larger body types in the custom MetaHuman process, custom hair grooms, blood-flow maps and muscle animation.
We were honoured to have had this opportunity - to listen and learn about Gudang Yadhaykenu country and culture, then collaborate on new processes we hope can empower storytellers. Australia needs more First Nations-led virtual production.
Check out some photos from the project on Flickr.
AFTRS, in collaboration with Mod and the Gudang Yadhaykenu Tribal Governance Council, and supported by the NETFLIX INDIGENOUS SCHOLARSHIP FUND, present
Through The Eyes of Our Ancestors
Director / Producer - Michela Ledwidge
Producer - Mish Sparks
Virtual Production Generalist - Sarah Cashman
Senior Developer - Isaac Cooper
Developer - Anthony Johansen-Barr
Stand-in - Tim Gray
BTS Videographer - Aaron Cheater
Director, First Nations and Outreach - Romaine Moreton
Production Manager - Sue Elphinstone
First Nations Community Engagement Manager - George Coles
Gudang Yadhaykenu Tribal Governance Council
Electric Lens Co
Volumetric Video & Motion Capture - Matthew Hermans
Scanning Supervisor - Mark Ruff
Scanning Producer - Kate Ruff
Scanning Wrangler - Chloe Ruff
NEP - Nigel Simpson
Looking Glass Factory - Caleb Johnston
Filmed and created on Gadigal Bidjigal Country.