Through The Eyes Of Our Ancestors

Mod was engaged by the Australian Film, Television and Radio School (AFTRS), under the Netflix-funded Indigenous Scholarship Fund, to collaborate with Gudang Yadhaykenu First Nations Traditional Owner Nicholas Thompson-Wymarra to produce a “3D Elders” holographic story pilot. As Nicholas states, "Our Gudang Yadhaykenu Nation Land, Sea and Air is located on the Cape York Peninsula, Top Of Australia. IPI Dreaming with Rich Ecosystem."

The project allowed volumetric video and digital human performance-capture techniques to be tested and compared.

In the short term, the project aims to empower stakeholders with real-time and virtual production - new methods of recording cultural heritage. The long-term goal is to increase digital capacity within a Gudang/Yadhaykenu Smart City framework.

Here is an overview of the key steps:

Scoping

Over several months we listened to Gudang Yadhaykenu Traditional Owners, and researched and tested methods to best achieve their aims for the project. The final scoping report laid out a roadmap for an intensive one-day performance capture session and delivery of a five-minute story. Recognising the importance of the project, we ended up delivering three stories, totalling 16 minutes of content.

3D Modelling

Production day began with Avatar Factory creating a full-body scan of Nick using their 172-camera rig, from which they delivered a photogrammetry-based 3D model.
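
To give a sense of what a photogrammetry solve involves, here is a minimal sketch using the open-source COLMAP toolchain - an illustration of the general technique, not Avatar Factory's actual pipeline, with placeholder paths:

    import os
    import subprocess

    os.makedirs("sparse", exist_ok=True)  # the mapper writes its model here

    # Standard COLMAP stages: detect features in each photo, match them
    # across overlapping views, then solve camera poses and a sparse model.
    steps = [
        ["colmap", "feature_extractor",
         "--database_path", "scan.db", "--image_path", "photos"],
        ["colmap", "exhaustive_matcher", "--database_path", "scan.db"],
        ["colmap", "mapper", "--database_path", "scan.db",
         "--image_path", "photos", "--output_path", "sparse"],
    ]
    for cmd in steps:
        subprocess.run(cmd, check=True)  # abort if any stage fails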

The model was later used within the Unreal Engine Mesh to MetaHuman pipeline to create a rigged character customised to Nick’s likeness.

Volumetric Video Capture

The performances - “the shoot” - took place at the Electric Lens Co studio using their 7-camera volumetric video capture array. For each story they delivered a high-quality geometry cache animation and matching video texture, suitable for real-time playback in Unreal Engine.
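
A geometry cache stores the mesh for every frame while the colour lives in the paired video texture, so playback is largely a matter of keeping both streams on the same clock. A minimal sketch of that pairing logic (illustrative only, not the Electric Lens Co or Unreal Engine implementation):

    from dataclasses import dataclass

    @dataclass
    class VolumetricClip:
        """A geometry cache and its video texture, exported at one frame rate."""
        frame_rate: float  # e.g. 30.0 - both streams match by construction
        frame_count: int   # total frames in the story

        def frames_at(self, seconds: float) -> tuple[int, int]:
            """Map playback time to (mesh frame, texture frame).

            Both streams must read the same index each tick, otherwise the
            texture visibly "swims" across the animated mesh.
            """
            frame = max(0, min(int(seconds * self.frame_rate),
                               self.frame_count - 1))
            return frame, frame

    clip = VolumetricClip(frame_rate=30.0, frame_count=9000)  # a 5-minute story
    mesh_frame, texture_frame = clip.frames_at(12.5)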

Machine learning processes were used to deliver the final sequences, including the following (a sketch of the transcript step appears after the list):

  • 3D scan processing
  • Marker removal
  • Voice isolation
  • Transcripts
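
As a concrete example of the transcript step, an off-the-shelf speech-to-text model can do the heavy lifting. A minimal sketch using the open-source openai-whisper package (our production tooling may have differed; clean_audio.wav is a placeholder for the voice-isolated recording):

    import whisper  # pip install openai-whisper

    # "base" is a small pretrained model - a speed/accuracy trade-off.
    model = whisper.load_model("base")

    # The result carries the full text plus timestamped segments,
    # which are handy for subtitles and captions.
    result = model.transcribe("clean_audio.wav")
    print(result["text"])
    for seg in result["segments"]:
        print(f"{seg['start']:7.2f}s  {seg['text']}")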

Motion Capture

We used Electric Lens Co’s OptiTrack full-body motion capture system (13 cameras and a marker suit) together with a FaceGood Head Mounted Camera (HMC) to capture facial motion. OptiTrack markers were also placed on Nick’s face to assist in the tracking process.
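
Because body (OptiTrack) and face (HMC) are recorded as separate streams, often at different rates, the two tracks must be aligned onto one timeline before being retargeted onto the character. A minimal sketch of that alignment, assuming each stream exposes sorted frame timestamps (a hypothetical data layout, not the OptiTrack or FaceGood API):

    import bisect

    def nearest_frame(timestamps: list[float], t: float) -> int:
        """Index of the sample closest in time to t (timestamps sorted)."""
        i = bisect.bisect_left(timestamps, t)
        if i == 0:
            return 0
        if i == len(timestamps):
            return len(timestamps) - 1
        return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

    def align(body_ts: list[float], face_ts: list[float], fps: float = 30.0):
        """Yield (body frame, face frame) index pairs on a shared clock."""
        end = min(body_ts[-1], face_ts[-1])
        step, t = 1.0 / fps, 0.0
        while t <= end:
            yield nearest_frame(body_ts, t), nearest_frame(face_ts, t)
            t += step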

App Development

An Unreal Engine app was built for desktop and XR viewing, and for rendering video versions for Looking Glass Factory holographic displays, which are driven by multi-view “quilt” renders (sketched below).

For the holographic displays we used two Looking Glass Factory models:

  • Looking Glass Portrait (8") holographic display
  • Looking Glass 32" holographic display
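
Looking Glass displays consume a “quilt”: a single image holding a grid of renders of the same scene from evenly spaced horizontal viewpoints, which the display optics weave into a light field. A minimal sketch of the per-view camera offsets for one quilt frame (typical tile counts and view cone shown; real values come from each display's calibration):

    import math

    def quilt_views(columns=8, rows=6, view_cone_deg=40.0, distance=1.0):
        """Yield ((column, row), horizontal camera offset) per quilt tile.

        Tiles run left-to-right, bottom-to-top; each is the scene rendered
        from a slightly different horizontal position across the view cone.
        (In practice the camera frustum is also sheared so the focal plane
        stays fixed across views.)
        """
        total = columns * rows
        half_cone = math.radians(view_cone_deg) / 2.0
        for i in range(total):
            u = 2.0 * i / (total - 1) - 1.0  # view position in [-1, 1]
            offset = distance * math.tan(half_cone) * u
            yield (i % columns, i // columns), offset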

The project was unveiled at the inaugural SXSW Sydney.

Key Findings

The project provided all stakeholders with valuable experience of the creative, technical and commercial trade-offs between two different products: volumetric video of characters versus 'digital human' rigged characters.

Each has its own uses and benefits.

Volumetric video (based on live-action cameras) can look incredibly lifelike - hence its use in film/TV VFX - but it requires significant amounts of camera footage and processing. Any revised or new performance requires another multi-camera live-action shoot and the processing of large amounts of new data. The fine detail captured of Nick’s face was possible because he held an almost completely static pose on set. The technique does not lend itself to more dynamic performances without losing fidelity, as the cameras would need to be further back.
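
To put “significant amounts of camera footage” in perspective, here is a back-of-envelope estimate with assumed (not measured) figures for a rig like the one above:

    cameras = 7        # capture array size, per the shoot described above
    minutes = 16       # total delivered story content
    fps = 30           # assumed capture rate
    mb_per_frame = 8   # assumed compressed 4K frame size per camera, in MB

    frames = minutes * 60 * fps                    # 28,800 frames per camera
    total_tb = cameras * frames * mb_per_frame / 1e6
    print(f"~{total_tb:.1f} TB of footage before any processing")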

A rigged character by contrast - in this case built with a custom Epic Games MetaHuman workflow - is essentially a puppet that is readily re-purposed for any newly captured performance (possible even with a mobile phone!). That flexibility comes with significant up-front cost: building an avatar with a photorealistic likeness is a labour-intensive manual process, and the character rig must be able to support the delivery of any performance. In other words, a general-purpose puppet capable of delivering any performance at fidelity equal to a video-based likeness is often prohibitively expensive compared with an asset built for a single shot or performance.

Digital human character processes are evolving fast. For this pilot we focused on producing a first iteration rigged character bust (not a full body) that let us identify key issues in the MetaHuman pipeline for further development. These include support for larger body types in the custom MetaHuman process, custom hair groom, blood flow maps and muscle animation.

We were honoured to have had this opportunity - to listen and learn about Gudang Yadhaykenu country and culture, then collaborate on new processes we hope can empower storytellers. Australia needs more First Nations-led virtual production.

Check out some photos from the project on Flickr.

Credits

AFTRS, in collaboration with Mod and the Gudang Yadhaykenu Tribal Governance Council, and supported by the NETFLIX INDIGENOUS SCHOLARSHIP FUND, present

Through The Eyes of Our Ancestors

Featuring

Nicholas Thompson-Wymarra

Mod

Director / Producer - Michela Ledwidge

Producer - Mish Sparks

Virtual Production Generalist - Sarah Cashman

Senior Developer - Isaac Cooper

Developer - Anthony Johansen-Barr

Stand-in - Tim Gray

BTS Videographer - Aaron Cheater

AFTRS

Director, First Nations and Outreach - Romaine Moreton

Production Manager - Sue Elphinstone

First Nations Community Engagement Manager - George Coles

Gudang Yadhaykenu Tribal Governance Council

Alex Wymarra

Edgar Wymarra

Elizabeth Wymarra

Nicholas Thompson-Wymarra

Bryan Wymarra

Elder Aunty Jennifer Jill Wymarra

Elder Aunty Hazel Wymarra

Electric Lens Co

Volumetric Video & Motion Capture - Matthew Hermans

Avatar Factory

Scanning Supervisor - Mark Ruff

Scanning Producer - Kate Ruff

Scanning Wrangler - Chloe Ruff

Thanks

NEP - Nigel Simpson

Looking Glass Factory - Caleb Johnston

Filmed and created on Gadigal Bidjigal Country.