AR workflow in Aximmetry DE

Introduction

In virtual production, AR or Augmented Reality (also referred to as XR) is the technique of compositing the studio camera's image with virtual objects so that the final image looks realistic; in other words, placing non-real, virtual objects in a real environment.


In this image, the office space is real, but the pie chart is a 3D virtual model.

Prerequisites

This document describes how to set up augmented reality in Aximmetry DE. The material assumes basic knowledge of Aximmetry DE and of working with tracked cameras. If you are not familiar with these, please read How to install and work with the Unreal Engine based DE edition and Setting Up Virtual Sets with Tracked Cameras first.

Unreal Engine Scene Settings

  • Open your Unreal scene, or create a new one.

Make sure the following parameters are set (an equivalent DefaultEngine.ini sketch is shown after this list):

  • Set FloatRGBA Pixel Format
    • Go to Edit / Project Settings / Engine / Rendering / Postprocessing
    • Set Frame Buffer Pixel Format to FloatRGBA
  • Enable custom stencil
    • Go to Edit / Project Settings / Engine / Rendering / Postprocessing
    • Set Custom Depth-Stencil Pass to Enabled with Stencil
  • Enable global clip plane
    • Go to Edit / Project Settings / Engine / Rendering / Lighting
    • Tick the Support global clip plane for Planar Reflections checkbox
  • Disable Motion Blur
    • Go to Edit / Project Settings / Engine / Rendering / Default Settings
    • Untick Motion Blur
  • Allow Through Tonemapper
    • Go to Edit / Project Settings / Engine / Rendering / Postprocessing
    • Set Enable alpha channel support in post-processing to Allow Through Tonemapper.
  • Add an AR tracked camera: Aximmetry / Add Camera / AR with tracked camera

  • Optional: Place shadow catchers (Content Browser: Content / All / Aximmetry_TrackedCam_AR / Blueprints / Aximmetry_Shadow_Catcher). There is no restriction on the number of shadow catchers.
  • Optional: Place reflection catchers (Content Browser: Content / All / Aximmetry_TrackedCam_AR / Blueprints / Aximmetry_Reflection_Catcher). The maximum number of reflection catchers is 3. Keep in mind that reflections have a very high performance cost.
  • Place virtual objects in the scene and tag them with the “AximmetryAR” tag. Only tagged objects will show up in the final image.
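If you keep your project settings in version control, the same rendering options can also be written into the project's Config/DefaultEngine.ini instead of clicking through the editor UI. The sketch below is an assumed equivalent of the checklist above, not an official snippet; verify the console variable names and values against your engine version. The Frame Buffer Pixel Format (FloatRGBA) is easiest to change from the editor UI, so it is left out here.

    [/Script/Engine.RendererSettings]
    ; Custom Depth-Stencil Pass: Enabled with Stencil
    r.CustomDepth=3
    ; Support global clip plane for Planar Reflections
    r.AllowGlobalClipPlane=True
    ; Disable Motion Blur
    r.DefaultFeature.MotionBlur=False
    ; Enable alpha channel support in post-processing: Allow Through Tonemapper
    r.PostProcessing.PropagateAlpha=2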

Per Object Settings

Please make sure the tag is added to the Actor / Tags array and not the Tags / Component Tags array.
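The same rule applies when tagging from code: the tag belongs in the actor-level Tags array (AActor::Tags), not in a component's ComponentTags. Below is a minimal C++ sketch; the helper name is made up for illustration, and only the AximmetryAR tag itself comes from this document.

    #include "GameFramework/Actor.h"

    // Hypothetical helper: marks an actor so that Aximmetry's AR camera composites it.
    void MarkActorForAximmetryAR(AActor* Actor)
    {
        if (!Actor)
        {
            return;
        }

        // Correct: the actor-level Tags array (Actor / Tags in the Details panel).
        Actor->Tags.AddUnique(FName(TEXT("AximmetryAR")));

        // Not sufficient for this purpose: a component's ComponentTags array
        // (Tags / Component Tags in the Details panel) is not what Aximmetry checks.
        // Actor->GetRootComponent()->ComponentTags.AddUnique(FName(TEXT("AximmetryAR")));
    }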

Aximmetry Setup

  • Create a new compound
  • Drag in the .uproject file from your Unreal project root directory
  • Make sure the Connection pin of the Unreal Project module is set to Cooked (this is the default).


  • Drag an AR camera compound into the flow graph:
    • Add the [Common_Studio]:Camera\ARCam_Unreal\ARCam_Unreal_3-Cam.xcomp compound to your project.
  • Connect the following pins (see screenshot below):
    • Unreal Project module: Out → TrackedCam_AR: Rendered
    • Unreal Project module: Alpha → TrackedCam_AR: Alpha
    • Unreal Project module: Reflection 1, 2, 3 → TrackedCam_AR: Reflection 1, 2, 3
    • Unreal Project module: Shadow → TrackedCam_AR: Shadow
    • Unreal Project module: Shadow Clean Plate → TrackedCam_AR: Shadow Clean Plate
    • TrackedCam_AR: Out Size → Unreal Project module: Out Size
    • TrackedCam_AR: Control Data → Unreal Project module: Control Data
    • TrackedCam_AR: Preview → Compound output #1
    • TrackedCam_AR: Out → Compound output #2
  • Add tracked camera input. Please refer to the documentation on how to do this.

  • In the INPUTS control board, select the SCENE panel and use Base Cam Transf to position the virtual objects correctly on the tracked camera image.
    NOTE: You can learn more about Camera and Head Transformations here.
  • In the INPUTS control board, select the CATCHERS panel and configure the Reflection Blur, Reflection Strength, and Shadow Strength parameters to match the properties of the real surface and achieve a realistic result.

Glares

Additional video post-process effects can be set up in the Glares panel.

These glare options are only possible in AR camera compounds. This is because in order to have a better composite of the reflections, shadows, background, and virtual graphics, the tone mapping is done in Aximmetry instead of Unreal. 
NOTE: Due to Unreal's tone mapping being skipped, Unreal's bloom, exposure, chromatic aberration, dirt mask, color grading, and film post-process settings won't affect Aximmetry's AR Unreal camera.

Dynamically Spawned Models and Actors

Objects created after the Unreal scene has started need special blueprint logic to register the “AximmetryAR” tag and become visible in Aximmetry.

After spawning the object, you need to call the Aximmetry_AR_Tracked_Camera's Update ARObjects action.

NOTE: You can easily do this from a Level Blueprint. In other blueprints, you will have to search for the Aximmetry AR Camera.
NOTE: Update ARObjects is a resource-intensive action, so try not to call it every frame.

When using the Update ARObjects action, your actor must have the AximmetryAR tag.
You can add this tag in the blueprint as well: first add a Set Tags action, then connect a Make Array action to its input.
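For projects that spawn props from game code rather than from a Level Blueprint, the same flow can be sketched in C++. Everything here apart from the AximmetryAR tag is an assumption to confirm in your own scene: the spawned class is a placeholder, the AR camera is located by a simple name match, and the Update ARObjects event is invoked through the assumed internal name "UpdateARObjects".

    #include "EngineUtils.h"
    #include "Engine/World.h"
    #include "GameFramework/Actor.h"
    #include "Misc/OutputDeviceNull.h"
    #include "Templates/SubclassOf.h"

    // Hypothetical example: spawn a prop at runtime and make it visible to Aximmetry.
    void SpawnAndRegisterForAR(UWorld* World, TSubclassOf<AActor> PropClass, const FTransform& Transform)
    {
        if (!World || !PropClass)
        {
            return;
        }

        // 1. Spawn the new object.
        AActor* NewActor = World->SpawnActor<AActor>(PropClass, Transform);
        if (!NewActor)
        {
            return;
        }

        // 2. Add the actor-level AximmetryAR tag so it can show up in the AR composite.
        NewActor->Tags.AddUnique(FName(TEXT("AximmetryAR")));

        // 3. Find the Aximmetry AR camera added via Aximmetry / Add Camera and ask it to
        //    re-register AR objects. Call this once after spawning, not every frame.
        for (TActorIterator<AActor> It(World); It; ++It)
        {
            if (It->GetName().Contains(TEXT("Aximmetry_AR_Tracked_Camera")))
            {
                FOutputDeviceNull Ar;
                // Assumed callable name of the Update ARObjects event.
                It->CallFunctionByNameWithArguments(TEXT("UpdateARObjects"), Ar, nullptr, true);
                break;
            }
        }
    }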
