In virtual production, AR (Augmented Reality, also referred to as XR) is the technique of compositing the studio camera's image with virtual objects so that the final image looks realistic; in other words, it places non-real, virtual objects in a real environment.
In this image, the office space is real, but the pie chart is a 3D virtual model.
This document describes how to set up augmented reality in Aximmetry DE. The material assumes a basic knowledge of Aximmetry DE and how to work with tracked cameras. If you are not familiar with these, please read How to install and work with the Unreal Engine based DE edition and Setting Up Virtual Sets with Tracked Cameras first.
Unreal Engine scene settings
- Open your Unreal scene, or create a new one.
Make sure the following parameters are set:
- Enable custom stencil
- Go to Edit / Project Settings / Engine / Rendering / Postprocessing
- Set Custom Depth-Stencil Pass to Enabled with Stencil
- Enable global clip plane
- Go to Edit / Project Settings / Engine / Rendering / Lighting
- Tick the Support global clip plane for Planar Reflections checkbox
- Disable Motion Blur
- Go to Edit / Project Settings / Engine / Rendering / Default Settings
- Untick Motion Blur
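If you prefer editing the project's configuration file directly, the three settings above correspond to console variables stored in Config/DefaultEngine.ini. The fragment below is a sketch based on standard Unreal Engine renderer settings; verify the values in Project Settings afterwards:

```ini
[/Script/Engine.RendererSettings]
; Custom Depth-Stencil Pass: Enabled with Stencil
r.CustomDepth=3
; Support global clip plane for Planar Reflections
r.AllowGlobalClipPlane=True
; Disable Motion Blur by default
r.DefaultFeature.MotionBlur=False
```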
- Add an AR tracked camera: Aximmetry / Add Camera / AR with tracked camera
- Optional: Place shadow catchers (Content Browser: Content / All / Aximmetry_TrackedCam_AR / Blueprints / Aximmetry_Shadow_Catcher). There is no restriction on the number of shadow catchers.
- Optional: Place reflection catchers (Content Browser: Content / All / Aximmetry_TrackedCam_AR / Blueprints / Aximmetry_Reflection_Catcher). The maximum number of reflection catchers is 3. Keep in mind that reflections have a very high performance cost.
- Place virtual objects in the scene and tag them with the “AximmetryAR” tag. Only tagged objects will show up in the final image.
Per object settings
Please make sure the tag is added to the Actor / Tags array and not the Tags / Component Tags array.
- If you want to use the AR camera in Live Link mode, set the resolution as detailed in the tracked camera article.
- Set up the startup maps, save and cook the project as described in the How to install and work with the Unreal Engine based DE edition document.
- Create a new compound
- Drag in the .uproject file from your Unreal project root directory
- Make sure the Connection pin of the Unreal Project module is set to Cooked (this is the default).
- Drag the AR camera compound [Common_Studio]:Compounds\TrackedCam_Unreal\TrackedCam_AR_Unreal_Prev_3-Cam.xcomp into the flow graph.
- Connect the following pins (see screenshot below):
- Unreal Project module: Out → TrackedCam_AR: Rendered
- Unreal Project module: Alpha → TrackedCam_AR: Alpha
- Unreal Project module: Reflection 1, 2, 3 → TrackedCam_AR: Reflection 1, 2, 3
- Unreal Project module: Shadow → TrackedCam_AR: Shadow
- Unreal Project module: Shadow Clean Plate → TrackedCam_AR: Shadow Clean Plate
- TrackedCam_AR: Out Size → Unreal Project module: Out Size
- TrackedCam_AR: Control Data → Unreal Project module: Control Data
- TrackedCam_AR: Preview → Compound output #1
- TrackedCam_AR: Out → Compound output #2
- Add a tracked camera input. Please refer to the Setting Up Virtual Sets with Tracked Cameras documentation for how to do this.
- In the INPUTS control board, select the SCENE panel and use the Base Cam Transf to position the virtual objects in the right place on the tracked camera image.
- In the INPUTS control board, select the CATCHERS panel and adjust the Reflection Blur, Reflection Strength, and Shadow Strength parameters to match the properties of the real surface for a realistic result.